I was reading over this in my notes, but it doesn't make any sense to me:
However, suppose we fix an alphabet for writing our programs (e.g. 8-bit ASCII). Since each individual program is finite in length, we can put all possible programs into a (very long) ordered list. For any fixed character length k, there are only finitely many possible programs. So, we can write down all programs by first writing down all the 1-character programs, then all the 2-character programs, and so forth. In other words, there's a bijection between the integers and the total set of programs. But this means the number of programs is only countably infinite, whereas the number of functions (say, from the natural numbers to the natural numbers) is uncountable. So there must be mathematical functions that we can't compute with any (finite-length) program.
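The enumeration part I can at least picture. Here's a toy sketch of how I understand it, using a two-letter alphabet in place of ASCII just to keep the output short (same idea for any finite alphabet):

```python
from itertools import count, product

# Toy two-letter alphabet standing in for 8-bit ASCII; the
# counting argument is identical for any finite alphabet.
ALPHABET = "ab"

def enumerate_programs():
    """Yield every finite string over ALPHABET in shortlex order:
    all 1-character strings, then all 2-character strings, and so on.
    Each string therefore gets a unique natural-number index."""
    for length in count(1):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# First ten "programs" in the list: a, b, aa, ab, ba, bb, aaa, ...
gen = enumerate_programs()
for index in range(10):
    print(index, next(gen))
```

So every possible program lands at some index, which I can see gives the bijection with the integers. It's the jump from there to the conclusion that loses me.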
What exactly is this saying? Why does it hold? It seems like it shouldn't. In some cases a program of only 30-40 characters can compute billions of numbers! Maybe with current hardware it isn't possible to compute some numbers, but theoretically this shouldn't be a problem, right?
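From what I've found elsewhere, the "more functions than programs" step seems to rest on a diagonal argument. Here's my attempt at the idea in code; the list `fs` is just a hypothetical finite stand-in, since the real argument ranges over the whole infinite list of programs:

```python
def diagonal(functions):
    """Given any (indexable) list of functions from naturals to
    naturals, build a new function that differs from the i-th
    function at input i, so it can't equal any function in the list."""
    def g(i):
        return functions[i](i) + 1
    return g

# Hypothetical stand-in for "the list of all programs":
fs = [lambda n: n, lambda n: n * n, lambda n: 2 ** n]
g = diagonal(fs)
for i, f in enumerate(fs):
    print(i, f(i), g(i))   # g(i) != f(i) for every index i
```

If the list contained every function a program can compute, then g would be a function that no program computes. Is that the intended reading of the passage?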