Recently I found this post on Reddit. It describes the following algorithm to find e:
Here is an example of e turning up unexpectedly. Select a random number between 0 and 1. Now select another and add it to the first. Keep doing this, piling on random numbers. How many random numbers, on average, do you need to make the total greater than 1? The answer is e.
This means that, on average, you need about 2.718 random real numbers to make the sum greater than 1.
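To check this numerically, here is a quick Monte Carlo sketch in Python (my own code, not from the Reddit post; the function name and trial count are just illustrative choices):

```python
import random

def draws_until_sum_exceeds_one() -> int:
    """Count how many Uniform(0, 1) draws are needed before the running sum exceeds 1."""
    total = 0.0
    count = 0
    while total <= 1.0:
        total += random.random()  # one uniform draw on [0, 1)
        count += 1
    return count

# Average the count over many trials; the mean should approach e ≈ 2.71828.
trials = 1_000_000
estimate = sum(draws_until_sum_exceeds_one() for _ in range(trials)) / trials
print(f"average draws needed: {estimate:.5f}")
```

Running this prints a value close to 2.718, matching the claim.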
However, a random number between 0 and 1 has an expected value of 0.5, so intuitively I would think that, on average, only 2 random numbers should be required to get a sum > 1.
So where did I go wrong in my thinking?
Update
I just figured it out myself: you always need at least two numbers to get a sum > 1 (a single draw can never exceed 1), but often you'll need three, sometimes four, sometimes five, and so on. The count is never below 2 but is frequently above it, so it's only natural that the average number of draws required is above 2.
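For completeness, here is the standard derivation (not from the original Reddit post) of why the average is exactly e. The probability that the first $n$ draws still sum to less than 1 equals the volume of the $n$-dimensional simplex:

$$P(U_1 + U_2 + \dots + U_n < 1) = \frac{1}{n!}.$$

If $N$ is the number of draws needed to push the sum past 1, then

$$E[N] = \sum_{n=0}^{\infty} P(N > n) = \sum_{n=0}^{\infty} \frac{1}{n!} = e.$$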
Thanks for the replies!