Heuristically, I suppose we could see this as coming from the fact that the standard action of the complex numbers on $V = \mathbb R^{2n}$ is by rotation. That is, if $(e_1, \ldots, e_{2n})$ is a basis for $V$, then we define multiplication by $i$ by
$ i e_{2k-1} = e_{2k}, \quad i e_{2k} = - e_{2k-1},$ for $k = 1, \ldots, n$,
so multiplying a vector $v$ by a complex number $\lambda$ corresponds to scaling by the modulus $|\lambda|$ together with a rotation by the argument of $\lambda$.
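As a sanity check of this identification (a sketch, assuming numpy; the names `J`, `real_action`, `complex_action` are mine), we can write multiplication by $i$ as a real $2n \times 2n$ matrix $J$ for $n = 2$, verify that $J^2 = -I$, and check that $a I + b J$ acts exactly like multiplication by $a + bi$ under the identification $(x_1, x_2, x_3, x_4) \leftrightarrow (x_1 + i x_2,\; x_3 + i x_4)$:

```python
import numpy as np

# Complex structure J on R^4 (n = 2): J e_{2k-1} = e_{2k}, J e_{2k} = -e_{2k-1}.
# Each column of J is the image of the corresponding basis vector.
J = np.zeros((4, 4))
for k in range(2):
    J[2*k + 1, 2*k] = 1.0   # J e_{2k-1} = e_{2k}
    J[2*k, 2*k + 1] = -1.0  # J e_{2k} = -e_{2k-1}

# J squares to -I, just like i.
assert np.allclose(J @ J, -np.eye(4))

# Multiplication by lambda = a + bi acts on R^4 as aI + bJ, matching
# complex multiplication on C^2 under (x1, x2, x3, x4) <-> (x1+ix2, x3+ix4).
a, b = 1.0, 2.0  # arbitrary sample values
v = np.array([1.0, 2.0, 3.0, 4.0])
real_action = a * v + b * (J @ v)
complex_action = (a + 1j * b) * np.array([1 + 2j, 3 + 4j])
assert np.allclose(real_action[0::2] + 1j * real_action[1::2], complex_action)
```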
Now, if we have a rotation $A$ on the space $V$ and we want to find a line $l$ "invariant" under $A$, we can look for a complex number $\lambda$ such that the action of $A$ on $l$ coincides with the action of $\lambda$ on $l$. Thus we are led to look for complex eigenvalues $\lambda$ of $A$.
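To see this heuristic in action (a sketch, assuming numpy), take a planar rotation by an angle $\theta$: it fixes no real line when $\theta$ is not a multiple of $\pi$, yet it has the complex eigenvalues $e^{\pm i\theta}$, whose eigenvectors are exactly the "complex lines" on which $A$ acts as multiplication by $\lambda$:

```python
import numpy as np

theta = 0.7  # sample angle, not a multiple of pi
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# No real eigenvalues, but two complex ones: e^{i theta} and e^{-i theta}.
eigvals, eigvecs = np.linalg.eig(A)
assert np.allclose(sorted(eigvals, key=lambda z: z.imag),
                   [np.exp(-1j * theta), np.exp(1j * theta)])
```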
This line of "reasoning" completely breaks down in the odd-dimensional case, because we can't define a complex structure on an odd-dimensional space, but it might give a hint as to why we'd look for complex eigenvalues at all. Then it becomes a matter of algebra to figure out that it actually works, and that it does so in a vector space of any finite dimension.
Finally, for the existence, as Alex already pointed out, we look for eigenvalues by finding roots of the characteristic polynomial. Every nonconstant polynomial has a root over the complex numbers (the fundamental theorem of algebra), which translates into the existence of a complex eigenvalue.
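As a concrete check of the existence argument, even in the odd-dimensional case (a sketch, assuming numpy): a rotation of $\mathbb R^3$ about the $z$-axis has characteristic polynomial of degree $3$, which by the fundamental theorem of algebra has three complex roots, namely $e^{i\theta}$, $e^{-i\theta}$, and $1$ (the rotation axis is a real eigenvector, as always in odd dimensions):

```python
import numpy as np

theta = 1.2  # sample angle; rotation about the z-axis in R^3
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# np.poly(A) returns the coefficients of the characteristic polynomial
# det(tI - A); np.roots finds its (complex) roots.
coeffs = np.poly(A)
roots = np.roots(coeffs)

# Three roots, as the fundamental theorem of algebra guarantees:
# e^{i theta}, e^{-i theta}, and the real eigenvalue 1 along the axis.
expected = [np.exp(1j * theta), np.exp(-1j * theta), 1.0 + 0j]
assert len(roots) == 3
for r in roots:
    assert any(abs(r - e) < 1e-8 for e in expected)
```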