I'm trying to understand the intuitions behind working with very large numbers. Specifically, I'm talking about numbers of the form $a^b$ where $a > 10{,}000$ and $b > 10{,}000$; the bases and exponents themselves are small enough to write down, but $a^b$ can have millions of digits. Obviously, computing the full decimal expansion is infeasible, but I would still like to determine things about such numbers.
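(To make the sizes concrete: if I'm not mistaken, $a^b$ has $\lfloor b\log_{10} a\rfloor + 1$ digits, so even $\left(10^5\right)^{10^5} = 10^{500{,}000}$ already has $500{,}001$ digits.)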
I am particularly interested in ideas or rules for deciding which of two such numbers is larger. I believe I want to use logarithms somehow, but they've always confused me a little, so I can't quite see where to go from there.
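For concreteness, here is a rough sketch of the kind of comparison I have in mind (`compare_powers` is just my own made-up helper, and I'm assuming that comparing $b\ln a$ with $d\ln c$ is the right idea):

```python
import math

def compare_powers(a, b, c, d):
    """Compare a**b with c**d without computing either number,
    by comparing b*log(a) with d*log(c) instead."""
    lhs = b * math.log(a)
    rhs = d * math.log(c)
    if lhs > rhs:
        return ">"
    elif lhs < rhs:
        return "<"
    else:
        # Too close to tell at floating-point precision.
        return "=?"

# e.g. which is larger, 12345**54321 or 54321**12345?
print(compare_powers(12345, 54321, 54321, 12345))  # prints ">"
```

I'm not sure what to do when the two logarithms come out extremely close, which is part of why I'm asking.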
More generally, I'd love to hear a broader discussion of how one can work with such large numbers easily, sacrificing precision but not accuracy.