
Note: logs below are base 2. (Not sure how to do subscripts here)

I'm wondering whether the equation below is true in the asymptotic sense (computer science):

$\log_2\left((n!)^n\right) = O\left(n \sin\left(\frac{n\pi}{2}\right) + \log_2 n\right)$

But I'm not sure how to compute this.

I'm guessing we need to take the log of both sides of the following equation:

$\log_2\left((n!)^n\right) < n \sin\left(\frac{n\pi}{2}\right) + \log_2 n$

getting us:

$\log_2\left(\log_2\left((n!)^n\right)\right) < \log_2\left(n \sin\left(\frac{n\pi}{2}\right) + \log_2 n\right)$

Not sure where to go from there.
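One way to at least compute both sides numerically (a quick sanity check only, not a proof; the function names are mine) is to use the identity $\log_2\left((n!)^n\right) = n \log_2(n!)$ and tabulate a few values:

```python
import math

def lhs(n):
    # log2((n!)^n) = n * log2(n!) = n * sum_{k=1}^{n} log2(k),
    # avoiding the huge integer n! itself.
    return n * sum(math.log2(k) for k in range(1, n + 1))

def rhs(n):
    # right-hand side of the conjectured bound
    return n * math.sin(n * math.pi / 2) + math.log2(n)

for n in (2, 4, 8, 16):
    print(n, lhs(n), rhs(n))
```

A table like this only shows how the two sides behave for small $n$; it cannot by itself settle a big-O claim.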

  • 1
    You should know that all logs have the same big-O behaviour, as they differ only by a constant factor. Secondly, an underscore (within math mode, that is, between $ signs) will get you a subscript. (2010-09-08)
  • 2
    Also, I'd try to use Stirling's approximation for $n!$ and the fact that $\log(n!^n) = n\log(n!)$, and try to work it from there. Or something like that. (2010-09-08)
  • 0
    Thanks, I tried to make it look nicer with your suggestions... hope it helps the readability. (2010-09-08)

1 Answer


If you are interested in computer science, a very useful thing to remember is that

$$ \log n! = \Theta(n \log n) $$

Now, given this, can you tell what $f(n)$ is if $\log(n!^{n}) = \Theta(f(n))$?

How would that compare to the right hand side?
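As a numerical illustration of that fact (a sketch only, not a proof; the helper name is mine): the ratio $\log_2(n!)\,/\,(n \log_2 n)$ stays bounded between fixed positive constants, consistent with $\log n! = \Theta(n \log n)$.

```python
import math

def log2_factorial(n):
    # Exact log2(n!) as a sum of log2(k), avoiding the huge integer n!
    return sum(math.log2(k) for k in range(1, n + 1))

# The ratio log2(n!) / (n log2 n) stays within constant bounds.
for n in (10, 100, 1000, 10000):
    print(n, log2_factorial(n) / (n * math.log2(n)))
```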

  • 0
    $\Theta(n \log(n!))$ -- but in my equation we need to take the log of that again, which gives us $\log(n \log(n!))$... any tips on where to go from there? (2010-09-08)
  • 0
    @Sev: Why do you need to take the log again? What is $\log(a^n)$? (2010-09-08)
  • 0
    Because originally it was $\log(\log((n!)^n))$ -- now I see that $\log(n!^n)$ is $n \log(n!)$, but what about the outer log? (2010-09-08)
  • 0
    @Sev: If $\log(n!) \sim n\log n$, what is $n\log(n!)$? What is the log of that? (2010-09-08)
  • 0
    @Moron: $n\log(n!)$ must be $n \cdot n\log n$, which is $n^2 \log n$? (2010-09-08)
  • 0
    @Sev: Right. What is $\log(n^2 \log n)$ then? (2010-09-08)
  • 0
    @Moron: I guess $2\log n + \log(\log n)$ -- not sure how to simplify the second term. (2010-09-08)
  • 0
    @Sev: You can't simplify the second term, but you can ignore it. Why? (2010-09-08)
  • 0
    @Moron: Ah ha! Because the first term dominates it. So this helped me simplify the left side down to $2\log n$; now I have to simplify the right side (which has been done before, and it is $O(n)$)... so I'm guessing the equation is true... am I right? (2010-09-08)
  • 0
    What confuses me about the right side is that it oscillates, due to the sin function combined with odd/even values of $n$. Do we take the MAX value it can have, and that's why it is $O(n)$, and that's all we look at? If so, I'm guessing it's because $O$ is an upper bound, so it is the highest the function on the left can be. Let me know if I'm right, please. (2010-09-08)
  • 0
    @Sev: Right about dominance. The right side is $O(n)$, but that is an upper bound. That does not help you prove anything about the left side. For example, $\log(\log n)$ is $O(n)$ too (as that is an upper bound), but it is not true that $2\log n = O(\log(\log n))$. To prove the equation, you need to show that there are constants $C$ and $M$ such that $2\log n < C(n\sin(n\pi/2) + \log n)$ for all $n > M$. (2010-09-08)
  • 0
    I would guess there is no such constant that would do that, since sometimes the right-hand side will be negative, and thus less than $2\log n$. True? (2010-09-08)
  • 0
    @Sev: Depends on your definition of Big-Oh. Instead of $f(n)$, people consider the absolute value $|f(n)|$. Usually in computer science we expect the functions to be non-negative, as negative running times make no sense. Mathematically speaking, you need to consider absolute values. So you have to show $|2\log n| < C\,|n\sin(n\pi/2) + \log n|$ for all $n > M$. (2010-09-08)
  • 0
    The absolute value of the sin function will oscillate down to 0, therefore we can ignore it and just look at the $\log n$ on the right side. We can choose $C = 3$, and our right side will be $3\log n$ vs. $2\log n$ on the left... making the statement false? (2010-09-08)
  • 0
    @Sev: $n\sin(n\pi/2)$ can be one of three values: $0$, $n$, or $-n$. So you can't ignore it. Choosing $C = 3$ actually proves that the statement is true. Remember, we have been talking about Big-Oh all the while, where $2n$ is the same as $1000000n$ (i.e. constants don't matter). That is one of the reasons why the base of the logarithm is irrelevant too. (2010-09-08)
  • 0
    So the statement is true if $n\sin(n\pi/2)$ is $0$ and $C = 3$, I can see that... but I don't see how it's true if $n\sin(n\pi/2)$ is $-n$ and we choose $C = 3$? That would make the left side $2\log n$ and the right side $-n + \log n$ -- how is that greater than $2\log n$? (2010-09-08)
  • 0
    Oooh, absolute value... right. So this statement is true. (2010-09-08)
  • 0
    So it seems the course I'm following states I shouldn't use absolute values here... weird. (2010-09-09)
  • 0
    @Sev: Yeah, unfortunately there could be different definitions of Big-Oh out there! (2010-09-09)
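To close the loop on the comment thread, the witness proposed there ($C = 3$, with absolute values) can be spot-checked numerically over a range of $n$. This is a sanity check only, not a substitute for the case analysis on $n \bmod 4$; the function name is mine:

```python
import math

def holds(n, C=3):
    # Does |2 log2 n| <= C * |n sin(n pi/2) + log2 n| hold for this n?
    left = abs(2 * math.log2(n))
    right = C * abs(n * math.sin(n * math.pi / 2) + math.log2(n))
    return left <= right

# For even n the sin term vanishes and the right side is 3 log2 n;
# for odd n it contributes +/- n, which dominates in absolute value.
print(all(holds(n) for n in range(2, 10001)))
```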