Let $n$ be a positive integer and let $n!$ denote the factorial of $n$. Let $d = \gcd(n! + 1, (n + 1)! + 1)$. Show that $d$ divides $n$. (Hint: notice that $(n+1)(n!+1) = (n+1)! + n + 1$.)
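A quick numerical sanity check of the claim (a minimal sketch in Python, not part of the proof; the bound 20 is an arbitrary choice):

```python
from math import factorial, gcd

# Check that d = gcd(n! + 1, (n+1)! + 1) divides n for small n.
for n in range(1, 20):
    d = gcd(factorial(n) + 1, factorial(n + 1) + 1)
    assert n % d == 0, f"claim fails at n={n}: d={d}"
print("d divides n for every n checked")
```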
How to show that $\gcd(n! + 1, (n + 1)! + 1) \mid n$?
elementary-number-theory
divisibility
factorial
greatest-common-divisor
- Please write the question in the question and not in the title. – 2010-10-09
- Hmm, what does the question have to do with linear algebra? – 2010-10-10
- @Rasmus: There is innate linearity in that a set of common multiples is closed under integral linear combinations, which is the key to this problem (a worked instance appears after the comments). This leads to the notion of an ideal (R-module) in a ring as a generalization of a set of common multiples. Because of this, linear algebra (module theory) plays a big role in number theory. – 2010-10-10
- @Bill: Interesting, thanks! – 2010-10-10
- See also [Proof that (n!+1,(n+1)!+1)=1](http://math.stackexchange.com/q/25688). – 2016-10-22
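To make the linear-combination idea from the comment above concrete in the question's own notation (a sketch of the key step, assuming the hint's identity):

```latex
% Since d divides both n!+1 and (n+1)!+1, it divides every
% integer linear combination of them. The hint's identity
% (n+1)(n!+1) = (n+1)! + n + 1 picks out exactly n:
\[
  d \mid (n+1)(n!+1) - \bigl((n+1)! + 1\bigr)
      = \bigl((n+1)! + n + 1\bigr) - \bigl((n+1)! + 1\bigr)
      = n .
\]
```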