1: Prove that two nonzero (integer) perfect squares cannot differ by 1.

 Dec 27, 2021
 #1

First, express the difference between the two squares in terms of how far apart their square roots are; we will see that a difference of exactly 1 would force one of the square roots to be zero. The argument goes as follows:

 

Let the two squares be \(a^2\) and \(b^2\), where \(a\) and \(b\) are nonzero integers. Since \((-n)^2 = n^2,\) we may assume without loss of generality that \(a\) and \(b\) are positive and that \(b \geq a,\) so we can write \(b\) as \(a+k\) for some integer \(k.\) Computing the difference, we get \(b^2-a^2=(a+k)^2-a^2=a^2+2ak+k^2-a^2=2ak+k^2.\) For this difference to be nonzero, we need \(k \neq 0.\)
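
As a quick sanity check of this formula, take for instance \(a = 2\) and \(k = 1,\) i.e. the squares \(4\) and \(9\):

\[3^2 - 2^2 = 9 - 4 = 5 = 2\cdot 2\cdot 1 + 1^2 = 2ak+k^2.\]

Note that even with consecutive square roots (\(k = 1\)) the difference is \(2a+1,\) which is already at least \(3\) once \(a \geq 1\); the argument below makes this precise.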

 

Now suppose, for the sake of contradiction, that the difference \(2ak+k^2\) equals 1. Factoring gives \(k(2a+k)=1.\) Since \(k\) and \(2a+k\) are both integers, and the only ways to write \(1\) as a product of two integers are \(1 \cdot 1\) and \((-1)\cdot(-1),\) we must have either \(k=2a+k=1\) or \(k=2a+k=-1.\)
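
For reference, since \(k = b-a\) and \(2a+k = a+b,\) this is just the familiar difference-of-squares factorization written in the original variables (a restatement of the step above, not an extra ingredient of the proof):

\[b^2-a^2=(b-a)(b+a)=k(2a+k)=1.\]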

 

If \(k=2a+k=-1,\) then substituting \(k=-1\) gives \(2a-1=-1,\) so \(a=0,\) which contradicts our assumption that \(a\) is nonzero.

If \(k=2a+k=1,\) then substituting \(k=1\) gives \(2a+1=1,\) so again \(a=0,\) a contradiction.
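
As a side observation (not needed for the proof itself), the contradiction lands exactly on the excluded case: \(a=0\) corresponds to the pair of squares \(0\) and \(1,\) and indeed

\[1^2-0^2=1,\]

so \(0\) and \(1\) are the only perfect squares that differ by \(1,\) which is why the hypothesis that both squares are nonzero is essential.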

 

Thus, two nonzero squares \(a^2\) and \(b^2\) cannot differ by 1.

 Dec 27, 2021
