
I found myself thinking about this for a bit. If you have any equation x = y, then multiplying both sides by 0 gives 0 = 0, which is true, so it seems x = y must hold for any x and y. That's obviously wrong, but I can't seem to find a concrete reason why. Could someone explain? (e.g. start with 1 = 2; multiplying both sides by 0 gives 0 = 0, which is true, so 1 = 2.)

 Nov 8, 2018
edited by Guest  Nov 8, 2018
 #1

Maybe because multiplying by 0 is not reversible: to undo it you would have to divide by 0, which is undefined.

 Nov 8, 2018
 #2

\(\text{to expand on this a tiny bit, suppose we start with}\\ 0\cdot x = 0\cdot y\\ \text{being specific about the algebraic step you mention, we multiply both sides by }0^{-1}\\ 0^{-1} \cdot 0 \cdot x = 0^{-1}\cdot 0 \cdot y \Rightarrow 1\cdot x = 1\cdot y \Rightarrow x = y\\ \text{where }0^{-1} \text{ is the multiplicative inverse of 0}\\ \text{but as Guest pointed out, }0^{-1} \text{ does not exist, and thus the fallacy}\)

Rom  Nov 8, 2018
 #3

I think we will all agree that the initial assumption of 1 = 2 is untrue.

While it IS true that 1 * 0 = 2 * 0, you would have to eliminate the '0' on both sides of the equation to prove 1 = 2....to do that would require that you DIVIDE BOTH SIDES OF THE EQUATION BY ZERO....and we all know that DIVISION BY ZERO IS NOT ALLOWED/DEFINED.
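The "not reversible" point can be sketched in a few lines of Python (an illustrative sketch, not from the thread): multiplication by 0 sends every input to 0, so the map is many-to-one and has no inverse, and Python itself refuses the division-by-zero step the fallacy would need.

```python
def times_zero(x):
    """Multiplying any number by 0 sends it to 0."""
    return x * 0

# Distinct inputs collapse to the same output, so the map is many-to-one:
assert times_zero(1) == 0
assert times_zero(2) == 0

# A many-to-one map has no inverse, which is why "dividing by zero"
# to recover x from x * 0 is undefined -- Python raises an error:
try:
    inverse = 1 / 0
except ZeroDivisionError:
    print("cannot divide by zero")
```

So 1 * 0 = 2 * 0 is true, but nothing lets you cancel the 0 to conclude 1 = 2.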

 Nov 8, 2018
 #4

I see now, thanks!

 Nov 9, 2018
