I found myself thinking about this for a bit. Take any equation x = y. Multiplying both sides by 0 gives 0 = 0, which is true, so x = y must be true. That's obviously wrong, but I can't seem to find a concrete reason why. Could someone explain? (E.g. start with 1 = 2; multiplying both sides by 0 gives 0 = 0, so 1 = 2.)

Guest Nov 8, 2018

edited by Guest Nov 8, 2018
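
Written out as implications, the step under question only goes one way; a sketch:

\(x = y \;\Rightarrow\; 0\cdot x = 0\cdot y \;\Leftrightarrow\; 0 = 0, \qquad\text{but}\qquad 0 = 0 \;\not\Rightarrow\; x = y\)

Reading the chain from right to left is the fallacy: a true statement like \(0 = 0\) can follow from a false one like \(1 = 2\), so the truth of the conclusion says nothing about the starting equation.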

#1 (+1)

Maybe because multiplying by 0 is not reversible: dividing by 0 is undefined, so the step can't be undone.

Guest Nov 8, 2018

#2 (+1)

\(\text{To expand on this a tiny bit, suppose we start with}\\ 0\cdot x = 0\cdot y.\\ \text{Being specific about the algebraic step you mention, we multiply both sides by } 0^{-1}\text{:}\\ 0^{-1} \cdot 0 \cdot x = 0^{-1}\cdot 0 \cdot y \Rightarrow 1\cdot x = 1\cdot y \Rightarrow x = y,\\ \text{where }0^{-1} \text{ is the multiplicative inverse of }0.\\ \text{But as Guest pointed out, }0^{-1} \text{ does not exist, and thus the fallacy.}\)

Rom Nov 8, 2018
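
For contrast, the same cancellation step with a nonzero factor is valid; a sketch, where the key assumption is \(c \neq 0\) so that \(c^{-1}\) exists:

\(c\cdot x = c\cdot y \;\Rightarrow\; c^{-1}\cdot c\cdot x = c^{-1}\cdot c\cdot y \;\Rightarrow\; x = y \qquad (c \neq 0)\)

For example, \(3x = 3\cdot 5\) does give \(x = 5\), since \(3^{-1} = \tfrac{1}{3}\) exists; the chain breaks only at \(c = 0\).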

#3 (0)

I think we will all agree that the initial assumption of 1 = 2 is untrue.

While it IS true that 1 * 0 = 2 * 0, you would have to eliminate the 0 on both sides of the equation to get back to 1 = 2. To do that would require dividing both sides of the equation by zero, and division by zero is not allowed/defined.

ElectricPavlov Nov 8, 2018
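
For completeness, a standard one-line argument for why \(0^{-1}\) cannot exist, and hence why division by zero is left undefined: since \(0\cdot a = 0\) for every \(a\),

\(\text{if } 0^{-1} \text{ existed, then } 1 = 0\cdot 0^{-1} = 0,\)

a contradiction. So no number can serve as the multiplicative inverse of 0.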