Let's define two "infinite numbers", \(a\) and \(b\):
\(a = \left( \dfrac{1}{0} \right) + 1\)
\(b = \dfrac{1}{0}\)
Now we subtract them, assuming \(\left( \dfrac{1}{0} \right) = \infty\)
and \(\infty - \infty = 0\):
\(a - b = \left( \infty + 1 \right) - \left( \infty \right) = 1\)
\(b - b = \infty - \infty = 0\)
Since \(\infty + 1 = \infty\), the first expression is also \(\infty - \infty\). Simplifying yields
\(\infty - \infty = 1 \\ \infty - \infty = 0 \\ 1 = 0\)
which is false.
Hence \(\infty - \infty\) cannot consistently equal any single value; it is undefined.
Do you think this proof is valid?
\(\dfrac{1}{0}\) is undefined.

\(\infty - \infty\) is meaningless; you can manipulate it to take any value you like. It certainly doesn't equal 0.

The whole "proof" is nonsense.
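As an illustration (not a proof): IEEE 754 floating-point arithmetic, which does define a signed infinity, draws exactly this distinction. Subtracting infinity from infinity yields NaN ("not a number", the marker for an indeterminate form), and exact integer division by zero in Python is simply an error rather than infinity.

```python
import math

inf = float("inf")

# IEEE 754 defines inf - inf as NaN, the indeterminate-form marker
print(inf - inf)              # nan
print(math.isnan(inf - inf))  # True

# NaN compares unequal to every value, so inf - inf is neither 0 nor 1
print((inf - inf) == 0)       # False
print((inf - inf) == 1)       # False

# Exact division by zero raises an error; it does not produce infinity
try:
    1 / 0
except ZeroDivisionError as exc:
    print("1/0 raises:", exc)
```

In other words, the one widely used arithmetic system that assigns a value to \(\infty\) refuses to assign one to \(\infty - \infty\).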
Well, infinity plus infinity (if you were treating infinity as a number) would make a bigger number altogether, therefore making infinity indefinable. The same goes for multiplication. Everything divided by itself equals 1, so that is definable; the same with subtraction, which equals 0. The last two could be indefinable, as infinity is forever getting bigger.
Well, infinity plus infinity (if you were treating infinity as a number) would make a bigger number altogether,
No.
The string of symbols
\(\infty + \infty > \infty\)
is meaningless. You cannot do arithmetic with infinity, at least not with the ordinary rules of the real numbers.
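For what it's worth, in the systems that do give some of these expressions a meaning (the extended reals, or IEEE 754 floats), \(\infty + \infty\) is defined by convention to equal \(\infty\), not something "bigger", while the forms debated above (\(\infty - \infty\), \(\infty / \infty\), \(\infty \cdot 0\)) remain undefined. A quick sketch with Python floats, purely as an illustration of that convention:

```python
import math

inf = float("inf")

# IEEE 754 defines inf + inf = inf; there is nothing "bigger"
print(inf + inf == inf)       # True
print(inf + inf > inf)        # False

# multiplication by a positive number likewise saturates at inf
print(inf * 2 == inf)         # True

# but the indeterminate forms all produce NaN, i.e. stay undefined
print(math.isnan(inf - inf))  # True
print(math.isnan(inf / inf))  # True
print(math.isnan(inf * 0))    # True
```

So even where infinity is admitted as a symbol, "everything divided by itself equals 1" and "subtraction equals 0" do not survive contact with it.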