The answer is that the limit of \(1/x\) tends to negative infinity when \(x\) approaches zero from the negative direction, and to positive infinity when it approaches from the positive direction:
\(\lim_{x\rightarrow 0^-} \frac{1}{x}=-\infty\)
\(\lim_{x\rightarrow 0^+} \frac{1}{x}=+\infty\)
You can't have a single answer because it depends on whether you approach it from the negative side or the positive side.
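Here's a small worked illustration (simple examples of my own, not from the original question) of why a \(0/0\) form has no single value either: different expressions whose numerator and denominator both go to \(0\) can have completely different limits.
\(\lim_{x\rightarrow 0} \frac{x}{x}=1\)
\(\lim_{x\rightarrow 0} \frac{2x}{x}=2\)
\(\lim_{x\rightarrow 0^+} \frac{x}{x^2}=+\infty\)
The value you "get" out of a \(0/0\) form depends entirely on how the numerator and denominator approach zero, which is why it is called an indeterminate form.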
EDIT:
\(\lim_{x\rightarrow 0^+} \frac{1}{x}=+\infty\)
\(\lim_{x\rightarrow 0^-} \frac{1}{x}=-\infty\)
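Just to make those limits concrete, here are a few sample values (chosen only for illustration): as \(x\) gets closer to \(0\), \(1/x\) blows up in the matching direction.
\(\frac{1}{0.1}=10,\quad \frac{1}{0.01}=100,\quad \frac{1}{0.001}=1000\)
\(\frac{1}{-0.1}=-10,\quad \frac{1}{-0.01}=-100,\quad \frac{1}{-0.001}=-1000\)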
Thanks for all the answers! (even the mean ones)
I now realize this is not an actual answer to the question, but it still sheds a bit of light on why you can't (technically) divide by zero in linear algebra.
Kudos to CPhill for giving the actual answer