In ordinary arithmetic, the expression a/0 has no meaning, as there is no number which, multiplied by 0, gives a (assuming a ≠ 0), and so division by zero is undefined. Since any number multiplied by zero is zero, the expression 0/0 also has no defined value; when it arises as the form of a limit, it is an indeterminate form.
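A minimal sketch (plain Python, no libraries) of both points above: exact 0/0 has no value, and a limit of "0/0 form" can come out to be any number, which is why the form is called indeterminate:

```python
# 1) Python refuses to assign a value to 0/0 in exact arithmetic.
try:
    0 / 0
except ZeroDivisionError:
    print("0/0 is undefined")

# 2) f(x) = (a*x)/x has the 0/0 form at x = 0, yet its limit there is a,
#    for whatever a we pick, so the 0/0 form alone determines nothing.
for a in (1.0, 2.0, -7.0):
    x = 1e-9  # a point close to 0
    print(a, (a * x) / x)  # each ratio comes out equal to a
```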
Or here is a harder way to explain it (with a proof):
0/0 = (1/0) ∗ 0 = 0 (because no matter what 1/0 is, zero of it is nothing - logically, and logic is the foundation of mathematics)
Say you have zero cookies, and you split them among zero friends. How many cookies does each friend get? Zero. (This seems obvious to me, but maybe not to others. Either way, my first proof still stands.)
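The cookie picture can be sketched as a tiny function (a hypothetical helper; `split_cookies` is my name, not anything standard, and it encodes this answer's convention that 0/0 = 0 while a/0 for a ≠ 0 stays undefined):

```python
def split_cookies(cookies: int, friends: int) -> int:
    """Cookies per friend, under this answer's convention for zero friends."""
    if friends == 0:
        if cookies == 0:
            return 0  # the answer's convention: zero cookies, zero friends -> zero each
        raise ZeroDivisionError("a/0 with a != 0 has no value")
    return cookies // friends

print(split_cookies(0, 0))  # -> 0
print(split_cookies(6, 3))  # -> 2
```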
A proof that zero divided by zero is indeterminate is commonly given. However, I can explain why that proof is wrong.
Common proof: “0 ∗ 1 = 0 → 1 = 0/0, but 0 ∗ 2 = 0 → 2 = 0/0, and then 2 = 1; so 0/0 = indeterminate.”
The error is going from 0 ∗ a = 0 to a = 0/0. Because:
a ∗ 0/0 = 0/0 - this step is commonly ignored. However, this step is essential.
It is commonly assumed that the first 0/0 cancels out. However, x/x can only cancel out if x/x = 1. Therefore, going from this to a = 0/0 requires 0/0 = 1, which a) assumes a definite value for 0/0 and b) contradicts the conclusion that it is indeterminate.
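The cancellation condition can be checked directly (a small Python sketch; note that IEEE floating-point division returns exactly 1.0 for x/x whenever x is finite and nonzero):

```python
# The cancellation x/x -> 1 is valid exactly when x != 0:
for x in (3.0, -2.5, 1e-12):
    assert x / x == 1.0  # holds for every finite nonzero x

# There is no x = 0 case to fall back on: 0/0 has no value, so it cannot
# equal 1, and the cancellation that would give a = 0/0 is not justified.
try:
    0 / 0
except ZeroDivisionError:
    print("x/x = 1 does not extend to x = 0")
```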
We don't know... some people say it's 0, 1, or undefined...