
So, why on earth can't you divide by zero?

A mathematical question

By Timothy J Carrier · Published 3 years ago · 7 min read

Once upon a time, if you asked Siri on an Apple phone, "What is zero divided by zero?", it would answer:

"Imagine that you have zero cookies and you split them evenly among zero friends. How many cookies does each person get? See? It doesn't make sense. And Cookie Monster is sad that there are no cookies, and you are sad that you have no friends."

Putting aside this hurtful answer (if you had friends, would you be asking Siri? Hmph!), dividing by zero really does puzzle a lot of people. Ten divided by two is five, six divided by three is two, but one divided by zero is what? Elementary school math tells you that you simply can't divide by zero. But why? Zero is a number too; what is so special about it?

01 Primary School

In elementary school arithmetic, the problem was very simple. Back then, division was defined as "splitting something into parts": it is easy to imagine splitting into one, two, three, or seven parts, but how would you split 10 cookies among 0 people? You can't imagine it! So you can't divide.

Astute students may object: giving 0 cookies to 0 people seems fine, since there is nothing to give out. But precisely because there is nothing and no one, each person could receive any amount at all; no single definite value can be assigned.

This conclusion is correct, but it is reached purely by intuition, and just because you can't imagine something doesn't mean it can't exist. Ancient mathematics was built on intuition, which was good enough for grocery shopping, but to go further we need definitions and proofs. So, off to middle school.

02 Junior High School

Here we are introduced to algebra at its most basic level, namely solving equations. We discover that division and multiplication are inverse operations, so asking 1 / 0 = ?

is equivalent to solving the equation 0 * x = 1

Well, by definition, 0 multiplied by any number is 0 and can never equal 1, so no x satisfies the equation; the division is impossible.

Similarly, if you ask 0 / 0 =?

it is equivalent to solving the equation 0 * x = 0

This time every number satisfies x, so again the division fails: no single answer can be determined.
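Both failure modes above are exactly what a programming language reports when you try. A minimal sketch in Python (the split into "no solution" vs. "every solution" lives in the math; Python collapses both into the same error):

```python
# Division is defined as the inverse of multiplication: "a / b" asks which x
# solves b * x = a.  For b = 0, either no x works (a != 0) or every x works
# (a == 0), so Python refuses both cases with the same exception.
for a in (1, 0):
    try:
        print(a / 0)
    except ZeroDivisionError as err:
        print(f"{a} / 0 -> ZeroDivisionError: {err}")
```

Running this prints a `ZeroDivisionError` line for both `1 / 0` and `0 / 0`.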

03 High School

By the time we are exposed to basic formal logic, we discover another kind of argument: proof by contradiction.

A set of true statements cannot lead to a false one. So if we take "you can divide by zero as usual" together with a bunch of true statements and derive something false, the only possible culprit is the divide-by-zero assumption.

So, we know that 0 * 1 = 0

0 * 2 = 0

we get 0 * 1 = 0 * 2

Dividing both sides by zero gives us ( 0 / 0 ) * 1 = ( 0 / 0 ) * 2

Simplify to get 1 = 2.

Which is clearly false.

So, problem solved! Except it isn't. Consider another question: what is the square root of -1?

You might say that -1 has no square root, because the square of any number is non-negative. But that is only true of the real numbers. What if we add a definition? Define i^2 = -1, and imaginary numbers are born; now -1 has a square root after all.
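This extension is so standard that Python ships with it built in: the imaginary unit i is written `1j`, and the `cmath` module does square roots over the complex numbers. A quick sketch:

```python
import cmath

# Once i is defined by i**2 = -1, negative numbers have square roots.
# Python's complex type builds this in: i is written 1j.
print(cmath.sqrt(-1))   # 1j
print((1j) ** 2)        # (-1+0j)
```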

So why not define a "new" number, declare 1 / 0 equal to it, and set up a system of arithmetic for it? For that answer, we have to go to university.

04 Freshman Year

In your first calculus course you immediately meet the symbol ∞: "infinity". We have all learned the concept of a limit, so why not let b tend to 0 and define a / b as that limit, namely infinity?

This immediately runs into a problem: the left and right limits disagree. Does b approach 0 from the negative side or the positive side? One way the quotient plunges toward negative infinity, the other way it soars toward positive infinity, and the two never meet. No such limit can be defined.
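You can watch the two sides fly apart numerically. A small sketch probing 1/b as b shrinks toward 0 from each side:

```python
# The right-side values blow up toward +infinity while the left-side values
# blow up toward -infinity, so the two one-sided limits disagree and
# lim_{b -> 0} 1/b does not exist.
for b in (0.1, 0.001, 0.00001):
    print(f"1/{b} = {1 / b:12.0f}    1/({-b}) = {1 / -b:12.0f}")
```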

That is why calculus courses repeat, over and over, that although the symbol ∞ is used, it only expresses a trend, not an actual number, and it cannot take part in arithmetic.

05 Sophomore Year

So, having learned my lesson, I won't touch the ready-made symbol. I will simply define 1 / 0 = w, where w is a brand-new "infinite" number that has nothing to do with limits. Now you have nothing to object to!

However, definitions are not free. You can define anything you like, but if the definition contradicts the rest of the existing system, it either fails outright or becomes very costly.

And w runs into trouble immediately. First of all, how does w fit into ordinary addition, subtraction, multiplication, and division? What is 1 + w? What is w - w? A number you cannot even do basic arithmetic with is not much use, is it?

For example, intuitively 1 + w should equal w: it's infinite! And w - w should equal 0: it's subtracting itself!

But this immediately contradicts the associative law of addition: 1 + ( w - w ) = 1 + 0 = 1, yet ( 1 + w ) - w = w - w = 0. Associativity is utterly basic to addition, and sacrificing it for the sake of w costs a great deal: not just the law itself, but every theorem that quietly relies on it in its proof. Throw it away and we must start over and build a new system. Building one is not impossible, but it is laborious and (for now) useless, so we honestly stick with the old system, and the old system, in order to preserve associativity, cannot allow this game.

Readers are welcome to use their imagination and try to write down arithmetic rules for w. You will find that no matter how you specify the relationship between w and the other numbers, as long as you insist that 1 / 0 = w, you cannot avoid contradicting the basic mathematics you grew up with. You can build a new mathematics on top of w, but it will be incompatible with most of traditional mathematics and will almost certainly work much less well, so it makes sense that we use a system in which division by zero is simply forbidden.
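Interestingly, IEEE-754 floating-point numbers (the floats in every mainstream language) do ship with such a "w", written `float('inf')`, and they dodge the contradiction above in a third way: rather than break associativity or declare w - w = 0, they refuse to give w - w any numeric value at all. A sketch in Python:

```python
import math

# IEEE-754 contains an "infinity" value, but w - w is not 0: it is NaN,
# "not a number", precisely because no consistent numeric answer exists.
w = float("inf")
print(1 + w)        # inf
print(w - w)        # nan
print(1 + (w - w))  # nan, not 1
print((1 + w) - w)  # nan, not 0
```

This is a design choice, not a resolution: NaN infects every computation it touches, which is the price of keeping an infinity-like value inside ordinary arithmetic.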

06 Junior Year

You might object: there are so many possible definitions, and have I really tried them all? If not, how do I know that a self-consistent one won't pop up some day?

Biology, chemistry, and physics see "new findings overturning old conclusions", but mathematics does not, because mathematics is built on logic: particular cases admit exceptions, logic does not. Of course, mathematics has not completed its final axiomatization and still faces Gödel's ghost, but in this case at least, if w were a number it would violate some very important axioms, and those axioms run very deep.

For example, there is a set of foundational axioms called the Peano axioms. One says that every natural number has a definite successor, which is also a natural number; another says that b = c if and only if the successor of b equals the successor of c.

So whose successor is w? In other words: what number, plus 1, gives w? Every other number already has its successor; there is no place left for w among them, and no number plus 1 becomes w. The only option would be 1 + w = w, but that directly contradicts the second axiom. And without the Peano axioms, the whole system of natural numbers collapses.

We assumed here that w is a natural number; the other cases are slightly more complicated, but something similar happens under every definition of w. Any number you try to invent for the job is incompatible with the real numbers we already have. So in almost every context we can only declare: you cannot divide by 0.

07 Senior Year and Beyond

Since we said "almost", there are exceptions: in a few peculiar settings, yes, you can.

For example, there is something called "complex infinity": a genuine, well-defined point on the extended complex plane. Under that particular convention you may write the expression 1 / 0 = ∞. The reason why is a long story, but it is not division in the usual sense; for instance, you cannot multiply back and write 1 = 0 * ∞.

Also, in some other contexts the word "infinity" can be treated as a "thing". When you measure the size of a set, for instance, it can be infinite, and there are many different kinds of infinity: the natural numbers, the rational numbers, and the real numbers are all infinite, yet the odds, the evens, the integers, the naturals, and the rationals are all exactly equally numerous, while the reals outnumber every one of them. All infinite, yet some infinities are more infinite than others. But that is a topic for another day, so let us stop here.
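"Equally numerous" for infinite sets means a perfect pairing (a bijection) exists. The pairing n → 2n matches every natural number with exactly one even number, which is why the evens are as numerous as all the naturals despite intuition. A finite window onto that infinite pairing:

```python
# The rule n -> 2n pairs each natural number with a distinct even number,
# missing none on either side; a bijection like this is what "equally
# numerous" means for infinite sets.
pairing = [(n, 2 * n) for n in range(6)]
print(pairing)   # [(0, 0), (1, 2), (2, 4), (3, 6), (4, 8), (5, 10)]
```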

08 Conclusion

So, when we say we can't divide by zero, the reasons turn out to be surprisingly good. Mathematics disproves many intuitions, but not this one: we have all sorts of rigorous ways to justify why division by zero cannot hold up. It may not sound as heart-warming (or heart-chilling) as Siri's answer, but there is something beautiful in these rational pleasures, isn't there?
