Question:
Why can't you divide by zero?
2009-06-12 01:10:11 UTC
Hi guys, I'm really not Math strong but I am a very philosophical person. Why is it that we can't divide by zero? Zero is considered a number even though it doesn't exist. So, what would happen if you divided by 0? A calculator will produce an error message.

In other words, if you take 6 and multiply it by 0, why is the answer 0? I mean it's like saying "6(nothing)" so shouldn't the answer be 6 because nothing was divided? I hope I'm not too confusing but I am a foreign language/political science person but this has always bothered me that I can't come up with an answer. Thanks!!!
Seventeen answers:
Ms. Worth
2009-06-12 01:56:31 UTC
The REASON for the answers to division problems is the fact that these problems are "reversible."



That is, when you multiply the answer back against the divisor, you get the original number back again.



So the REASON that 10 divided by 2 = 5 is BECAUSE 5 times 2 is 10.



So if you are looking for an answer to "6 divided by 0," you have to produce an answer that will return 6 when it is multiplied by the divisor of 0.



So dividing 10 by 2 is really a form of solving

2x = 10



In like fashion dividing 6 by 0 is a form of solving

0x = 6



The problem arises when it becomes apparent that there is no possible solution to the equation 0x = 6, because it asks "What number times 0 will equal 6?"



In one sense, "infinity" is a kind of answer -- that's how much you'd have to multiply 0 by to bring back 6.



But it's not really true. Infinity times 0 is not really 6.



And even if it were, that would be the same answer for any number divided by 0, not just 6.



So it's hopeless to try to come up with an answer.
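That last point is easy to check with a minimal Python sketch (illustrative only, using standard IEEE-754 floating point): multiplying "infinity" by 0 does not bring back 6, and an outright division by zero is simply refused.

import math

print(math.inf * 0)          # nan ("not a number"), not 6
try:
    print(6 / 0)
except ZeroDivisionError as err:
    print("6 / 0 ->", err)   # Python refuses outright: "division by zero"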



Beyond the philosophical considerations, division by zero can be a hidden factor in wrong solutions -- and not just in the realms of theoretical higher math.



There's a famous algebraic proof that 1 = 2 (but it depends on the unwitting step of dividing by a hidden zero). So in practice as well as in theory, division by zero makes even simple algebra come out wrong.

___________________

Here is the proof that one equals two.



Let a and b represent two numbers such that a equals b.

a = b



Multiply both sides by a

a² = ab



Subtract b² from both sides

a² - b² = ab - b²



Factor

( a + b )( a - b ) = b( a - b )



Divide both sides by ( a - b )

a + b = b



Substitute b in the place of its equal a

b + b = b



Combine like terms

2b = b



Divide both sides by b

2 = 1
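The hidden zero is easy to expose with a quick Python sketch (the particular value 3 is arbitrary; any pair of equal numbers behaves the same way):

a = b = 3
print(a*a - b*b)   # 0
print(a*b - b*b)   # 0
print(a - b)       # 0  <- the "divide both sides by (a - b)" step divides by this
# so (a*a - b*b) / (a - b) raises ZeroDivisionError instead of "proving" 1 = 2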
?
2009-06-12 08:39:43 UTC
First consider a simple example,

6/2=3. This means that 2*3=6.

So then,

6/0=x means that x*0=6.

But there is no number in our real number system which, when multiplied by 0, gives 6, is there? Of course not. This is why we can't divide by zero: there is no real-number answer (calling it "infinity" doesn't fix that, because infinity is not a real number).

Also note that,

As 1*0=0, 2*0=0, -1*0=0, (1/2)*0=0, (-1/7)*0=0, ...

0/0 = indeterminate (it can have infinitely many answers instead of one consistent answer).
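As an aside, IEEE-754 floating-point arithmetic encodes exactly this distinction. A small sketch, assuming NumPy is installed (plain Python raises an error in both cases):

import numpy as np

with np.errstate(divide='ignore', invalid='ignore'):
    print(np.float64(6.0) / np.float64(0.0))   # inf  (undefined: it "blows up")
    print(np.float64(0.0) / np.float64(0.0))   # nan  (indeterminate: no single answer)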



For a practical example, 6/2=3 means that if you have six apples and you divide all 6 of them among 2 friends, each gets 3 apples.

But 6/0 would mean you have to divide all 6 apples among 0 friends, which is simply not possible. You can't say each will have 0 apples; what would "each" even mean then? It is simply meaningless to divide anything among zero people.



I am trying to explain the "if you try to produce 6 nothings you get nothing, but if you have a 6 and multiply it by nothing, wouldn't you just have your 6 because you didn't change it?" part of the question.

Surely you know, 6+0=6, 6*0=0

Practically, when you eat 6 apples today and 0 apples tomorrow, you have eaten 6 apples, same as 6+0=6.

Now consider this: you eat 6 apples 0 times; in other words, if you never eat 6 apples, you eat a total of zero apples. That is why 6*0=0. If you had eaten 6 apples twice, you would have eaten 12 apples; that is why 6*2=12.

That is multiplication.
anonymous dude
2009-06-12 09:13:13 UTC
Division is defined to be a sort of inverse to multiplication. Given two numbers a and b, the quotient a/b is defined to be the unique number that when multiplied by b gives a. For example, to say that 6/2 = 3 means that 3 is the number which when multiplied by 2 gives 6, and to say that 2/(-4) = -1/2 is to say that -1/2 is the number which when multiplied by -4 gives 2. This notion of division is constructed to agree with our intuition that division should correspond to actually dividing numerical quantities (e.g. 6 apples divided into 2 piles gives 3 apples in each pile), but when this kind of intuition doesn't work (such as when we want to divide by negative numbers, 0, complex numbers, etc.) we need the abstract definition above to guide us.



So let's try to apply the definition to division by 0. 6/0 should be the unique number that when multiplied by 0 gives 6. But there is no such number: whenever you multiply a number by 0, you always get 0 and never 6. So the answer is that there is no answer: 6/0 just isn't a number. Something slightly different happens when you try to divide 0 by 0; the product of any number with 0 gives 0, and so the "uniqueness" part of the definition fails. Often we say that expressions like 1/0 and 6/0 are "undefined" because there is no possible answer and that the expression 0/0 is "indeterminate" because every number serves as a possible answer.
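That definition can even be tested mechanically. Here is a small Python sketch (the helper name and the candidate set are just for illustration) that searches a sample of numbers for everything that multiplies back correctly:

from fractions import Fraction

def quotients(a, b, candidates):
    # every c in the sample with c * b == a, i.e. every candidate value for a/b
    return [c for c in candidates if c * b == a]

sample = [Fraction(n, 2) for n in range(-20, 21)]     # a small sample of numbers
print(quotients(6, 2, sample))        # [Fraction(3, 1)]     -> one answer: 6/2 = 3
print(quotients(6, 0, sample))        # []                   -> no answer: 6/0 is undefined
print(len(quotients(0, 0, sample)))   # 41 (every candidate) -> not unique: 0/0 is indeterminate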



Maybe this is all very confusing, but actually it's easier if you think about it the way you would language. Language consists of a collection of objects (words) together with a collection of rules for combining them (grammar). In principle, any sentence constructed from the given objects using the given rules is a legal, grammatically correct sentence. Try to think of mathematics in the same way. The "words" are symbols like 1, 3, -5.2, +, =, etc. The "grammar" tells us how to organize these symbols into sentences; 6+2 = 3 is a grammatically correct mathematical sentence (even though it is false) while 3 + 1 - 2 is not (just the way the expression "the puppy" is not a complete sentence). From this point of view, "6/0" is a grammatically incorrect mathematical expression. This is the way mathematics works in general; often we work with very intuitive, hands-on objects like numbers and triangles, but we need very precise rules to guide us through situations that defy our intuition.
Puggy
2009-06-12 08:28:42 UTC
When division is explained at the elementary arithmetic level, it is often considered as a description of dividing a set of objects into equal parts. As an example, consider having 10 apples, and these apples are to be distributed equally to five people at a table. Each person would receive 10/5 = 2 apples. Similarly, if there are 10 apples, and only one person at the table, that person would receive 10/1 = 10 apples.



So for dividing by zero — what is the number of apples that each person receives when 10 apples are fairly distributed amongst 0 people? We can pinpoint certain words in the question to highlight the problem. The problem with this question is the "when". There is no way to distribute 10 apples amongst 0 people. In mathematical jargon, there can be no set of subsets of our set of 10 apples which has size 0 and forms a partition of our set. So 10/0, at least in elementary arithmetic, is said to be meaningless, or undefined.



In algebra, division by 0 causes problems. If we take the assumptions that

0 x 1 = 0, and

0 x 2 = 0, then it follows that



0 x 1 = 0 x 2.



If we divide both sides by 0, we get



(0/0) x 1 = (0/0) x 2



Giving us



1 = 2



Which is obviously false.
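One way to summarize the trouble: the familiar cancellation rule, "if ac = bc then a = b," is only valid when c is not 0, and the step above silently applies it with c = 0:

0 x 1 = 0 x 2 is true, but cancelling the 0 to conclude 1 = 2 is not allowed.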



So, using some of the lower level maths, those are the reasons why we cannot divide by zero.



EDIT: If you take 6 and multiply it by 0, the answer is 0, but it cannot be fully explained without the axioms used in real analysis; mainly, the proof that

a(0) = 0
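Roughly, that proof needs nothing more than the distributive law and the fact that 0 + 0 = 0:

a(0) = a(0 + 0) = a(0) + a(0)

Subtracting a(0) from both sides leaves

0 = a(0)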



I used wikipedia because it gave the perfect example of what dividing by 0 means and why it doesn't make sense. However, you remain an ungrateful spoiled brat so poop on you :) For a person who has a "BA in International Relations and Spanish and speak 4 languages fluently", you sure are conceited.
james
2009-06-12 09:35:57 UTC
Very tricky question :). I'm also not sure of the reason, but I'll try. I think zero (0) is indeed a number, but it remains an insignificant value unless you put a number before it, or a decimal point before it and another number after it. So it would really be an error to divide a value by an insignificant value (makes sense?).



It's like you're giving a problem but it lacks information.



Example:



2 apples / 2 persons = 1 apple per person, if translated into a sentence:

2 apples for 2 persons will give one (1) apple to each person.



0 apples / 2 persons = 0 apples per person, if translated:

No apples for 2 persons will result in no (0) apples for each person.



but

2 apples / no one = error, if translated: ...mmmm... see! I can't even think of how I can translate that into a sentence (lol). 2 apples for no one would give you... 2 apples for no one! As I said, it's like having a problem which lacks info.



You can divide nothing among everyone, but you cannot divide everything among no one. Stated differently, everyone can have nothing, but no one can have everything (hope my philosophy is more correct than my math, lol).
Jon H
2009-06-12 08:16:08 UTC
You can't divide by nothing. If you try to produce six nothings (6*0) then you still have nothing. If I give you half of nothing then you, in reality, will receive nothing (0/2 = 0). However, logically you can't split something into zero parts (2/0), hence the error message.



Also, attempting to divide by zero used to cause all sorts of fatal errors and infinite loops back in the early days of computing. That is why the saying is so prevalent.
odzookers
2009-06-12 08:37:46 UTC
You cannot meaningfully divide by zero because it's not a quantity (and multiplying by it, or adding or subtracting it, is special for the same reason). It won't affect addition or subtraction because it has no substance--everything passes through it as if it's smoke. Multiplication and division (which is multiplication backwards) are different--as a factor, zero cancels all other quantities. Zero goes into every number no times at all, since it isn't a quantity; in multiplication, it renders any quantity null. Six times zero is zero because zero renders six immobile--it cannot DO anything. There are two or three books written on the subject of zero--Google them.
Al
2009-06-12 08:20:29 UTC
When you divide, it's like taking that number and sharing it out by the number you are dividing by. If you share by zero, the calculator overflows because the answer heads off toward infinity. It's the other end of the argument about multiplying by zero: one will give you nothing, the other everything...

hope this helps
AntiApollyon
2009-06-12 08:19:25 UTC
Take any number and divide it by itself, and you get 1.



Example: 1/1 = 1



Now, keep making the divisor smaller:



1/.9 = 1.1111111

1/.8 = 1.25

.

.

.

1/.1 = 10

1/.09 = 11.111111

.

.

.

1/.000002 = 500000



Notice that the smaller the divisor gets, the larger the number gets?



As you get closer to zero, the number begins to get larger. What happens when you GET to zero? You never will.... for every number just greater than zero, there is another number between it and zero. As your divisor approaches zero, your answer will approach infinity, and infinity is not a number.



This is part of the theory of limits in mathematics.
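A throwaway Python loop (with arbitrary step sizes) shows the same trend numerically:

for k in range(1, 8):
    x = 10 ** -k
    print(f"1/{x} = {1 / x}")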
sabrina ♥
2009-06-13 16:07:28 UTC
Well, basically, the way I do revision is like this:



say if the sum was



6 (divided by) 2 = 3



I would say there's 6 sweets and 2 people, so the 2 people get 3 sweets each.



if the sum was



6 (divided by) 0



there would be six sweets, but no people, so you can't share them out :P
Mongolian Warrior
2009-06-12 08:16:02 UTC
Me, too. I like the fact that when you divide a number by itself, it results in 1, but 0/0=ERROR
mildepiphany
2009-06-12 14:23:26 UTC
You can only approximate 6/0 by letting the divisor shrink toward zero, and the result grows toward infinity, which is not a number.
2009-06-12 10:08:01 UTC
x/0 does not exist, for all x.



Zero and the null set both do exist and they are not nothing.



Nothing does not exist.



If we define division thus: x/y =df (the z such that x=y*z), then we can prove that x/0 cannot exist, even though it is defined.



That is to say, x/0 = (the z such that x = 0*z); but when x is not 0 no such z exists at all, and when x = 0 that z is not unique, so either way "the z" fails to exist.
2009-06-12 08:26:12 UTC
As my professor says...



Divide By Zero and Go To Hell
Cherry
2009-06-12 08:14:23 UTC
Simply put, you can't divide by zero because the answer would be undefined. If you had four apples and you divided them among no one, it would be, in a way, 'nothing'; thus, undefined.
Maritova
2009-06-12 08:23:55 UTC
http://lmgtfy.com/?q=division+by+zero
2009-06-12 08:20:10 UTC
If you divide anything by zero, there is no answer at all; the result is undefined, not zero.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.