• Question: What is the logical answer to divide by 0, why it doesn’t work and what is 0^0?

    Asked by zakmohammed on 28 Jan 2020.
    • Maja Popovic answered on 28 Jan 2020:


      0^0 is actually not really defined;
      sometimes it is agreed, as a convention, that it equals 1

      logic behind dividing by 0:
      if you divide a number by 1, the result is the original number
      if you divide a number by a divisor larger than 1, the result will be smaller than the original number
      if you divide it by a divisor smaller than 1, the result will be larger than the original number
      the smaller the divisor (below 1) is, the larger the result becomes
      as the divisor gets closer and closer to 0, the result grows towards infinity (see the small sketch below)
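
      To see the pattern with actual numbers, here is a tiny Python sketch (my own illustration, not part of the answer): dividing a fixed number by smaller and smaller divisors gives bigger and bigger results, and dividing by exactly 0 is simply refused.

      ```python
      # Dividing 12 by progressively smaller divisors: the result keeps growing.
      for divisor in [4, 2, 1, 0.5, 0.1, 0.01, 0.001]:
          print(f"12 / {divisor} = {12 / divisor}")

      # Dividing by exactly 0 is not defined, so Python refuses and raises an error.
      try:
          print(12 / 0)
      except ZeroDivisionError as error:
          print("12 / 0 ->", error)
      ```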

    • Giuseppe Cotugno answered on 28 Jan 2020:


      Division by 0 is undefined because it is not really a division that makes sense. You are dividing something by nothing, hence you are not dividing at all!

      In mathematics, division by zero is sometimes said to produce infinity. Some mathematicians state that if a quantity is divided by nothing you get infinity, because that quantity would never decrease. Other mathematicians would tell you instead that the result is simply undefined.

      In practice, mathematicians always take precautions not to end up in situations where you divide by 0.

      This is especially important if calculations are done with a computer. Depending on how the program was compiled (i.e. how the source code has been assembled into working software) you could get several behaviours. A division by 0 in a computer could produce a Not a Number (NaN) value, which will basically absorb any other number it is added to, subtracted from, divided or multiplied by (an operation involving NaN will always produce NaN); with floating-point numbers, dividing a non-zero number by 0 typically gives an "infinity" value instead, while 0/0 gives NaN. In other cases a division by 0 might crash the program altogether, to prevent NaNs from devastating every mathematical operation they are involved in.
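
      Here is a small Python sketch of those behaviours (my illustration, not from the answer). Plain Python raises an error and stops unless the error is caught, while a NaN value, once created, absorbs every operation it takes part in; the NumPy lines (assuming NumPy is installed) show the floating-point convention where a non-zero number divided by 0.0 gives infinity and 0.0/0.0 gives NaN.

      ```python
      import math
      import numpy as np

      # Plain Python treats division by zero as an error ("crashes") for
      # integers and floats alike, unless the error is caught.
      try:
          1.0 / 0.0
      except ZeroDivisionError as error:
          print("1.0 / 0.0 ->", error)

      # A NaN value absorbs every arithmetic operation it is involved in.
      nan = math.nan
      print(nan + 1)      # nan
      print(nan * 0)      # nan
      print(nan - nan)    # nan
      print(nan == nan)   # False: NaN is not even equal to itself

      # NumPy follows the IEEE floating-point rules instead of raising an error.
      with np.errstate(divide="ignore", invalid="ignore"):
          print(np.float64(1.0) / np.float64(0.0))  # inf
          print(np.float64(0.0) / np.float64(0.0))  # nan
      ```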

    • Daniel Bearup answered on 29 Jan 2020 (last edited 29 Jan 2020 10:53 am):


      Another way mathematicians think about these issues is to consider what happens when numbers get closer to zero. If we consider 1/0.1 we get 10. 1/0.01 gets us 100. 1/0.001 gets us 1000. We can say that as x gets closer to zero, 1/x increases. Specifically, we can show that for any number y, we can find an x so that 1/x > y. If we make this argument a little more formal, we arrive at the mathematical definition of infinity. A complication is that if x < 0, we get very big negative numbers instead … .
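
      A quick Python sketch of that argument (my own illustration, not part of the answer): for any target y we can keep shrinking a positive x until 1/x overtakes it, and from the negative side 1/x becomes a very large negative number instead.

      ```python
      # For any target y, a small enough positive x gives 1/x > y.
      y = 1_000_000
      x = 0.1
      while 1 / x <= y:
          x /= 10
      print(f"1/{x} = {1 / x}, which is larger than {y}")

      # From the negative side, 1/x becomes a very big *negative* number instead.
      for x in [-0.1, -0.01, -0.001, -0.0001]:
          print(f"1/{x} = {1 / x}")
      ```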

      If we try the same thing with x^x, 0.1^0.1 = 0.79, 0.01^0.01 = 0.95, 0.001^0.001 = 0.99 and so on. We can figure out that x^x approaches 1 as x goes towards zero (here it doesn't matter if x is positive or negative). We would say the limit of x^x as x goes to 0 is 1. (We don't say this for 1/x because infinity is not a number; we would say 1/x diverges, which means it approaches positive or negative infinity, as x goes to 0.)
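
      And the same check for x^x (again just my own sketch, using positive x only): the values creep towards 1 as x shrinks.

      ```python
      # x**x approaches 1 as x gets closer to 0 (shown here for positive x).
      for x in [0.1, 0.01, 0.001, 0.0001, 0.00001]:
          print(f"{x} ** {x} = {x ** x:.6f}")
      ```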
