Mathematical Times


Volume 0, Issue 0

About Zero

It took humanity at least 3000 years to reach its current level of mathematical understanding.  An essential part of that knowledge is that counting normally starts with nothing. Zero cometh before one.

Is zero even?

Yes, it is.  No mathematically literate person questions that fact, and there are several ways to see it.  You may define even numbers as multiples of two: zero is 0 times 2, so it is a multiple of 2, just as it is a multiple of 3, 4, 5, etc.  Or you may observe that every other number is even and every other number is odd.  This extends without any problem to negative numbers, and zero must be considered even in order not to break the pattern:  -5, -3, -1, 1, 3, 5, 7, 9 are odd, whereas -6, -4, -2, 0, 2, 4, 6, 8 are even.  Yet a surprising number of people somehow find this fact disturbing and hold on to the backward notion that 0 is such a "special" number that it cannot be treated like any other.  This includes, unfortunately, a few people who teach or have been teaching mathematics.
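As a quick illustration, here is a minimal Python sketch (the helper name is ours, purely for illustration) that tests evenness the way the first definition suggests, as divisibility by 2:

    def is_even(n):
        # An integer n is even when it is a multiple of 2, i.e. n = 2*k for some integer k.
        return n % 2 == 0

    print(is_even(0))                                  # True: 0 = 2 * 0
    print([n for n in range(-6, 7) if is_even(n)])     # [-6, -4, -2, 0, 2, 4, 6]

Zero passes the test for exactly the same reason 6 or -4 does; nothing special is needed.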

When does a new century begin?

This is also a settled issue.  However, each time it has come up in history, the majority of people have taken the wrong side and "celebrated" the new century a year too early.  Since there was no year zero, the first century consisted of years 1 through 100, so the Nth century ends with year 100N.  The 21st century and the 3rd millennium (not the second millennium, please!) started at the beginning of January 1, 2001.  It took essentially a whole year to convince the majority of people of that fact.
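A minimal Python sketch (the function name is ours, for illustration only) makes the arithmetic explicit:

    def century_of(year):
        # Century number of a positive (AD) year: years 1-100 form the 1st century,
        # 101-200 the 2nd, and so on, because there was no year zero.
        return (year + 99) // 100

    print(century_of(1900))   # 19
    print(century_of(1901))   # 20
    print(century_of(2000))   # 20 -- still the 20th century
    print(century_of(2001))   # 21 -- the 21st century (and 3rd millennium) begins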

Zero to the power of zero: 0^0 = 1.

Well, zero to the zeroth power is indeed equal to one, like it or not.  This convention is constantly used by professional mathematicians, even though many of them will tell you that zero to the power of zero is undefined.  Among other things, the convention is what allows a polynomial or power series, written as a sum of terms c x^k, to give the right value at x = 0.
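Python itself honors this convention, both for integer exponentiation and in its math module; a quick check (the polynomial below is an arbitrary example of ours):

    import math

    print(0 ** 0)           # 1
    print(math.pow(0, 0))   # 1.0 -- math.pow(x, 0.0) is documented to return 1.0

    # The convention is what makes polynomial evaluation work at x = 0:
    coefficients = [5, 3, 2]   # 5 + 3x + 2x^2
    p = lambda x: sum(c * x**k for k, c in enumerate(coefficients))
    print(p(0))             # 5 -- correct only because 0**0 == 1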

Zero has zero significant digits.

Leading zeros are ignored and cannot count as significant digits.  The strings 007, 07 and 7 denote the same integer (seven).  So do the strings 000, 00, 0 and the empty string.  You may not be able to write down the empty string "" without quotes, which is why "0" is used instead to denote the integer zero.  However, when it comes to determining the length of the simplest decimal representation of an integer, the empty string comes back into play:  the decimal length of zero is zero!
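A small Python sketch (purely illustrative) of that convention for the decimal length of a nonnegative integer:

    def decimal_length(n):
        # Length of the simplest decimal representation of a nonnegative integer:
        # peel off digits until nothing is left, so zero yields the empty string.
        digits = ""
        while n > 0:
            digits = str(n % 10) + digits
            n //= 10
        return len(digits)

    print(decimal_length(7))      # 1
    print(decimal_length(2004))   # 4
    print(decimal_length(0))      # 0 -- the empty string has length zero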

Zero is divisible into anything but divides only itself.

Zero is divisible into anything but it divides only itself!  The proper notion of divisibility makes interesting statements meaningful and correct:
Two nonnegative integers are distinct if and only if there is an integer which only one of them divides.
 
Zero is the only element which zero divides.
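A short Python sketch (the predicate name is ours) spelling out the divisibility convention used above, namely that a divides b when b = a*k for some integer k:

    def divides(a, b):
        # a divides b when b = a*k for some integer k.
        # In particular, every integer divides 0, but 0 divides only 0.
        if a == 0:
            return b == 0
        return b % a == 0

    print([n for n in range(-4, 5) if divides(n, 0)])   # every n in the range divides 0
    print([n for n in range(-4, 5) if divides(0, n)])   # [0]: zero divides only itself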

The Mathematical Universe:
A Structure Built on Nothing

Mathematics is first and foremost a language.  It's a human language: a set of conventions made over the centuries to allow things to be recorded and communicated by human beings.  Like all languages it has a syntax (which specifies how its symbols may be put together) and semantics.  Semantics is the term used by computer scientists to describe the meaning of those sentences whose syntax happens to be correct (not all syntactically correct sentences are meaningful).  Such a meaning is usually described in terms of some "real" world.  What makes mathematics unique among all possible languages is that the universe in terms of which a meaning is assigned to mathematical statements is not "real" at all.  The mathematical discourse is about mathematics itself.

Anything Is True of Nothing

This one may be difficult to swallow at first, but logic itself just would not be consistent without the convention that any statement which does not apply to anything is "vacuously" true.  The negation of such a statement is also vacuously true.  In fact, if some statement and its negation are both proven true of all things endowed with a given property, it is thereby shown that absolutely nothing may have that property.  Clear enough, isn't it?
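Programming languages encode the same convention.  In Python, a universal statement over an empty collection is true, and so is its negation; a quick check:

    empty = []

    print(all(x > 0 for x in empty))    # True: "every element of empty is positive"
    print(all(x <= 0 for x in empty))   # True: "every element of empty is not positive"

    # Both hold at once precisely because nothing is an element of the empty list.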
 (c) Copyright 2000-2004, Gerard P. Michon, Ph.D.