The oldest "0" in India for which one can assign a definite date (ams.org)
119 points by richeyrw on May 24, 2013 | hide | past | favorite | 28 comments



An earlier use of zero (as defined in the article) for which one can assign a definite date was in 357 AD. The gotcha is that it's not the circle we're so familiar with. This is the first Mayan (Olmec) long-count recorded containing a zero, although long-counts showing a clear positional number system were recorded in the first century BC.

http://books.google.ca/books?id=TzrNgAsJY1MC&lpg=PP1&...

While the use of zero in India may have been influenced by earlier cultures (e.g. Babylon), it's pretty unlikely there was any direct contact that could have spread the use of zero to Mesoamerica. Thus, positional number systems with a zero placeholder have been invented independently at least twice in human history!


"New numbers" are invented to complete a group for a particular arithmetic operation. Zero is not necessary for the addition operation, because the cardinals one and above suffice; zero is created for the subtraction of a cardinal from itself.

There were empty placeholder signs in the place-value number systems of Babylonia and ancient China that functioned like our zero numeral. But this placeholder did not enter arithmetic operations, so it is not considered the "modern" zero.

Ditto creating negative numbers to complete the subtraction operation group. Ratios for cardinal division. Real numbers for ratios that are not cardinals. Infinity for division by zero. Irrationals for roots of cardinals. Imaginaries for roots of negatives. Infinitesimals in differentiation and integration. And so on ...

At some point in medieval times each of these new numbers was considered the work of the Devil because it did not have concrete existence in the physical world. Later on, accountants showed how zero & negative numbers were useful for business, and physicists showed how these new numbers could predict the physical world.


>At some point in medieval times each of these new numbers was considered the work of the Devil because it did not have concrete existence in the physical world.

Really? Do you have a cite for this? It sounds made up.


I can't speak for math, but I know that a few centuries ago, certain musical notes were regarded as 'The Devil's Music':

https://en.wikipedia.org/wiki/Tritone#Historical_uses http://news.bbc.co.uk/2/hi/uk_news/magazine/4952646.stm


Well, both of those articles explain that this was just perceived as a dissonant interval which could be used as a musical representation of the Devil. The interval was likely avoided for essentially aesthetic musical reasons, and the Wikipedia article says that "...suggestions that singers were excommunicated or otherwise punished by the Church for invoking this interval are likely fanciful." Unfortunately the BBC article seems to be light on facts and heavy on stereotypes of medieval monks as ignorant religious fundamentalists.


Especially since under neoplatonic thought lots of things are considered more real because they lack physical existence.


I wonder if this is maybe a confused reference to the privation theory of evil? In Aristotelean scholastic theology evil is simply a lack of being (since goodness is one of the transcendental properties of being), and so there's a connection between nothingness and evil.


I think ignorance is probably a simpler explanation, but thanks for bringing that up, I am not well-versed in medieval theology.


A recent article of related interest is the story of the recovery of the oldest datable inscription that shows a numeral zero used in base-ten place-value notation, "How I Rediscovered the Oldest Zero in History,"

http://blogs.discovermagazine.com/crux/2013/05/20/how-i-redi...

which links to the article kindly submitted here.

The submission title here is NOT the original article title (the Hacker News guidelines prefer original titles), and the other recent article suggests that the title shown on the submission here is not fully accurate for all regions of the world.


Apparently, this article is about the oldest known depiction of a symbol for zero (i.e. a circle), found in the Chatur Bhuja temple in Gwalior.

But zero in the decimal place-value system[1] and zero's usage as a number[2][3] are much older.

[1]http://en.wikipedia.org/wiki/Lokavibhaga

[2]http://en.wikipedia.org/wiki/Br%C4%81hmasphu%E1%B9%ADasiddh%...

[3]http://en.wikipedia.org/wiki/Brahmagupta#Zero


I'm not sure what the article claims here. This is the first dated inscription, but the positional Arabic notation (with a zero-like glyph) had been in use for at least 300 years earlier. If you consider other place-value systems, the concept of a 'separator' goes back much farther.

This isn't necessarily evidence that they understood zero as a number as opposed to a concept. By 1000 AD, Europeans used a placeholder symbol that looked superficially similar to our modern glyph for zero. When explaining it, however, it was clear it was simply a placeholder that could always be omitted, was "sometimes useful", and was absolutely not really a number. It isn't clear that the glyph used here or in Europe was anything more than an elaborate 'dot.' It certainly didn't seem to be transmitted as a recognizable glyph; each culture rendered it in the way most convenient as a counting and placeholder aid.


Yeah; all this talk of 'placeholders' that don't matter. Every digit is a 'placeholder', standing in for the count that it represents. Not convinced.

If it was used in a power-of-ten (or power-of-anything, for that matter) position to mean 'nothing here', then it was a digit, it was zero, and it should be considered a legitimate heir to our zero.


Very interesting!

Of course, reading "Those who can access JSTOR can find some of the papers mentioned above there" demonstrates once more that, alas, the digital divide still exists.



OT: in Arabic, zero is "sifr", from which we get the words cipher and decipher.


In many Romance languages it is the root of the word used for digit: it/es: cifra, fr: chiffre, ro: cifră. In Portuguese the word is algarismo, which comes from al-Khwārizmī [0], who gives his name to "algorithm" as well.


In Tamil, it's still called cyber.


So if 0 wasn't a concept, you couldn't count to 10?

I'm not sure I understand. I thought the significance of 0 was the recognition of an absence of existence. Just because numbers use 0 (270, 50, 20, etc.) wouldn't seem to indicate that it was understood with that significance...


There are two distinct concepts here, both important. The first is the use of a positional or "place-value" notation for numbers.

Place-value notation means you use a base set of symbols, and the position of each digit acts as a multiplier. Compared to, say, Roman numerals, this has the advantage that you can write arbitrarily large numbers. (The Romans didn't have a symbol for numbers larger than 1000, so larger numbers became cumbersome to write.) More important, though, arithmetic is much simpler using place-value notation, because you can align numbers vertically and then perform the operation one column at a time.

Place-value notation requires a symbol for "nothing at this place", the "0". The post is about the oldest known use of "0" in a place-value system. (Which probably means the oldest use of a place-value system per se, since you cannot have a place-value system without "0".)

However, using a place-value notation like decimal notation does not mean that you have to accept zero as a natural number. That is a separate issue.
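To make the "nothing at this place" point concrete, here is a minimal sketch (mine, not from the thread) of place-value expansion, where the zero digit is exactly what keeps 1023 distinct from 123:

```python
def place_value_digits(n, base=10):
    """Return the digits of a non-negative integer, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)  # the remainder is the digit at this place
    return digits[::-1]

print(place_value_digits(1023))  # [1, 0, 2, 3] -- the 0 marks an empty hundreds place
print(place_value_digits(123))   # [1, 2, 3]
```

Drop the zero digit from the first result and the two numbers become indistinguishable, which is the whole problem a placeholder symbol solves.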


The Babylonians got by using a place-value system without zero for a long time. People relied on context to disambiguate.

(It probably helped that with a base-60 system, any given digit was six times less likely to be zero.)
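The ambiguity can be illustrated with a small sketch (an assumed illustration, not a historical reconstruction): if zero-valued places are simply left unwritten, the base-60 digit string "1 1" admits several readings depending on where an implicit empty place might sit.

```python
from itertools import combinations_with_replacement

def readings(digits, base=60, max_implicit_zeros=1):
    """All values a digit string could denote with up to n unwritten zero places."""
    values = set()
    for k in range(max_implicit_zeros + 1):
        # choose k gaps (including the end) where an empty place might hide
        for gaps in combinations_with_replacement(range(len(digits) + 1), k):
            seq = list(digits)
            for g in sorted(gaps, reverse=True):
                seq.insert(g, 0)  # an unwritten, zero-valued place
            value = 0
            for d in seq:
                value = value * base + d
            values.add(value)
    return sorted(values)

print(readings([1, 1]))  # [61, 3601, 3660]: "1;1" vs "1;0;1" vs "1;1;0"
```

Context (what was being counted, at what scale) was what told a Babylonian reader which of these values was meant.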


The disambiguation from context was disambiguation of scale, because there was no radix point. They did have a way to indicate a zero-valued position within a number; without that you can't really 'disambiguate', or have a positional number system at all.


No, the problem is not '0' not being a concept. There are two issues here, which are related but not the same. The concept of 'nothingness' is irrelevant to both:

1) Dealing with '0' as an ordinary number.

2) Using a positional number system, which requires a symbol or context for telling '1230' from '1023' and '1203', etc.

One tends to think that solving the first issue leads to the second. Solving the second implies in some way having already solved the first, due to the nature of positional systems.


> Solving the second implies in some way having already solved the first, due to the nature of positional systems.

Untrue. (Edit: actually true). It works perfectly well to have a decimal number system using digits 1-10 instead of 0-9. For sake of notation, let's say A is the digit representing 10.

"A" = 10, "11" = 11, "19" = 19, "1A" = 20, "21" = 21, etc.

Edit: actually, I misread your argument. What you argued is definitionally true.


would it be fair to differentiate the two concepts as:

1. using "0" as a digit

2. using "0" as a number


Think Roman numerals, or tally (hash) marks grouped by five.


"The history of zero is a bit complicated."

I see what you did there


I see what you did there.


Also notable here is Aryabhata, who used a base-10 place-value system with zero[1], though he didn't use the notation '0'. [1]http://en.wikipedia.org/wiki/Aryabhata#Place_value_system_an...



