https://newatlas.com/science/decimal-point-history-older/

Historians have discovered what may be the world's first decimal point, in an ancient manuscript written 150 years before its next known appearance. There have been many ways to split integers, but this little dot has proven uniquely powerful.

The mathematics we all learn at school seems so fundamental that it doesn't feel like individual concepts in it would need "inventing," but these pieces arose separately as scientists and mathematicians realized they were needed. For instance, scientists recently found the oldest written record of the numeral "0," dating back 500 years earlier than previously thought.

Now, it looks like the decimal point is also older than expected. Ever since we realized we sometimes need to break numbers into smaller fragments, humans have marked the split using various symbols – dashes, vertical lines, arcs and underscores have all filled the role, but none of them survived into modern usage. Commas and periods are the most common today, so when did they start?

Previously, the earliest known use of a period as a decimal point was thought to be an astronomical table by the German mathematician Christopher Clavius in 1593. But according to modern scientists, that kind of table is a weird place to introduce such a massive concept to the world, and Clavius didn't really go on to use the idea much in his later writings. Basically, if he realized the need for the concept and invented a neat way to display and work with it, why didn't he brag about it?

The answer, it seems, is that Clavius was just borrowing an older idea that had essentially been lost to time, and wasn't the preferred method in his era. A new study has found that the decimal point dates back to the 1440s – about 150 years earlier – first appearing in the writings of Italian mathematician Giovanni Bianchini.

Bianchini was a professor of mathematics and astronomy at the University of Ferrara, but he also had a background in what we'd now call finance – he was a merchant, and managed assets and investments for a wealthy ruling family of the time. That real-world experience seems to have influenced his mathematical work, since Bianchini was known to have created his own system of dividing measurement units like feet into 10 equal parts to make them easier to work with. As fundamental as it feels to modern sensibilities, it didn't catch on with the 15th-century crowd, who were used to a base-60 system.

Now, Dr. Glen Van Brummelen, a professor at Trinity Western University in Canada, has discovered that Bianchini illustrated this system with a decimal point, the first ever. Van Brummelen found that in a manuscript called Tabulae primi mobilis B, Bianchini was using numbers with dots in the middle – the first one being 10.4 – and showing how to multiply them, something that was tricky in a base-60 system.
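To see why the dot mattered, here is a minimal Python sketch (not from the article) contrasting the two systems. In decimal notation, multiplying 10.4 by itself is just integer multiplication plus a shift of the point, while the sexagesimal equivalent (10;24, i.e. 10 + 24/60) has to be re-expressed digit by digit in powers of 60:

```python
from fractions import Fraction

# 10.4 written sexagesimally is 10;24 -- that is, 10 + 24/60.
a = Fraction(10) + Fraction(24, 60)
assert a == Fraction("10.4")

# Multiplying decimal fractions is just integer multiplication plus a
# shift of the point: 10.4 * 10.4 = (104 * 104) / 10**2.
decimal_product = Fraction(104 * 104, 10**2)

# The same product expressed sexagesimally must be rebuilt digit by
# digit in powers of 60: 108;09,36 = 108 + 9/60 + 36/3600.
sexa_product = Fraction(108) + Fraction(9, 60) + Fraction(36, 3600)
assert decimal_product == sexa_product == Fraction("108.16")
```

The values agree, but only the decimal form lets the point-shifting trick replace the base-60 carry bookkeeping.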

"I realized that he's using this just as we do, and he knows how to do calculations with it," Van Brummelen told Nature. "I remember running up and down the hallways of the dorm with my computer trying to find anybody who was awake, shouting 'look at this, this guy is doing decimal points in the 1440s!'"

**Journal Reference:**

Glen Van Brummelen, **Decimal fractional numeration and the decimal point in 15th-century Italy**, *Historia Mathematica*, In Press, 2024. https://doi.org/10.1016/j.hm.2024.01.001

**7** comments

**The Fine Print:** The following comments are owned by whoever posted them. We are not responsible for them in any way.


## (Score: 2, Insightful) by Mojibake Tengu on Monday February 26 2024, @02:14AM (3 children)

Decimal notation is broken. Completely.

What the Ancient Greeks called Πραγματικός αριθμός [pragmatikos arithmos], literally "practical numbers"[1], is today called "real numbers". That's totally absurd. The Ancient Greeks explicitly understood that those pragmatikos arithmos do not exist as real entities, for it is impossible to measure them exactly in the real world, but that they are useful, as abstractions, for practical computations (like square roots or volumes). The inversion of this initial understanding, enforcing what is not real as real in the minds of educated people, is the greatest deliberate illusory error in all of history, made by the Cultists of the West.

An example of how decimal notation is broken:

Consider these two generally accepted equivalent representations of 1:

0.999999999999....

1.000000000000....

Besides the infinite space needed to represent the data, the exact comparison operation on an αυτόματο [automaton] has infinite time complexity...

Today, all programmers, and subsequently all users, suffer from that. Yes, there are logical workarounds (convergence, set theory, various infinities), all of them broken concepts.

But that does not fix the illusion magic deliberately thrown upon us when you do simple things in carpentry, like 1/3 * 2/5. Or balls.
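For what it's worth, the carpentry example above is exactly where exact rational arithmetic sidesteps the decimal expansion entirely; a minimal Python sketch (not the commenter's code):

```python
from fractions import Fraction

# 1/3 * 2/5 is exact as a rational number...
exact = Fraction(1, 3) * Fraction(2, 5)
print(exact)  # 2/15

# ...but binary floating point can only approximate it, because neither
# 1/3 nor 2/15 has a finite expansion in base 2 (or in base 10).
approx = (1 / 3) * (2 / 5)
print(approx)  # a nearby double, not exactly 2/15
```

The decimal (or binary) point forces an approximation; the fraction does not.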

[1] Greek schools still keep the original nomenclature as a legacy from the ancients. They know why. Here is the equivalent Wikipedia entry in each language:

https://el.wikipedia.org/wiki/%CE%A0%CF%81%CE%B1%CE%B3%CE%BC%CE%B1%CF%84%CE%B9%CE%BA%CF%8C%CF%82_%CE%B1%CF%81%CE%B9%CE%B8%CE%BC%CF%8C%CF%82 [wikipedia.org]

https://en.wikipedia.org/wiki/Real_number [wikipedia.org]

Rust programming language offends both my Intelligence and my Spirit.

## (Score: 5, Touché) by maxwell demon on Monday February 26 2024, @05:02AM

Not even the natural numbers exist as real entities. They, too, are abstractions. You cannot find the number two anywhere. And no, you can't disprove me by pointing at two things; those are not the number two. Two flowers are something very different from two stones, which are again very different from two humans.

The Tao of math: The numbers you can count are not the real numbers.

## (Score: 2) by PiMuNu on Monday February 26 2024, @12:53PM

> Cultists of the West.

Ah yes, the decadent Westerners and their lazy indulgence in the sin of real numbers.

==

ps: Any (Western, or otherwise) scientist will tell you that a number should always be accompanied by an uncertainty.

## (Score: 1) by khallow on Tuesday February 27 2024, @06:02AM

Is there a serious complaint in here somewhere? It's a label with no semantic connection to normal language. Nobody is fooled into thinking that real numbers are "real." And "real" takes less effort to spell than "practical".

Consider this tour of algebraic objects. You're already quite aware of the "set": a collection of objects under some sort of characteristic that might not even be describable in our reality. A "group" is a set with a multiplication-like operator that has associativity, an identity element, and a unique inverse element for every element of the group. A "ring" has a group operation of addition (zero being the identity element for that) and a multiplication operation that is associative and distributes over the addition operator. There are a crazy number of variations of rings, some with horrible names. Then there's a "field," which is a ring where the non-zero elements form a group under the multiplication operation. When, in addition, you require topological completeness of the field, you get into the realm of "real" and "complex" numbers, the latter being the algebraic closure of the former (adjoining the square root of -1 is sufficient to completely factor all polynomials into linear factors).

My take is that these things have to be called something. And a simple label for extremely commonly used objects is easier on the brain than elaborate labels that don't add anything.
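The group axioms in the tour above can be made concrete with a brute-force check on a finite set; a minimal Python sketch (the function name and examples are illustrative, not the commenter's):

```python
from itertools import product

def is_group(elements, op):
    """Brute-force check of the group axioms on a finite set."""
    elements = list(elements)
    # Closure: op must map pairs back into the set.
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # Associativity: (a*b)*c == a*(b*c) for all triples.
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elements, repeat=3)):
        return False
    # Exactly one two-sided identity element.
    ids = [e for e in elements
           if all(op(e, a) == a == op(a, e) for a in elements)]
    if len(ids) != 1:
        return False
    e = ids[0]
    # A unique inverse for every element.
    return all(sum(1 for b in elements if op(a, b) == e) == 1
               for a in elements)

# Integers mod 5 under addition form a group...
print(is_group(range(5), lambda a, b: (a + b) % 5))  # True
# ...but {0..4} under multiplication mod 5 does not: 0 has no inverse.
print(is_group(range(5), lambda a, b: (a * b) % 5))  # False
```

Excluding 0 from the second example would make it a group again, which is exactly the "non-zero elements form a group" condition that turns a ring into a field.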

## (Score: 1, Funny) by Anonymous Coward on Monday February 26 2024, @02:24AM

Not much of an invention, really.

For finite decimals, it is merely a handy way of denoting rational numbers with denominators of 10^n, which of course are a mere subset of all rational numbers. The invention of rational numbers, like 1/3, 22/7 or 355/113, happened much earlier and has much richer application.
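The subset claim is easy to see with Python's `fractions` module (a sketch, not the commenter's code): a finite decimal is exactly a rational whose reduced denominator divides some power of ten, and 1/3 is not one of them.

```python
from fractions import Fraction

# A finite decimal is a rational with a power-of-ten denominator.
assert Fraction("10.4") == Fraction(104, 10) == Fraction(52, 5)

# 1/3 has no such form: its reduced denominator (3) is not of the
# shape 2^a * 5^b, so every truncation 0.3, 0.33, 0.333, ... misses.
third = Fraction(1, 3)
print(all(third != Fraction(round(third * 10**n), 10**n)
          for n in range(1, 8)))  # True
```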

Anyway, I always hated first period.

## (Score: 3, Insightful) by pTamok on Monday February 26 2024, @09:37AM

There are several inventions/techniques, found independently, that contributed towards the contemporary methods of arithmetic (which is a part of mathematics).

1) Positional notation

2) Using base-10

3) Using an explicit symbol to denote 'zero' (and, indeed, separate unique, individual symbols for each number, so '3' is '3' and not, for example, 'Ⅲ', or Babylonian numerals [st-andrews.ac.uk])

4) Using a symbol to divide the non-fractional and fractional parts of a number (or the division between the positional notation fields of base^0 and base^-1)

5) Cheap paper (or equivalent)
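Items 1, 2 and 4 combine into a single rule: each digit at position k contributes digit * base^k, with the separator marking where k drops below zero. A minimal Python sketch of that rule (my own illustration, not from the comment):

```python
def parse_decimal(s: str, base: int = 10) -> float:
    """Evaluate a positional numeral with an optional radix point.

    Digits left of the point sit at positions ..., 2, 1, 0; digits to
    the right continue at -1, -2, ... -- the separator's only job is
    to mark where the exponent turns negative.
    """
    whole, _, frac = s.partition(".")
    value = 0.0
    for k, digit in enumerate(reversed(whole)):
        value += int(digit) * base**k
    for k, digit in enumerate(frac, start=1):
        value += int(digit) * base**-k
    return value

print(parse_decimal("10.4"))  # Bianchini's first dotted number
print(parse_decimal("203"))   # works with no point at all
```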

This is not to say one cannot do arithmetic without any or all of them: obviously, one can - in the past, sand-tables, counting-boards, wax-tablets and abaci were used. It would be a waste of vellum to use it for scratch calculations.

I think the interesting thing here is not that someone discovered the utility of using a standardised divider between the fractional and non-fractional parts of a number, but how long it took to be generally adopted. I would hazard a guess that at the time Giovanni Bianchini started using the technique, most people could not read or write, and dissemination of novel knowledge between people who could read and write was remarkably slow, partly down to the need to hand-copy books – we are talking the 1440s here, and moveable-type printing was just taking off in Europe.

## (Score: 0) by Anonymous Coward on Wednesday February 28 2024, @04:18PM

The comments are always full of internet experts. It gives me a headache.