# Is One A Number? According to ‘Mathematicks Made Easie,’ Yes

The ancient Greeks, and people for almost 2,000 years after them, argued over whether one was a number

“One is the loneliest number” isn’t just a song lyric. For mathematicians, it’s a truth.

One is unique. Four multiplied by one is four. Two thousand five hundred and seventy-three times one is two thousand five hundred and seventy-three. In mathematical terms, it’s called “unity” (which is where we get the word “unit”), and it has other strange properties: for example, the square root of one is one. Because it is so unlike other numbers, one wasn’t even considered a number for a long time.
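The properties described above can be checked in a few lines of Python (a simple illustration; the examples and numbers are the article's own):

```python
import math

# One is the multiplicative identity: multiplying by it changes nothing.
assert 4 * 1 == 4
assert 2573 * 1 == 2573

# One is also its own square root -- and its own square, cube, and so on.
assert math.sqrt(1) == 1
assert 1 ** 10 == 1

print("all identities hold")
```

No other number behaves this way: for any other n, multiplying by n changes the result, and n's square root differs from n.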

One is a number, according to modern mathematics at least, but it’s a strange number: writing in the *Journal of Integer Sequences*, mathematicians Chris Caldwell and Yen Xiong take readers through one’s controversial history.

Originally, because a number was defined differently, one wasn’t considered a number, but rather the font from which all other numbers flowed. Aristotle, Euclid and other Greek thinkers whose work is a foundation of mathematics didn’t think it was a number. Why? One source from the seventh century, Isidore of Seville, described the reasoning of most mathematical thinkers at the time: a number should be considered “a multitude made up of units,” the mathematically minded archbishop wrote. Under this definition, “one is the seed of number but not number,” he wrote. “Number,” rather than “a number,” was used to denote the whole concept of the world of numbers–a world that anyone who has ever stared at a math textbook in bewilderment can tell you isn’t much like ours.

In the late 1500s, write Caldwell and Xiong, a Flemish mathematician with the musical name of Simon Stevin came along and published a book called *De Thiende*, which explained how to represent fractions (¼ mile) as decimals (0.25 mile). This was a watershed moment in mathematics, the pair write, because one has to be seen as a divisible number for decimals to work.
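A quick sketch of Stevin's point, using Python's standard `fractions` module (my own illustration, not Stevin's notation): writing ¼ as 0.25 means splitting one into smaller parts, so one must itself be divisible.

```python
from fractions import Fraction

# A quarter mile as a fraction...
quarter = Fraction(1, 4)

# ...and the same quantity as a decimal, which carves one into
# tenths and hundredths: 0.25 = 2/10 + 5/100.
assert float(quarter) == 0.25
assert quarter == Fraction(2, 10) + Fraction(5, 100)

print(float(quarter))
```

If one were an indivisible "seed" rather than a number, the decimal expansion on the right-hand side would be meaningless.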

“Although he did not invent decimal fractions and his notation was rather unwieldy, he established their use in day-to-day mathematics,” writes *Encyclopedia Britannica*. “He declared that the universal introduction of decimal coinage, measures, and weights would be only a question of time.” (In fact, decimalized currency was still considered a risqué notion when Thomas Jefferson introduced it in the United States, while the metric system–which is based on the idea of decimalization–was a revolution that America has still not gotten on board with.)

However, this idea took some time to catch on, write Caldwell and Xiong. Almost a hundred years later, an English polymath named Joseph Moxon published the first English mathematical dictionary. Its title: *Mathematicks Made Easie*. Yes, really.

However, the concepts Moxon dealt with were anything but easy. Here’s how he explained the whole controversy surrounding one: Number, at least as “commonly defined,” is “a Collection of Units, or Multitude composed of Units,” he wrote. By that definition, “One cannot be properly termed a Number, but the begining [sic] of Number.”

But, he added, even though this definition was still commonly accepted, “to some,” including Moxon himself, “[it] seems questionable.” After all, if one was the beginning of the world of Number, it had to be a number. And besides, if one was not a number, then 3 - 1 would be 3, “which...is absurd.” This basic argument eventually took hold, and one came to be accepted as a number, changing math forever.
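Moxon's reductio can be sketched playfully in code (my own illustration, not anything Moxon wrote): if subtraction only removes things we agree to call numbers, then excluding one produces the absurdity he describes.

```python
def subtract_if_a_number(a, b, numbers):
    """Subtract b from a, but only if b counts as a 'number'."""
    return a - b if b in numbers else a

# Under the old view, one is merely "the begining of Number", not a number...
old_view = {2, 3, 4, 5}
assert subtract_if_a_number(3, 1, old_view) == 3   # "which...is absurd"

# Admit one as a number, and arithmetic behaves as it should.
new_view = old_view | {1}
assert subtract_if_a_number(3, 1, new_view) == 2

print("one is a number after all")
```

The absurd branch is exactly the point: refusing one the status of a number makes subtracting it a no-op, so the old definition cannot stand.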

As for Moxon, mathematicks wasn’t the only thing he made easie: he was also the author of *Mechanick Exercises on the Whole Art of Printing*, the first-ever manual for printers.