Those who have studied some physics might remember why minus 459.67 Fahrenheit is called "absolute zero," but for the rest, it's probably a bit confusing. Switching to Celsius won't help; absolute zero is minus 273.15 degrees on that scale. Is absolute zero ever zero degrees? To find the answer, one needs to look to a scientific temperature scale called Kelvin and how it evolved.
The Fahrenheit scale used in the United States dates back to the early 1700s, a time when scientists realized they needed a way to measure heat and cold and were inventing thermometers. German physicist Daniel Gabriel Fahrenheit is credited as the first to use mercury in a thermometer, and he created a measurement scale to go along with his invention. On Fahrenheit's scale, zero degrees was the temperature of a mixture of equal parts ice, water, and salt, and 212 degrees was the temperature at which water boiled.
Though Fahrenheit's temperature scale became widely used, other scientists of the time experimented with their own. In 1742, Swedish astronomer Anders Celsius used a 100-degree, or centigrade, scale that set the freezing point of water at zero and the boiling point at 100. (In 1948, the centigrade scale was renamed the Celsius scale by the Ninth General Conference on Weights and Measures in honor of its inventor.) With its similarity to the base-10 metric system, the Celsius scale became the standard in most of the world.
By the early 1800s, scientists studying the behavior of gases had determined that the lowest possible temperature for anything in the universe was minus 273.15 degrees Celsius. And in 1848, William Thomson (who would later be made a baron with the title Lord Kelvin) suggested that it would be convenient to call that temperature "absolute zero" and create a new scale starting there, eliminating all negative temperatures. The idea caught on, at least in science, and that absolute scale is now known as the Kelvin scale. On it, absolute zero is simply 0 kelvins, with no "degrees" at all.
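The arithmetic tying the three scales together is simple: Kelvin is Celsius shifted up by 273.15, and a Fahrenheit degree is 9/5 the size of a Celsius degree, offset by 32. A minimal sketch of those conversions (the function names here are my own, chosen for illustration):

```python
def celsius_to_kelvin(c):
    # Shift by 273.15 so absolute zero (-273.15 degrees C) lands at 0 K.
    return c + 273.15

def celsius_to_fahrenheit(c):
    # Fahrenheit degrees are 9/5 the size of Celsius degrees, offset by 32.
    return c * 9 / 5 + 32

# Absolute zero on each scale:
print(celsius_to_kelvin(-273.15))      # 0.0 K
print(round(celsius_to_fahrenheit(-273.15), 2))  # -459.67 F
```

The same functions confirm the article's fixed points: water freezes at 0 degrees Celsius (32 Fahrenheit, 273.15 kelvins) and boils at 100 degrees Celsius (212 Fahrenheit).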