NEWTON, Ask A Scientist!
Historical Meter Accuracy
Name: Crystal S.
Status: Student
Age: 14
Location: N/A
Country: N/A
Date: 2001

Is the early 1700 French calculation of the distance from the equator to the North Pole correct by today's technology? They calculated that their meter would represent one ten millionth of the distance from the equator to the North Pole.

The historical meter, defined by the French Academy of Sciences as 1/10,000,000 of the quadrant of the Earth's meridian running from the North Pole to the equator through the city of Paris (of course), was replaced in 1889 by the distance between two scratch marks on a bar composed of 90% platinum and 10% iridium, kept in Paris (I think, but I am not certain). That definition lasted until 1960, even though it was obviously inconvenient: the standard existed in only one place, and the precision with which the two scratch marks could be located was limited by temperature changes and the like. The meter was then redefined in terms of an emission line of the M = 86 isotope of the rare gas Kr -- specifically, 1 meter = 1,650,763.73 wavelengths of that orange emission line. This of course had the tremendous advantage that any well-equipped lab had access to the primary standard of length.
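As a quick check on the original question, here is a sketch of how close the 1790s survey came. The modern meridian-quadrant value used below is the commonly quoted figure from the WGS84 reference ellipsoid (an assumption on my part, not a number from the text above):

```python
# Modern pole-to-equator meridian arc length (WGS84 ellipsoid, commonly
# quoted value) versus the historical target of exactly 10,000,000 m.
MERIDIAN_QUADRANT_M = 10_001_965.729  # assumed modern value, in meters
HISTORICAL_TARGET_M = 10_000_000.0    # the 1790s definition: 10^7 meters

error = (MERIDIAN_QUADRANT_M - HISTORICAL_TARGET_M) / HISTORICAL_TARGET_M
print(f"fractional error: {error:.5%}")  # roughly 0.02%
```

In other words, the original survey-based meter was short by only about 0.2 mm, an impressive result for 18th-century instruments.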

The explosive advances in technology made that definition obsolete by the 1980s! Specifically, the invention of the laser made it possible to measure the speed of light with an accuracy that surpassed the definition of the meter itself. The international body that concerns itself with such issues realized that the definition of the meter was caught on the horns of a dilemma, since it was clear that the speed of light would only be measured with ever-increasing precision and accuracy. The dilemma was resolved by treating the speed of light in a vacuum as an exact, defined constant. (There are some speculations running around now that this might not be so, but they are speculations and remain to be proven true.) The best measured value of the speed of light was 299,792,458 meters/sec, and it was decided that the unit of length, the meter, would be defined in terms of that now-fixed speed. So the length of the meter is allowed to "float": it is defined as the distance light travels in a vacuum in 1/299,792,458 of a second. This has the further advantage that, so far as we know, all electromagnetic radiation in a vacuum has the same speed, so the definition of the meter does not depend upon a particular substance such as Kr.
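The arithmetic behind the "floating" meter can be sketched in a few lines. Using exact rational arithmetic (to avoid floating-point rounding) shows that the defined speed and the defined time interval reproduce the meter exactly:

```python
from fractions import Fraction

# Speed of light in vacuum, in m/s -- an exact constant by the 1983 definition.
C = 299_792_458

# One meter is the distance light travels in a vacuum in 1/299,792,458 s.
one_meter = C * Fraction(1, C)
print(one_meter)  # prints 1 -- exactly one meter, by construction
```

The point is that the definition is circular on purpose: fixing c means any future improvement in measurement refines the realized length of the meter, not the value of c.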

This, however, doesn't get one completely off the hook, because the question naturally arises, "Well! What's a second?" Currently, the second is defined in terms of a specific transition of the M = 133 isotope of the element Cs. Specifically, the second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of that isotope of Cs. The advantage of this is that the second is based upon "counting" the oscillations, which, in principle, is exact.

Look up the NIST website for the very interesting story of how the fundamental constants are measured.

Vince Calder


NEWTON is an electronic community for Science, Math, and Computer Science K-12 Educators, sponsored and operated by Argonne National Laboratory's Educational Programs, Andrew Skipor, Ph.D., Head of Educational Programs.

For assistance with NEWTON, contact a System Operator at Argonne's Educational Programs:

Educational Programs
Building 360
9700 S. Cass Ave.
Argonne, Illinois
60439-4845, USA
Update: June 2012