This new result contributes to the international effort to define the second with a much greater level of accuracy than before, enabling new scientific and technological advances
There’s a new record holder for the most accurate clock in the world. Researchers at the National Institute of Standards and Technology (NIST) have improved their atomic clock based on a trapped aluminum ion. Part of the latest wave of optical atomic clocks, it can perform timekeeping with 19 decimal places of accuracy.
Optical clocks are typically evaluated on two levels — accuracy (how close a clock comes to measuring the ideal “true” time, also known as systematic uncertainty) and stability (how efficiently a clock can measure time, related to statistical uncertainty). This new record in accuracy comes out of 20 years of continuous improvement of the aluminum ion clock. Beyond its world-best accuracy, 41% greater than the previous record, this new clock is also 2.6 times more stable than any other ion clock. Reaching these levels has meant carefully improving every aspect of the clock, from the laser to the trap and the vacuum chamber.
The team published its results in Physical Review Letters.
“It’s exciting to work on the most accurate clock ever,” said Mason Marshall, NIST researcher and first author on the paper. “At NIST we get to carry out these long-term plans in precision measurement that can push the field of physics and our understanding of the world around us.”
Indulge me in a rant. If we're going to redefine the second because of advancements in measurement sensitivity, wouldn't this be a good time to reconsider the SI structure?
Bad approximations of distances in the 18th century brought us the metric system. With the sort of precision we now have, not to mention the need for non-geocentric units as space increasingly becomes a field of research, why are we using a flawed system based on guesses from a few guys in France during the Enlightenment?
I've no issue with shorthand like AUs or light-years for large distances, but it feels like we should have the basic tenets of the universe as the basis. Like, the light-nanosecond for distance on the human scale (it's about 11.8 inches or 29.98 cm), and then reconfigure the system from first principles.
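For what it's worth, the light-nanosecond figure checks out; here's a quick back-of-the-envelope sketch in Python, using the exact SI value for the speed of light:

```python
# Back-of-the-envelope check of the light-nanosecond as a human-scale unit.
C = 299_792_458            # speed of light in m/s (exact, by SI definition)
light_ns_m = C * 1e-9      # distance light travels in one nanosecond, in metres

print(f"1 light-nanosecond = {light_ns_m * 100:.2f} cm")         # ~29.98 cm
print(f"1 light-nanosecond = {light_ns_m / 0.0254:.1f} inches")  # ~11.8 inches
```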
I'm not saying we should throw out measuring systems each time they get more precise, but a lot of cruft is grandfathered into what we currently use. We can't just go for further precision and then shrug and say "well, nothing we can do about it."
So, if I understand correctly, your beef here is not that the metre is a flawed basis for measurement but rather that the U.S. refuses to use metric? That's certainly a hill to die on, but using universal constants to define measurements seems the better route. The foot is just as arbitrary as the metre.
Since 2019, the metre has been defined as the length of the path travelled by light in vacuum during a time interval of 1/299792458 of a second, where the second is defined by a hyperfine transition frequency of caesium.
The metre was originally defined in 1791 by the French National Assembly as one ten-millionth of the distance from the equator to the North Pole along a great circle, so the Earth's polar circumference is approximately 40000 km.
In 1799, the metre was redefined in terms of a prototype metre bar. The bar used was changed in 1889, and in 1960 the metre was redefined in terms of a certain number of wavelengths of a certain emission line of krypton-86. The current definition was adopted in 1983 and modified slightly in 2002 to clarify that the metre is a measure of proper length. From 1983 until 2019, the metre was formally defined as the length of the path travelled by light in vacuum in 1/299792458 of a second. After the 2019 revision of the SI, this definition was rephrased to include the definition of a second in terms of the caesium frequency ΔνCs. This series of amendments did not alter the size of the metre significantly: today Earth's polar circumference measures 40007.863 km, a change of about 200 parts per million from the original value of exactly 40000 km, a figure that also reflects improvements in the accuracy of measuring the circumference.
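The roughly 200 ppm figure is easy to check; a minimal sketch, taking the quoted circumference at face value:

```python
# How far the modern polar circumference has drifted from the
# "exactly 40000 km" that the 1791 definition implied.
original_km = 40_000.0
modern_km = 40_007.863     # value quoted above

ppm = (modern_km - original_km) / original_km * 1e6
print(f"difference: {ppm:.0f} ppm")   # ~197 ppm, i.e. about 200 parts per million
```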
If you're using 1 over some arbitrary nine-digit number as the basis of measurement, it's a pretty clear sign the base unit makes no sense and serves to make the mathematics more complex, not more cohesive.
About the redefining part: we already have this arbitrary number, 1/299792458, and you basically want to change it? What would that help? I constantly use metric and imperial units concurrently; if you don't need 19 decimal places of accuracy it's not a big deal. 3 feet is 1 meter, 1 inch is 2.5 cm, 1 pound is 0.5 kg. The only one I can't calculate in my head is that 1 mile is 1.6 km, but if I need it quickly I just use 1.5. For everyday life this accuracy is good enough. I'm an engineer, not a scientist.
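Out of curiosity, here's roughly how far off those mental shortcuts are, using the exact conversion factors from the international yard and pound definitions:

```python
# Error of common mental-math conversions versus the exact factors.
approx = {
    "1 m in feet":   (3.0,  1 / 0.3048),   # exact: ~3.2808 ft
    "1 inch in cm":  (2.5,  2.54),
    "1 pound in kg": (0.5,  0.45359237),
    "1 mile in km":  (1.6,  1.609344),
}

for name, (rough, exact) in approx.items():
    err = (rough - exact) / exact * 100
    print(f"{name}: rough {rough}, exact {exact:.4f}, error {err:+.1f}%")
```

The mile shortcut is within about 1%, while the foot and pound shortcuts are off by roughly 9-10%, which is still fine for everyday estimates.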
There are these, but I suspect their main benefit is making physics equations use nicer numbers, not so much helping the layperson: https://en.wikipedia.org/wiki/Natural_units