GMT vs. UTC

Question: I read the explanation on your Local Time & UT page, but I’m still confused about the difference between GMT and UT. Could you expand on that please? — DV, Tucson, AZ

Answer: Absolutely. Time measurement systems are not the easiest thing to understand. Let’s start with GMT, which was the first global time system.

GMT stands for “Greenwich Mean Time,” and it was originally measured using a telescope at the observatory in Greenwich, England. Officially known as the “Royal Observatory, Greenwich,” it was founded in 1675 by Charles II to improve time measurement and navigation.

What they did at Greenwich was aim a telescope along the meridian (the north-to-south line passing overhead) and time the passage of stars. When the same star reappeared in the view, Earth had rotated once, and a “day” had elapsed. Averaged over many such observations, this defined the length of the mean day. England had a strong incentive for measuring time accurately.
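Here’s a minimal sketch of that averaging step in Python, using made-up clock readings for one star’s transit on three successive nights (the dates and times are invented purely for illustration). The interval between successive transits of the same star is one rotation of Earth relative to the stars, roughly 23 hours 56 minutes, slightly shorter than a solar day because Earth also moves along its orbit.

```python
from datetime import datetime

# Hypothetical clock readings of successive transits of the same star
# (invented values, for illustration only).
transits = [
    datetime(2024, 3, 1, 22, 14, 5),
    datetime(2024, 3, 2, 22, 10, 9),
    datetime(2024, 3, 3, 22, 6, 13),
]

# Each interval is one rotation of Earth relative to the stars.
intervals = [(b - a).total_seconds() for a, b in zip(transits, transits[1:])]
mean_rotation = sum(intervals) / len(intervals)

print(f"Mean rotation period: {mean_rotation:.0f} s "
      f"(about {mean_rotation / 3600:.2f} hours)")
```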

During the Age of Discovery, England was the preeminent seafaring nation. Portugal and Spain were up there too, but England had the clear lead in terms of ships, colonies, and technology. To claim a territory on the globe, it was necessary to describe the location of that territory unambiguously. And for that, you need two numbers: latitude and longitude.

Measuring latitude was easy. All they had to do was find Polaris (a.k.a. the North Star, officially α Ursae Minoris) and measure its angular altitude above the horizon using a sextant (essentially a precision protractor). That angle equals your latitude. Even on a rotating Earth, that always works, because Polaris sits very nearly directly above the North Pole.
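As a tiny worked example in Python (the sextant reading below is invented), the conversion is just degrees plus arcminutes. In practice navigators also applied a small correction, since Polaris actually sits about three-quarters of a degree from the true celestial pole.

```python
def polaris_latitude(degrees, arcminutes):
    """Polaris altitude above the horizon ≈ observer's latitude (north)."""
    return degrees + arcminutes / 60.0

# Hypothetical sextant reading: 32° 19'
lat = polaris_latitude(32, 19)
print(f"Approximate latitude: {lat:.2f}° N")
```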

Measuring longitude is much more difficult. When you travel east or west, what you see in the sky simply rotates around Polaris, so the stars alone can’t tell you where you are. To measure your longitude, you need to know what time it is in Greenwich compared to the time at your location. Earth turns 360° in 24 hours, so every hour of difference equates to 15° of longitude.

Accurate mechanical clocks became available in the early 18th century. This was partly in response to a £20,000 prize (several million pounds in modern currency) offered by the British Board of Longitude in 1714. After several attempts by others, that prize was eventually claimed by John Harrison, whose marine chronometer (as accurate seagoing clocks came to be called) proved itself on a sea trial in 1761. His early designs used counter-oscillating balance beams linked by springs, with bimetallic temperature compensation. Unlike a pendulum clock, his design kept highly accurate time even on rolling seas.

If you had a clock set to GMT onboard your ship, and compared it to your local time (measured by the Sun), you could easily calculate your longitude according to the formula: (local time – GMT, in hours) × 15° = longitude.
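Here’s that formula as a short Python sketch, using invented clock readings (the values are mine, purely for illustration). By the sign convention below, positive longitude means east of Greenwich and negative means west.

```python
def longitude_from_clocks(local_hours, gmt_hours):
    """Longitude in degrees: positive = east of Greenwich, negative = west.

    Earth turns 360° in 24 hours, i.e. 15° per hour of time difference.
    """
    return (local_hours - gmt_hours) * 15.0

# Hypothetical readings: the Sun says it's local noon (12.0 h),
# while the chronometer set to GMT reads 16:30 (16.5 h).
lon = longitude_from_clocks(12.0, 16.5)
print(f"Longitude: {abs(lon):.1f}° {'E' if lon >= 0 else 'W'}")   # 67.5° W
```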

All was cool until about 1950. When atomic clocks were invented, scientists noticed the length of a mean day was not constant. It could vary from 24 hours by a few milliseconds (1 ms = 0.001 s). There are several reasons for this. Gravitational interactions between the Earth, Moon, and Sun have been increasing the length of a “day” by about 0.0024 seconds per century for eons. Shorter-term variations, in both directions, occur seasonally and are caused by the normal redistribution of mass around the globe. Rainfall, melting glaciers, earthquakes, sedimentation, prevailing winds, and even large meteor impacts create measurable differences.

Here’s where UT, in its modern form UTC (Coordinated Universal Time), comes in. Adopted in its current form in 1972, UTC is based on a network of atomic clocks accurate to around one billionth of a second per day (0.000000001 s/day). When needed, at the end of June or December, the “world clock” is adjusted by a whole second. When you hear about “leap seconds” being added to (or, in principle, subtracted from) the year, that’s the adjustment keeping atomic time in step with time measured by Earth’s rotation, the role GMT once played.
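To see how those whole seconds pile up, here’s a rough back-of-the-envelope sketch in Python. The daily excess used below is an assumed, illustrative value in the few-millisecond range mentioned earlier, not a measured one; the 0.9-second threshold is the real tolerance the timekeepers allow before inserting a leap second.

```python
# Assumed average excess of the rotational day over 86,400 atomic seconds.
# Illustrative value only; the real excess varies from year to year.
EXCESS_PER_DAY = 0.0015           # seconds

offset = 0.0                      # accumulated rotation-vs-atomic difference
leap_seconds = 0

for day in range(1, 3 * 365 + 1):
    offset += EXCESS_PER_DAY
    if offset > 0.9:              # timekeepers keep the offset under 0.9 s
        leap_seconds += 1         # insert a leap second...
        offset -= 1.0             # ...which resets the accumulated drift

print(f"After {day} days: {leap_seconds} leap second(s) inserted, "
      f"residual offset {offset:.3f} s")
```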

Of course, for the average citizen, the difference between GMT and UT can be largely ignored.

Next Week in Sky Lights ⇒ How We Measure the Distance to the Moon
