It’s approaching the equinox, that time of the year when the day and the night are of almost equal length. It’s the vernal or spring equinox here in the Southern Hemisphere, and the autumnal equinox in the Northern Hemisphere. For a number of reasons, the day and night are not of exactly equal length, so an alternative definition is the moment when the plane of the Earth’s equator passes through the centre of the Sun.
At around this time of the year many countries adjust their clocks to take advantage of the increasing daylight in the evening. Most countries that do this move their clocks forward in spring and back in the autumn, hence the mnemonic “spring forward, fall back”.
The reasons for “Daylight Saving Time” are debatable. The original intent was to align the working day more closely with the hours of daylight, leaving more daylight at the end of the day. Without Daylight Saving Time, people rose in the morning after an hour of usable daylight had already passed. It was during the two World Wars that “Daylight Saving Time” was first practised extensively in many countries.
Nowadays we are accustomed to “Daylight Saving Time”, but naturally there are dissenters who believe that it is unnecessary or counterproductive. A farmer may point out that his cows don’t observe “Daylight Saving Time”, so the changes in the clocks are of no benefit to him, and can even cause him inconvenience.
“Daylight Saving Time” is around one hundred years old, so it is a fairly recent invention. Indeed the synchronisation of clocks, even in a single country, is a recent phenomenon. Now we have clocks synchronised globally.
Computers have clocks. Indeed the very functioning of a computer requires a very accurate clock, so it should be no surprise that we take advantage of this requirement and put the computer’s clock to use beyond the computer itself.
In the early days of computers, the clocks were not synchronised between computers. In fact that synchronisation had to wait for the development of networked computers. The people who used these isolated computers had to set the clocks manually, which was acceptable when computers were rare, but became a chore when computer usage started to climb and more desks had computers on them.
In a computer there are two sorts of clocks, a hardware clock and a software clock. The hardware clock is the fundamental clock in a computer system and it ticks millions or, these days, billions of times per second. If you’ve ever browsed the specs of a computer system you will likely have noticed the clock specification, the (these days) gigahertz rating. This is the clock speed, and it is closely related to the number of operations that the computer can perform in one second.
The original computers had speeds rated in kilohertz, so today’s computers are of the order of a million times as fast as the old clunkers.
The software clock is built on the hardware clock: it takes the stream of ticks and translates it into a human-usable date and time. It can’t do that without reference to the outside world, as the hardware clock produces merely a stream of “ticks” and doesn’t understand the concepts of seconds, hours, days, months or days of the week. There is no weekend in the hardware clock’s world.
In the early days of computing, the reference to the outside world meant the operator typing in the time, and the software clock tying that to a particular tick of the hardware clock. From then on the software clock simply counts ticks and works out the human-usable date and time from that.
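That arrangement can be sketched in a few lines of Python (the class, the tick rate and the times are invented for illustration): the operator supplies one wall-clock reading, and every later tick count is converted relative to that anchor.

```python
from datetime import datetime, timedelta

class SoftwareClock:
    """Toy model of a software clock: anchor a hardware tick counter
    to a human-supplied wall time, then derive all later times
    purely by counting ticks."""

    def __init__(self, ticks_per_second):
        self.ticks_per_second = ticks_per_second
        self.anchor_tick = None
        self.anchor_time = None

    def set_time(self, current_tick, wall_time):
        # The "operator typing in the time": pair one tick with one wall time.
        self.anchor_tick = current_tick
        self.anchor_time = wall_time

    def now(self, current_tick):
        # Ticks elapsed since the anchor, converted to seconds.
        elapsed = (current_tick - self.anchor_tick) / self.ticks_per_second
        return self.anchor_time + timedelta(seconds=elapsed)

clock = SoftwareClock(ticks_per_second=1000)
clock.set_time(current_tick=0, wall_time=datetime(1970, 1, 1, 9, 0, 0))
print(clock.now(current_tick=3_600_000))  # 3,600,000 ticks = 1 hour: 1970-01-01 10:00:00
```

Everything the clock reports is only as good as that one anchor, which is why a mistyped time, or a drifting tick rate, throws the whole thing off until the next correction.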
As computers started to be networked together, a problem arose. Computer A’s and computer B’s clocks will have been set by a human as closely as the human could manage, but they may still be several seconds apart, a lifetime in computer terms. This can cause issues such as money appearing in the receiving bank account before it disappears from the sending account when the transaction is automated, and all transactions are automated these days.
At the same time as computers were being networked, some far-seeing people decided to set up a network of atomic clocks. These clocks are much more accurate than computers’ hardware clocks, which can “drift” because not all computer clocks tick at exactly the same rate. This time service is provided over the Internet and has been almost universally adopted.
Your computer contacts a local time source, which contacts a less local time source, and so on until one of the top-tier time sources is reached. Thus they all synchronise, directly or indirectly, with a top-tier time source. The top-tier sources synchronise with each other, so eventually all computer clocks are synchronised.
A computer synchronises with its time source by sending a packet of data to the time source, which replies. The computer compares the timestamps and repeats the process a few times to get an average; then, since the packet has to go out and back, it halves the average round-trip time and estimates the time at the time source as the time source knows it. It then adjusts its own clock to match, and it does this continually, constantly updating the clock as necessary, which gives a very accurate value for the local time.
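The arithmetic behind that exchange can be sketched with the four timestamps the round trip yields. A simplified Python version (the timestamps are invented, and a real time service does far more filtering than this):

```python
def estimate_offset(t1, t2, t3, t4):
    """Estimate how far the local clock is behind (+) or ahead (-) of the server.

    t1: client clock when the request is sent
    t2: server clock when the request arrives
    t3: server clock when the reply is sent
    t4: client clock when the reply arrives
    """
    # Round trip on the network: total elapsed minus the server's processing time.
    delay = (t4 - t1) - (t3 - t2)
    # Halving assumes the outward and return trips take equal time.
    offset = ((t2 - t1) + (t3 - t4)) / 2
    return offset, delay

# Invented example: client clock 0.3 s behind the server, 0.1 s each way.
offset, delay = estimate_offset(t1=10.0, t2=10.4, t3=10.45, t4=10.25)
print(offset, delay)  # offset ≈ 0.3 s, delay ≈ 0.2 s
```

The "halve the round trip" step is the weak point: if the outward and return paths have different latency, the estimate is biased, which is one reason the process is repeated and averaged.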
One might question the necessity of this accuracy. Isn’t it a bit pointless to set clocks with such nit-picking accuracy? In a news story which I can’t now track down, a financial organisation lost millions, maybe billions, of dollars because it did not handle a “leap second” correctly. Automatic stock market trading programs made thousands of trades in the few milliseconds that the company was out of sync with the rest of the world. But to you or me, doing our “online banking”, it won’t matter.
It’s worth remembering that the world-wide time system is pretty new. These days we are accustomed to being able to contact someone on the other side of the world and to know what the time is where they are. But this is new.
It used to be the case that the local time was a sort of local consensus and did not rely on clocks. When clocks became more common, the local reference time source was the clock in the church spire. Time was still local, as church clocks were not synchronised with one another.
As timekeeping became more important, the local time zone might expand to cover a town or a city. Clocks could be synchronised over a small area by the use of travelling clocks and watches, and really accurate clocks and watches enabled the explorers from Europe to travel the world.
The advent of long-distance travel by railway and of telephonic communication created a need for consistent time information across countries and continents. Even so, standard time was only legislated in the United States in 1918, and it subsequently spread to those parts of the world that were not already using Greenwich Mean Time.