
Keeping Time

Have you noticed that as man evolved over time, the complexity of his lifestyle evolved right along with him? New questions were asked, new discoveries were made, and new technologies were created to keep pace with man's burgeoning quest for new and better. The methods used for measuring time were no exception. They evolved dramatically as well since they were first introduced.

It is believed that as early as the last ice age -- twenty thousand years ago -- models of the first calendars were scratched into sticks and bones by early hunters hoping to chronicle the cycles of the moon. The Sumerians divided the year into thirty-day months and the Egyptians created the concept of a three-hundred-and-sixty-five-day year, long before the first year of the Common Era (also known as “AD”) began.

There was really no call for people to know what time of the day it was until approximately five thousand years ago. That is when the first sundials and water clocks came into existence to serve that need.

As civilization progressed, so too did the desire to increase the accuracy with which time was recorded. Advancements in technology and the development of human ability allowed this to happen. Rising from all this is the ongoing story of the advent of calendars and clocks, starting as early as Stonehenge and continuing on past the age of digital watches.

A trip through time - Keeping Time - Clocks

How many hours are there in a day?

It was the Egyptians who first answered “24” to that question, thereby beginning a long and lasting trend that we still follow today. The Egyptian day was composed of twelve night hours, ten day hours, and two twilight hours, one right before the sun came up and one right after the sun went down.

The Egyptians were only one of many ancient peoples, including the Babylonians, Chinese, Greeks, and Romans, who developed a sundial for use in counting hours. Using this as a foundation, water clocks and sand clocks soon followed.

Along with each new time-measuring device came a surge of interest in measuring time more accurately until eventually the first mechanical clock that needed neither water nor sand was constructed. In the late sixteenth century, Galileo Galilei did a lot of work with pendulums, and in the eighteenth century John Harrison spent nearly his entire life trying to construct a clock that could help sailors determine longitude.

In 1665, the invention of the hairspring led to an increase in the accuracy of small timepieces. Automatic winding timepieces were designed in 1770, and by the early 1900s watches worn on the wrist were the ultimate accessory. Novelty watches were also the fashion, with Walt Disney’s “Mickey Mouse” making his debut in 1933, pointing out hours and minutes.

By 1952, the first digital watch was manufactured. And today, modern watches allow you to do just about anything, including monitoring your heart rate, watching TV, and connecting to the internet, all the while doing what they were originally intended to do: keep accurate time.

All of these developments demonstrate man's continuing interest in time and his desire to measure and record it accurately. The importance of the clock stems from its ability to help humans keep track of time, a commodity that is too precious to waste.

Keeping Time - Clocks - People Begin Telling Time

“Hide not your talents. They for use were made. What’s a sundial in the shade?”
Benjamin Franklin

Imagine what life would be like without any clocks at all. Essentially, there would only be day and night. When man first became interested in keeping track of time, he turned to the first natural clock available: the sun. Though night was considered to be a different and possibly dangerous phenomenon, daytime, thanks to the sun, was considered to be both friendlier and more utilitarian, and it could also be measured. The Egyptians were the first to harness the shadows the sun cast, designing a device to count hours. A stake called a gnomon, named for the Greek word “to know,” was placed upright in the ground. The higher the sun was overhead, the shorter the shadow would be; it lengthened both in the morning and toward the end of the day. When the sun was in the east, the shadow it cast fell to the west, and vice versa; so, both the size and the direction of the shadow were factors in determining the time. By slanting the gnomon, people could compensate for the fact that days were not all the same length, but rather longer in the summer. Such techniques kept their calculations accurate. Sundials were the main tool used to measure time right up until the beginning of the Renaissance in the fourteenth century.
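
The geometry behind the slanted gnomon can be made concrete. For a horizontal sundial whose gnomon is tilted parallel to the earth's axis, the angle between the shadow and the dial's noon line depends only on the observer's latitude and on how far the sun is from solar noon. The sketch below illustrates that relationship; the 15-degrees-per-hour figure and the example latitude are illustrative assumptions rather than details from the passage above.

    import math

    def hour_line_angle(hours_from_noon: float, latitude_deg: float) -> float:
        """Angle (in degrees) between a horizontal sundial's shadow and its noon line,
        assuming a gnomon slanted parallel to the earth's axis."""
        hour_angle = math.radians(15.0 * hours_from_noon)  # the sun moves ~15 degrees per hour
        latitude = math.radians(latitude_deg)
        return math.degrees(math.atan(math.sin(latitude) * math.tan(hour_angle)))

    # Example: at 40 degrees north, three hours after solar noon
    print(round(hour_line_angle(3, 40.0), 1))  # ~32.7 degrees from the noon line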

The major shortcoming with sundials was that they required the sun. This made them useless when it was cloudy, at night, or inside a building. Water clocks were the next major invention, but fault could easily be found with them as well. Precise water levels were marked off in a jar that was filled by water trickling into it from another jar above it. Despite the accuracy of the markings and the timings associated with them, there was no way to account for the inaccuracies that occurred when water froze or evaporated too quickly, so water clocks soon fell into disfavor.

Sandglasses came next. They worked well when trying to determine short periods of time, but it was very difficult to find one that could last through the night, and people generally didn’t wake up in the middle of the night to flip the sandglass over at precisely the right time. Sandglasses were also used for a myriad of other things, from measuring cooking times to measuring the length of sermons and lectures. They were also used by sailors to measure the speed of their ship. A long rope was knotted every 47 ¼ feet and thrown over the stern of the boat. Sailors then pulled the rope back onto the ship while the sandglass measured how long it took them to do so. If they pulled 47 ¼ feet of rope (one knot’s worth) back onto the ship in 28 seconds, the ship was traveling at a speed of 1 nautical mile (or one knot) per hour.
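
The arithmetic behind the term “knot” checks out; here is a quick sketch, using the modern figure of roughly 6,076 feet per nautical mile (an assumption not stated above).

    # One knot's worth of rope (47.25 ft) paid out in 28 seconds is very close
    # to a speed of one nautical mile per hour.
    FEET_PER_NAUTICAL_MILE = 6076.1   # modern value, used here for illustration
    SECONDS_PER_HOUR = 3600

    rope_per_interval_ft = 47.25
    interval_s = 28

    speed_ft_per_s = rope_per_interval_ft / interval_s
    speed_knots = speed_ft_per_s * SECONDS_PER_HOUR / FEET_PER_NAUTICAL_MILE
    print(round(speed_knots, 2))  # ~1.0 nautical mile per hour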

The main problem with both of these methods was that they relied upon the flow of something, water or sand, to count and track time.

Harnessing the power of fire was the next step man took as he traveled through the evolution of time-measuring devices. Scholars observed that if the rate at which a candle burned could be determined, then candles of uniform width with horizontal markings on them could be used to measure how much time had passed. And so, the concept of candle clocks was born. One nice thing about candle clocks was that, unlike sundials, these measuring devices could be used indoors and at night just as well as outdoors during the day.

Despite all of these attempts at improvement, the sundial was still the most widely used tool for measuring time right up until the fourteenth century when mechanical clockwork devices took over. The tools used to measure time may have changed dramatically over the last 4000 years, but the search for easier to use and more accurate devices still goes on.

Keeping Time - Clocks - The ticking of the clock

“The clock, not the steam engine, is the key machine of the modern industrial age.”
Lewis Mumford

Rotation of the Earth
The only part of a clock’s makeup that could be considered “natural” is its twenty-four-hour cycle. Twenty-four hours is the amount of time it takes for the earth to rotate once on its axis, so it makes sense for that to be the amount of time used to define a day. Within the realm of that day, however, there was never any rule saying that we must have 60 minutes in an hour and 60 seconds in a minute; that concept is entirely man-made. That subdivision originated because the Mesopotamians didn’t like having to deal with fractions all the time. Since 60 can be divided in so many different ways, they chose to base their time system on the number 60, and so it continues to this day.
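
A quick sketch makes the Mesopotamians' point about 60 concrete: it has an unusually rich set of divisors, so halves, thirds, quarters, fifths, and sixths of an hour all come out as whole numbers of minutes.

    def divisors(n: int) -> list[int]:
        return [d for d in range(1, n + 1) if n % d == 0]

    print(divisors(60))   # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
    print(divisors(100))  # [1, 2, 4, 5, 10, 20, 25, 50, 100] -- fewer ways to split evenly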

When mechanical clocks came into being, therefore, it was natural to develop clocks whose hours would always have 60 minutes. Well . . . maybe not always. During the French Revolution, from 1793 to 1795, the French completely changed the clock so that there were 10 hours in a day, 100 minutes in an hour, and 100 seconds in a minute. The decimal system was accurate, but too hard for people to learn, and so it didn’t catch on. Although the mechanical clock didn’t invent hours and seconds, it standardized them. Whether keeping track of 60 or 100 minutes in an hour, imbalances in nature could not alter exactly what the hands on the clock stood for.
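
For the curious, here is a minimal sketch of how an ordinary clock reading would map onto that Revolutionary decimal clock; the function name and the examples are purely illustrative.

    # French Revolutionary decimal time: 10 hours/day, 100 minutes/hour,
    # 100 seconds/minute, i.e. 100,000 decimal seconds per day.
    def to_decimal_time(hour: int, minute: int, second: int) -> tuple[int, int, int]:
        fraction_of_day = (hour * 3600 + minute * 60 + second) / 86400
        decimal_seconds = round(fraction_of_day * 100_000)
        dec_hours, remainder = divmod(decimal_seconds, 10_000)
        dec_minutes, dec_seconds = divmod(remainder, 100)
        return dec_hours, dec_minutes, dec_seconds

    print(to_decimal_time(12, 0, 0))   # (5, 0, 0)  -- noon is five decimal hours
    print(to_decimal_time(18, 30, 0))  # (7, 70, 83)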

Prior to pendulum clocks, mechanical clocks couldn’t accurately keep track of seconds. Though it was originally Galileo who observed the regularity of the movement of a pendulum, it wasn’t until 1656 that Christiaan Huygens patented the first pendulum clock. Though the first pendulum clocks were driven by weights, most of the ones you buy nowadays have the pendulum movement powered by an electric battery.

The quartz era of clock making followed right behind the pendulum’s. Quartz crystals vibrate millions of times per second with such precision that they reveal irregularities in the movement of the earth itself. J.W. Horton and Warren A. Marrison developed the first quartz clock, though it took up most of a small room. Quartz movements are now the most popular type on the market and are found inside everything from calculators and computers to wristwatches and, of course, clocks. Nowadays, quartz clocks are so precise that they allow even the casual user to keep track of time to the point where only one second is lost every ten years.

Though the progress made from the origin of the mechanical clock up through the achievements of the quartz clock was broad and far reaching, more precision was yet to come. The second, as we knew it, was soon to be reinvented with the arrival of the atomic clock, with which scientists and engineers now keep track of milliseconds, microseconds, nanoseconds, picoseconds, and beyond. This ever-continuing march toward measuring perfectly accurate time seems to leave forgotten the time when clocks first started measuring hours without minute hands, and when seconds were only imagined, not made increasingly smaller.

Digital Clocks

“The hours of folly are measured by the clock, but of wisdom no clock can measure.”
William Blake

The first digital watch was produced in 1952, and since then they’ve become the most popular form of time-telling device available to the general public. Most have a four to six digit display powered by either light emitting diodes (LED) or a liquid crystal display (LCD). Both use small enough amounts of electrical power to allow their batteries to last for a long time. Some can even last for up to 300,000 hours—the equivalent of approximately 34 years.

Individual digits on these clocks are formed by merging up to seven separate lines that connect to form the various shapes needed to display a certain number. If you look closely at the numbers being formed by a digital watch or clock, you will notice they are not solid figures, but rather a specific combination of these lines.
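
Those “lines” are the familiar seven segments of an LED or LCD digit. Below is a minimal sketch of the standard encoding, using the conventional a–g segment labels (a = top, b = upper right, c = lower right, d = bottom, e = lower left, f = upper left, g = middle); the labeling convention is an assumption, not something stated above.

    # Which segments light up for each digit on a seven-segment display.
    SEGMENTS = {
        "0": "abcdef", "1": "bc",     "2": "abged",   "3": "abgcd",   "4": "fgbc",
        "5": "afgcd",  "6": "afgedc", "7": "abc",     "8": "abcdefg", "9": "abcdfg",
    }

    def segments_for(display: str) -> list[str]:
        """Return the lit segments for each digit in a reading such as '12:45'."""
        return [SEGMENTS[ch] for ch in display if ch.isdigit()]

    print(segments_for("12:45"))  # ['bc', 'abged', 'fgbc', 'afgcd']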

Since its development in the early 1950s, the digital clock has come a long way. It took thousands of years to develop a clock that ticks while running on mechanical energy and another few centuries after that to produce one that utilizes the more accurate and longer-lasting power of the battery. In the fifty years since battery- or electrically powered clocks were first introduced, these timepieces have come to measure much more than just time. Now these electrically powered clocks can come equipped with talking alarms, Global Positioning Systems (GPS), radios, thermometers, timers, calendars, as well as moon, sunrise, and various time zone indicators. In a way similar to that of the ancient Egyptians, who perhaps never envisioned counting minutes, the makers of mechanical clocks at the time of the Industrial Revolution probably never imagined all of the things that people would be able to do with battery-operated devices only two hundred years later.

Some people worry about the effect that digital clocks will have on the upcoming generation of children and their ability to tell time. Back when the majority of clocks were analog, with hour and minute hands on their face, it was more difficult for a child to tell time because they had to interpret what roles the hour, minute, and second hands played in the time-telling process. Now, all they have to do is read the numbers on the screen. Do all children know what “quarter past” or “quarter of” an hour means? In some languages, such as German, referring to increments of fifteen minutes in this way is the most common way of communicating this time interval, while in America “fifteen past” is acceptable as well. Without the hands of a clock present, there is no way to literally see that fifteen minutes is one-fourth of the way around the clock. Therefore, with a digital clock, there would be no logical reason to assume that fifteen is equal to one-quarter of anything. This is especially so because people are used to dealing with things like our monetary system, where 25, not 15, is often associated with a quarter of something. Some believe that the widespread use of technology, like digital clocks and calculators, will lessen the mental ingenuity of our youth. Will cultures with fewer digital clocks and fewer calculators in use produce better thinkers and better mathematicians? Only time will tell.

The Age of The Atom: reaching perfect time

“Life is a constant oscillation between the sharp horns of dilemmas.”
H.L. Mencken

An Atomic Clock
We are currently living in the age of atomic time. This is the culmination of a dream of countless astronomers and scientists—to finally be able to keep track of time “perfectly.” Even though it’s known and accepted that we can now measure time to a minute fraction of a second, it is also known and accepted that the earth neither rotates on its axis nor revolves about the sun at a perfectly constant rate. Because of the latter, we will always have to adjust our time to agree with the earth’s movements, no matter how accurately we can measure time. Otherwise, as we have discussed in previous articles, our calendars and seasons will eventually be off. So the question can be posed: could it be that time as it is recorded now is too perfect?

The question “what time is it?” can no longer be answered by looking to the sky. Some like to whimsically say that time is the way we divide up eternity or that time is what you experience while beach sand flows through your fingers. But if it’s a time standard you are looking for, the new home for the time standard in the United States is the US Naval Observatory in Washington, DC. When the USNO opened in 1830, its main purpose was to keep track of most of the Navy’s navigational tools. It has since become the government’s source of precise time (USNO), supplying accurate time to the government and all branches of the service via the Department of Defense. It is also the center for tracking all GPS systems worldwide. Currently, USNO utilizes 61 atomic clocks, among other things, to keep its timekeeping precise. Watching time this closely is important because modern electronic systems depend on precise time to work properly. For example, LORAN and GPS navigational systems would position a body 10 feet off its mark for every 10 billionths of a second the time was off.
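
That figure of roughly 10 feet per 10 billionths of a second follows directly from the speed of light, as the following minimal check shows.

    # Radio signals travel at the speed of light, so a receiver's clock error
    # translates directly into a ranging (position) error.
    SPEED_OF_LIGHT_FT_PER_S = 983_571_056   # ~299,792,458 m/s expressed in feet

    def ranging_error_ft(clock_error_seconds: float) -> float:
        return clock_error_seconds * SPEED_OF_LIGHT_FT_PER_S

    print(round(ranging_error_ft(10e-9), 1))  # 10 nanoseconds -> ~9.8 feet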

Atomic clocks are so named not because they are powered by atomic energy but because they use certain properties of atoms that can be measured precisely and then used to define time. Time is literally being measured based upon the behavior of the atoms of a given element. Today precise time, currently the most accurate way possible to measure time, is calculated according to the oscillations of the cesium-133 atom. Though all atoms oscillate or vibrate, some do so more regularly than others. The first atomic clock, based upon ammonia, was built in 1949. However, it soon became apparent that the cesium-133 atom oscillated more frequently (9,192,631,770 times per second) and more regularly than an ammonia molecule, so atomic clocks began to be built in 1952 around the cesium atom. In 1967 the General Conference on Weights and Measures decreed that the world standard for measuring the second would now be based on the vibrations of the cesium atom. The atomic clocks built in 1967 were 1 second off every 1.4 million years. The atomic clocks being built in 1999 were 1 second off every 20 million years. It is also worth noting that though the use of anything besides cesium may not be internationally recognized, atomic clocks can also be built using hydrogen or other elements that may perhaps lower the overall cost.

Two of our most current time standards (TAI and UTC) are derived from this atomic system of measurement. Right now the Bureau International des Poids et Mesures is in charge of keeping both of these atomic time scales as accurate as possible by keeping them closely in line with the cesium-based SI unit of the second. SI is an abbreviation for the International System of Units, which maintains the standards for time (second) and many other measurements, including length (meter) and mass (kilogram).

International Atomic Time (TAI) is the first of these two time standards. It is based upon information gathered from observing cesium oscillate in hundreds of clocks at the same moment in time in laboratories worldwide. This information is funneled to a central location where it is compiled and used to regulate the passing of time down to the nanosecond. Since TAI is so uniform and unwavering, it does not account for the variations in the rotation of the earth. Therefore Coordinated Universal Time (UTC), which is the second time standard and the one commonly adhered to by civilians throughout the world, has been derived from International Atomic Time (TAI). What makes Coordinated Universal Time (UTC) different is the inclusion of leap seconds, which allow atomic time to make adjustments to its otherwise constant rate for the sake of alignment with the earth’s rotation on its axis and its movement about the sun.

In 2002, the United States National Institute of Standards and Technology (NIST) built a clock that would not be off by more than 30 billionths of a second for the entire year. This was possible due to the fountain principle, named for the fountain-like motion of the cesium atoms involved. It was applied for the first time to the NIST-F1 atomic clock in Colorado. This NIST clock is among the most accurate in the world, with an error so low that the clock would not gain or lose more than a second every sixty million years. Now you can clearly see why, even though the flaws in the earth’s rotations are relatively small from our perspective, adjustments to the way we present time to the public are needed to accommodate these natural flaws.

There are over sixty institutions worldwide like NIST, all striving to develop a timekeeping device that is 100% accurate. Together, the United States, France, the other countries that hold national laboratories for the study of time measurement, and all countries that adhere to UTC are forming a bond to make this dream become a reality. If we all work together, maybe the homeowner will soon have clocks that automatically adjust for daylight saving time and automatically synchronize with atomic clocks through radio waves. The strides we have made in the last 50 years have been monumental. Who knows what the next 50 years will bring.

When Time Gets Personal: A History of the Watch

“Where human anatomy meets data processing, there are just two important devices: the brain and the wristwatch. The brain is nice, but it doesn't tell time very well.”
James Gleick

Since their conception in the 1400s, watches have come a long way. They are, perhaps, the most effective way that man has yet devised to keep track of personal time. One no longer has to observe the sun, travel to the center of town, or listen for distant bells to figure out what time it is; instead, he can simply look down at his wrist or pull out a pocket watch. Some modern watches even glow in the dark, because it’s important that if we do wake up halfway through the night, we know we did so at exactly three hours, thirty-four minutes and ten seconds past midnight.

In 1485, Leonardo da Vinci sketched the first fusee for a clock. A fusee is a grooved pulley that stabilizes the rate at which winding clocks move from one tick-tock to the next. Over one hundred years later, this device was adapted for use in watches as well. In 1524, Peter Henlein of Nuremberg, Germany, manufactured the first pocket watch, also known as the “Nuremberg Egg” for its shape and place of origin. He is also credited with inventing the first portable watch using a mainspring. The mainspring powered the watch with the energy it stored when wound: the more you wound it, the more energy it had to move the gears that moved the hands. Other similar watches were soon to follow, but all were notoriously inaccurate, had only one hour hand because people kept time by the hour and not the minute then, and had to be wound twice a day.

The 1600s produced little advancement in the technology used to produce watches. Though watches during this period did not get any more accurate or dependable, they did get much more ornate. They were gilded, engraved, and jeweled to create items that were worn more as jewelry than as timekeeping devices. They were crafted with different themes, from animals to religion. These watches later came to be known as “form watches” and were only owned by the nobility and a few wealthy citizens because of their cost.

In 1675, a Dutchman by the name of Christiaan Huygens invented the spiral balance spring, which regulated the motion of a watch’s balance wheel and made small timepieces considerably more accurate. By 1690, time could be measured to an even smaller unit than the minute, allowing watchmakers to add a second hand to their watches.

The 1700s brought about another phase of development for the watch. As early as 1704, Peter and Jacob Debaufre developed a method to drill very small holes in rubies, allowing them to be used as bearings in watches. This made the watches much more accurate but also too expensive for anyone except the extremely wealthy. At the same time, watchmakers began applying a layer of enamel to their watch dials to give them a light background and make them more visible in low light. Other advancements during this time included the use of a lever designed to mesh with a toothed wheel at certain intervals. An Englishman by the name of Thomas Mudge began using this new device, called an escapement, as a regulator for the watches he produced. Abraham Louis Perrelet of Switzerland developed the first self-winding, perpetual watch around 1770, and is now known as the father of the automatic watch because of it. In France, Abraham-Louis Breguet gained the status of greatest watchmaker of all time due to his introduction of the tourbillon. The tourbillon is a design that compensates for the difference in time a watch measures when it is held in different positions. He was also influential in the initial production of the perpetual calendar on a watch face and in shock-proofing. For the next half century, America led the world in the mass production of watches, beginning with Luther Goddard opening up the first American watch manufacturing plant in Massachusetts during the early 1800s.

The first wristwatches began appearing in the early 1900s and were "discovered" in a very unusual place: the battlefield. Up until WWI, men's watches were pocket watches. But pocket watches were cumbersome to carry and difficult to use on the battlefield so soldiers began to attach them to their wrists with leather straps thus freeing up their hands during battle. The idea caught on and both the British and German services began to provide their soldiers with “wrist” watches. After the war, the soldiers were allowed to take them home. People began to see their heroes back home using them and the fad caught on overnight. Because of their wartime success, by the end of the 1920s, over half of the watches sold were now wrist watches.

This was about the same time that watches were being made more robust and the Swiss began leading the world in their production. Rolex, a well known Swiss firm, can be credited with leading the charge with many improvements including the first waterproof watch in 1926, the first self-winding watch in 1931, the first watch to display the date and the time in 1945, and the first deep-sea diving watch in 1954.

A large improvement in accuracy was achieved beginning in 1929 when quartz was introduced into watch making. Novelty watches began to appear as well. In 1933, the first wristwatch made for children, the Mickey Mouse watch, was produced. Battery operated watches debuted around 1953. These watches were so accurate and so inexpensive to produce, that the production of mechanical watches nearly ceased. Nostalgia is bringing them back so that today, mechanical watches hold a 30% share of the market.

By the mid-1900s wristwatches had alarms, self-winding became the norm, and eventually things like being waterproof, showing the date, being battery driven, and using LCD displays instead of hands were common features. Today, there is a countless variety of watches on the market. Some are very ornate and used more as jewelry, while others have so many built-in features (alarms, timers, MP3 players, radio and TV players, digital voice recorders, heart rate monitors, and Bluetooth technology) that it’s hard to tell what their primary function is meant to be. No matter how you mask the instrument, watches are still watches and are meant to do what they were always intended to do: tell time.

Even the lowly wristwatch has surpassed many of the predictions and limitations envisioned by its originators. That leads us to question – how will they change in the future? Will we eventually be carrying around entire computer screens or phones on our wrists? Questions such as these will only ever be answered by the passing of time.

Keeping Time - Calendars through the ages

“Events in our lives happen in a sequence in time, but in their significance to ourselves they find their own order, in the continuous thread of revelation.”
Eudora Welty

Calendars have been around for as long as people have recognized the movement of celestial bodies. It was the position of the moon, sun, and stars that first helped us begin recording months and years. The earliest civilizations made progress by noticing the amount of time that passed in between phases of the moon as well as changes in the placement of stars. Even today’s calendars have a largely astronomical basis; the day, month, and year are calculated based upon the rotation of the earth on its axis, the revolution of the moon around the earth, and the revolution of the earth around the sun, respectively. An exception to this general rule is the week. Reforms to any given calendar system, and there have been many, take a long time to implement. After over three centuries of slow integration, the Gregorian calendar has finally become today’s internationally accepted calendar. Isolated calendars harbored by certain cultures, such as the Hebrew calendar of Israel, the Islamic calendar, the Indian calendar, and the Chinese calendar, have not been abolished by such unifying reforms, however, and are still in use within those cultures today. Unique perceptions of time highlighted in this way add a uniqueness and relevance to an almost universally accepted concept and emphasize the fascinating importance of culture, too.

The Julian calendar of Caesar’s time hasn’t been cast aside nor forgotten either. It is still used to relate time as we record it now to time as it was recorded in the past, before the reforms Caesar made during his “year of confusion.” That is the name commonly given to 46 B.C.E., when Caesar inserted ninety extra days into the calendar in an attempt to fix the calendar system rather quickly. That would sure solve a lot of problems for a procrastinator! Calendars were a major step in the progression of man’s ability to tell time. After man figured out what part of the year it was, and eventually what month it was and then which day it was, it was only a matter of time before he was driven to find ways to measure the hour, the minute, and the second. Even though the limits of time measurement are now being pushed farther than ever before by today’s highly technological clocks, it’s interesting to note that the structure of the standard calendars is not being questioned. Considering all of the changes that brought humanity to where it is today, it is interesting to ponder how future changes will impact the standards by which we run our lives.

Calendars through the ages - Ancient calendars

Thirty days has September,
April, June, and November.
All the rest have thirty-one
Except for February
Which has twenty-eight
And on a leap year
Has twenty-nine.

Ancient mnemonic rhyme


The Beginning of the Calendar

In Rome, around the eighth century B.C., priests routinely called out the arrival of the new cycle of the moon, thereby announcing the beginning of a new month. The Romans referred to this as Kalends, from their word calare, which means to announce solemnly. From this tradition, the word calendar was born. Long before that, the Egyptians, soon followed by the Babylonians, formulated 365-day years and began the process of measuring time. These people initiated what are now collectively referred to as the ancient calendars.

Celestial Bodies

The sun was the first item used to help people create a calendar. It was useful for figuring out how many days had passed by, but it didn’t really provide clues as to which day it actually was. The Babylonians were the first to set up a system to differentiate between days based on a lunar month, or the time that passes between the appearance of two full moons. A more accurate measure of a lunar month would be twenty-nine days, twelve hours, forty-four minutes, and 2.9 seconds. A 12-month year based upon an accurately measured lunar month, however, only adds up to 354.3 days. That means that even by the third or fourth year into this system, the day marked off for the beginning of the harvest would no longer be in the right position, or over time, even in the right season, because a year is not really 354 days long. Due to this discrepancy, a day thought to be June 1st, according to the calendar, would eventually be May 1st, according to the season. To fix this, the Babylonians simply added days onto their calendar through a process known as intercalation. This process still goes on today when dealing with leap years.
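
The size of that mismatch is easy to check; here is a quick sketch using the modern value of the solar year (an assumption, since the paragraph above quotes only the lunar figures).

    SOLAR_YEAR_DAYS = 365.2422    # approximate tropical year
    LUNAR_YEAR_DAYS = 354.37      # 12 synodic months of ~29.53 days each

    drift_per_year = SOLAR_YEAR_DAYS - LUNAR_YEAR_DAYS
    print(round(drift_per_year, 1))       # ~10.9 days of drift per year
    print(round(3 * drift_per_year, 1))   # ~32.6 days -- about a month out of step after three years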

The ancient Egyptians utilized the stars instead of the moon to keep track of time. They noticed that the stars didn’t move across the sky randomly, but rather moved in a pattern. They also determined that the brightest star, now known as Sirius, could be conveniently used to keep track of this movement. Every three hundred and sixty-five days, Sirius was right at the horizon in the morning. Soon after that, the Nile always flooded, and so this was used to mark the beginning of the year. The Egyptians acknowledged the problem of Sirius showing up one day late every fourth year only in that they knew it happened and accurately predicted the timing of the floods accordingly; they never changed the calendar to reflect the extra one-fourth of a day they knew each year had.

Ancient Rome

The internationally accepted calendar of today has its origin in Rome. Originally it had ten months, known as Martius, Aprilis, Maius, Junius, Quintilis, Sextilis, September, October, November, and December. If you look at the root words of these months starting with Quintilis, you can see that they were named for their numerical order - five, six, seven, eight, nine, and ten. Of course today November isn’t the ninth month, and December isn’t the tenth. The Roman year, however, only added up to 304 days, so King Numa of Rome tried to fix the problem in 712 BC by inserting Januarius and Februarius at the beginning, right before Martius. Even after he modified the days that were allotted to each month, the calendar still wouldn’t work, as it then had 366 days.

Julius Caesar was faced with a calendar disaster when he was in power. The calendar was seriously out of pace with the seasons, and people were frustrated because traditional holidays were no longer being celebrated at the same time of year they were when Rome was first founded. In order to return the traditional holidays to their proper place within the solar year, he added 67 days to the year 46 BC and introduced a modern calendar with twelve months of 30 and 31 days. This established a system with 12 months and 365 days, with one extra day every four years, just like we have today. Thus the Julian Calendar, named after Julius, was born. That’s also not the only thing that was named after him; he changed the name of the fifth month from Quintilis to Julius as a reward for all of his efforts. When the next emperor came to power he took it upon himself to make one final adjustment. The sixth month, formerly known as Sextilis, became Augustus in honor of him, Caesar Augustus. He also stole a day from February so that his month would have 31 days like Julius’ did, making February even shorter than it already was with a final tally of only 28 days.

It seemed as though the calendar problem had finally been fixed because these reforms lasted for over 1500 years. As it turns out, however, a year isn’t exactly 365 1/4 days long. That would be 365 days and 6 hours. In reality, a year is only 365 days, 5 hours, 48 minutes and 46 seconds long. This 11-minute-per-year discrepancy added up over the centuries and finally threw things out of whack once again. By the sixteenth century, the Vernal Equinox had slipped from March 21 to March 10, so Pope Gregory XIII made yet another shift in the calendar in 1582. To drop the excessive 10 days that had accumulated over the years, he decreed that the day after October 4, 1582 would be October 15, 1582. He also changed the way leap years were to be calculated so that years ending in 00 were no longer leap years unless they were divisible by 400. And lastly, he changed the beginning of the calendar year from March 25 to January 1. Though it took centuries for his calendar to be accepted by all nations, it is now globally accepted for purposes of trade and daily business.
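
Gregory's century-year rule is compact enough to state in a few lines of code; this is a minimal sketch of it.

    def is_gregorian_leap_year(year: int) -> bool:
        # Every fourth year is a leap year, except century years,
        # which are leap years only when divisible by 400.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    print([y for y in (1600, 1700, 1800, 1900, 2000, 2004) if is_gregorian_leap_year(y)])
    # [1600, 2000, 2004] -- 1700, 1800, and 1900 are skipped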

Calendars through the ages - The Date Today

“Most modern calendars mar the sweet simplicity of our lives by reminding us that each day that passes is the anniversary of some perfectly uninteresting event.”
Oscar Wilde

The calendar we use today is called the Gregorian calendar. It is the direct result of changes made in 1582 A.D. by Pope Gregory XIII to the Julian calendar that was in use up to that time. By the time Gregory became Pope, church holidays like Easter, as well as seasonal plantings, were no longer in sync with their appropriate seasons because the Julian calendar was recording dates a full ten days ahead of the actual time of the year. Gregory’s remedy for this was to simply eliminate ten days from the calendar to catch up. He issued a papal bull to that effect stating that in 1582 the day after October 4th would be October 15th. Gregory and his advisors had determined that the Julian calendar overestimated the solar year, each year, by 11 minutes and 14 seconds, which is what accounted for the excess of ten days. His new calendar operated on the assumption that a year was 365 days, 5 hours, 49 minutes, and 12 seconds long. The 11 minutes gained each year by the Julian system resulted from assuming that a year is exactly 365 days and 6 hours in length. Not even the Gregorian calendar is perfect, however. It is off the mark by 25.96768 seconds every year, which has resulted in the pile-up of 2 hours, 59 minutes, and 12 seconds of extra time in the 414 years since it was established (Duncan). By 4909 A.D., if things continue the way they have been for the past few millennia, the Gregorian calendar will end up being a full day ahead.
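
The “2 hours, 59 minutes” figure can be verified with a couple of lines of arithmetic.

    # ~26 seconds of error per year, compounded over the 414 years from 1582.
    error_per_year_s = 25.96768
    years = 414

    total_s = round(error_per_year_s * years)
    hours, remainder = divmod(total_s, 3600)
    minutes, seconds = divmod(remainder, 60)
    print(hours, minutes, seconds)  # 2 59 11 -- matching the figure above to within a second of rounding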

The Gregorian calendar was the first calendar to really gain international acceptance. Most Catholic countries accepted it before 1584. In Protestant Germany, full recognition wasn’t achieved until 1775, though they partially acknowledged it as early as 1700. In Britain and the American colonies, it was adopted in 1752. Japan held off until 1873, and China until 1949. The Eastern Orthodox Church voted, as recently as 1971, to continue using the Julian calendar and to ignore the Gregorian one, which is a practice still followed to this day.

Just because these countries have accepted the Gregorian calendar, that does not mean the system they were using beforehand vanished. Many different cultures still utilize their own calendars within their own national borders. The Hebrew Calendar, which is still the official calendar of Israel, is an example of this. It is a lunisolar calendar that consists of months of either 29 or 30 days, depending on when the new moon, or Molad, occurs. The names of the Jewish months are Tishri, Heshvan, Kislev, Tevet, Shevat, Adar, Nisan, Iyar, Sivan, Tammuz, Av, and Elul. The only day of the week that has a name is the Sabbath, which begins at sunset on Friday and ends at sunset on Saturday. All other days of the week are merely labeled with numbers. In an ordinary, complete year, the Hebrew calendar consists of 355 days. In a leap year it has 385 days, and thirteen months instead of twelve.

The Islamic calendar is purely lunar, and for religious purposes, each month begins at the sighting of the lunar crescent after the New Moon. Days begin at sunset on the previous day. The Islamic months are known as Muharram, Safar, Rabi’a I, Rabi’a II, Jumada I, Jumada II, Rajab, Sha’ban, Ramadan, Shawwal, Dhu al-Q’adah, and Dhu al-Hijjah. Muharram, Rajab, Dhu al-Q’adah and Dhu al-Hijjah are holy months, and Ramadan is a month of fasting.

Although the Indian calendar underwent changes in 1957 and is now in line with the Gregorian calendar as far as leap years are concerned, it still has its own months that originate from the Saka Era, a traditional epoch. The first Indian month, Caitra, begins on March 22nd, followed by a succession of eleven 30- or 31-day months. These are Vaisakha, Jyaistha, Asadha, Sravana, Bhadra, Asvina, Kartika, Agrahayana, Pausa, Magha, and Phalguna. Many local regions in India still maintain their own calendars, despite attempts at unification. When reform was first introduced, there were 30 different calendars circulating around the country, making the history of the Indian calendar one of the most complex in the world.

The Chinese calendar may be the most well known of all culturally based calendars. It is preserved and used not only throughout the traditional Chinese countryside, but also in numerous Chinese communities around the world. This system has a cycle of sixty years which is created by matching one of its ten celestial stems with one of its twelve terrestrial branches.
The ten stems are associated with five elements: stems 1 & 2 = wood; stems 3 & 4 = fire; stems 5 & 6 = earth; stems 7 & 8 = metal; stems 9 & 10 = water.
Each terrestrial branch represents a year and each year is named after an animal. The twelve terrestrial branches are zi (rat), chou (ox), yin (tiger), mao (hare), chen (dragon), si (snake), wu (horse), wei (sheep), shen (monkey), you (fowl), xu (dog), and hai (pig).

To help demonstrate how this 60-year cycle operates, let’s match stems with their appropriate branch and denote each by their corresponding numbers as follows:
Year x = (stem, branch)
Year 1 = (1, 1) = (wood, rat)
Year 2 = (2, 2) = (wood, ox)
Year 3 = (3, 3) = (fire, tiger)
. . .
Year 10 = (10, 10) = (water, fowl)
Year 11 = (1, 11) = (wood, dog) (you recycle stems when you run out of them)
Year 12 = (2,12) = (wood, pig)
Year 13 = (3, 1) = (fire, rat)
Year 14 = (4, 2) = (fire, ox)
(we continue in this manner, going through 6 cycles of stems and 5 cycles of branches, for a total of 60 years)
. . .
Year 60 = (10,12) = (water, pig)

The stems repeat six times and the branches repeat five times in a 60-year cycle. The initial year, jia-zi, of the current 60-year cycle began most recently in February of 1984. 2007 is consequently year 24 in the cycle = (4,12) = (fire, pig) and is referred to as the year of the pig.
In addition to months, this calendar also has “terms” that usually, but not always, correspond with the twelve months in the year. There are principal and secondary terms that have an average length of thirty days. The principal ones are Rain Water, Spring Equinox, Grain Rain, Grain Full, Summer Solstice, Great Heat, Limit of Heat, Autumnal Equinox, Descent of Frost, Slight Snow, Winter Solstice, and Great Cold.
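
To make the stem-and-branch arithmetic above concrete, here is a minimal sketch that takes 1984 (jia-zi) as year 1 of the current cycle, as stated above, and ignores the fact that the cycle year actually begins in February rather than on January 1.

    STEMS = ["wood", "wood", "fire", "fire", "earth", "earth",
             "metal", "metal", "water", "water"]                    # stems 1-10
    BRANCHES = ["rat", "ox", "tiger", "hare", "dragon", "snake",
                "horse", "sheep", "monkey", "fowl", "dog", "pig"]   # branches 1-12

    def cycle_position(gregorian_year: int) -> tuple[int, str, str]:
        year_in_cycle = (gregorian_year - 1984) % 60 + 1
        stem_element = STEMS[(year_in_cycle - 1) % 10]
        branch_animal = BRANCHES[(year_in_cycle - 1) % 12]
        return year_in_cycle, stem_element, branch_animal

    print(cycle_position(2007))  # (24, 'fire', 'pig') -- year 24, the year of the pig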

Even though the Gregorian calendar is widely accepted throughout the world, it is not universally accepted. Many cultures still cling to their own calendars for spiritual or historical purposes. Examples of these differences can be seen in the following list from David Duncan’s Calendar, which indicates the year that correlates to 2000 AD for those who are not living under the Gregorian calendar.

According to Christ’s actual birth, it was 1997.
According to the old Roman calendar, it was 2753.
According to the ancient Babylonian calendar, it was 2749.
According to the first Egyptian calendar, it was 6236.
According to the Jewish calendar, it was 5760.
According to the Moslem calendar, it was 1420.
According to the Coptic calendar, it was 1716.
According to the Buddhist calendar, it was 2544.
According to the current Maya great cycle, it was 5119.
According to the calendar of the French Revolution, it was 208.
According to the Chinese calendar, it was the year of the Dragon.

Calendars through the ages - Calendars of the future

"The calendar is intolerable to all wisdom, the horror of all astronomy, and a laughing-stock from a mathematician’s point of view."
Roger Bacon, 1267


365.242199 days in a year today, but how many days will there be in a year tomorrow?

There have been many changes in the way that people look at calendars since they were first introduced. It is not as easy as it might seem to align a time measurement system with the solar year and keep days, hours, and minutes perfectly in tune with the movement of the earth around the sun. Thus we can be assured that even more changes will be made to calendars in the future. Imagine living in 46 B.C. when Caesar added 90 extra days to the year, or being around when Gregory XIII erased 10 days in an attempt to remedy the accumulated drift. What happened to people’s birthdays, vacations, rent payments and paydays that fell during the erased time? Even today, it is expected that the Gregorian calendar will be inaccurate by a full day in a few millennia, though we can be assured there will be additional reform before that time as well.

The first problem arises due to the fact that there is more than one possible year to measure. The sidereal year, measured as 365.2564 days, is the amount of time it takes for the earth to revolve once around the sun relative to the fixed stars. The tropical year, which the Gregorian calendar is based on, is 365.24219 days long. Its measurement is derived from the time it takes for the earth to orbit the sun relative to the vernal equinox. This makes the sidereal year about 20 minutes longer than the tropical year, so a calendar based on it would drift out of sync with the seasons. In addition, there are 365.24237 days from one March equinox to the next, 365.24162 days from one June solstice to the next, 365.24201 days from one September equinox to the next, and 365.24274 days between two December solstices (Duncan). Since there are different years to measure and different totals for each year, it is hard to tell which one should be used, and which one is actually more accurate. The accuracy of a measurement becomes increasingly harder to determine when exactly what needs to be measured is not known.
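
The 20-minute gap follows directly from the two figures just quoted.

    sidereal_days = 365.2564
    tropical_days = 365.24219

    difference_minutes = (sidereal_days - tropical_days) * 24 * 60
    print(round(difference_minutes, 1))  # ~20.5 minutes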

In 1972, all that changed. Up until then, time was measured in terms of periodic events like the rotation of the earth around the sun or the swing of a pendulum in a clock. In 1972, the International Bureau of Weights and Measures adopted the Coordinated Universal Time (UTC) model as the official time of the world, using the atomic clock as its measuring device. An atomic clock can be used to measure a billionth of a second—roughly thirty quintillionths of a year. It bases time upon atomic-level oscillations – or movement back and forth between particles – of the metal cesium. Cesium oscillates 9,192,631,770 times per second, eliminating the need to speak in terms of 365 days. Instead we can say that the year has 290,091,200,500,000,000 oscillations. It seems unlikely that such a system will be able to catch on in the vernacular, as humans will continue to say “I’m fifteen minutes late,” even if the pulse of the globe insists that they’re actually trillions of oscillations past their scheduled arrival time.
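
The oscillations-per-year figure quoted above can be reproduced from the cesium frequency and the length of the tropical year.

    CESIUM_OSCILLATIONS_PER_SECOND = 9_192_631_770
    SECONDS_PER_TROPICAL_YEAR = 365.242199 * 86400   # ~31,556,926 seconds

    oscillations_per_year = CESIUM_OSCILLATIONS_PER_SECOND * SECONDS_PER_TROPICAL_YEAR
    print(f"{oscillations_per_year:.6e}")  # ~2.900912e+17, consistent with the figure above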

Measuring time in terms of oscillations is actually too accurate, something that the originators of the first calendars probably never dreamed could happen. It does not take into account the spontaneous quaking of the earth on its axis, which can throw seconds out of whack. As a result, leap seconds have been added nearly every year for the past three decades to compensate, though none have ever been subtracted. So it appears that the adage “time slips away from us” is still true.

Suggestions for future reform have already been made. Perhaps the most popular is the Fixed Calendar, in which there would be 13 months. Each would be exactly four weeks long, thus eliminating the current problem that the days of the week fall on different dates in different years. Not every year starts on a Sunday or Monday, pushing dates around to different days and making quarters for businesses and tax paying confusing. “The extra month, Sol, would come before July. A year day placed at the end of the year would not belong to any week or month. Every four years, a leap day would be added just before July 1st” (“Future”). The quest for accurate time measurement allowed the evolution from stargazing to atomic vibration to take place, and surely the process will continue until man has found a perfect synchronization with the world around him… if such a feat is possible.