
Why does Unix time start at 1970-01-01? Why not 1971-01-01 or any other date?

muru
Templar

4 Answers


I wouldn't have known the answer, except Google was there for me:

From Here (needs free subscription):

Linux is following the tradition set by Unix of counting time in seconds since its official "birthday," -- called "epoch" in computing terms -- which is Jan. 1, 1970.

A more complete explanation can be found in this Wired News article. It explains that the early Unix engineers picked that date arbitrarily, because they needed to set a uniform date for the start of time, and New Year's Day, 1970, seemed most convenient.

slm
Hanan

Unix wasn't born in 1970.

From Wired - Unix Tick Tocks to a Billion:

The Unix epoch is midnight on January 1, 1970. It's important to remember that this isn't Unix's "birthday" -- rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early '70s only because it was convenient to do so, according to Dennis Ritchie, one of the engineers who worked on Unix at Bell Labs at its inception.

Greenonline
Danny A

  • Convenient back then, inconvenient for developers the world over ever since. – Chris Halcrow May 10 '18 at 06:13
  • @ChrisHalcrow: what would you have chosen as time 0 if you were dmr? And how is the choice inconvenient for developers? The inconvenience is because measuring time in "human" terms (years, months, days, hours/minutes/seconds, time zones, daylight time) is complicated, not because some (arbitrary) $t=0$ instant was chosen. – NickD Sep 06 '19 at 13:37
  • @NickD, good prompt for an explanation and good point! I would choose 00:00:00 of CE 0 though, as I'm confident that would make things a little easier to calculate. Please explain what 'dmr' is? Also, ironically, the fact that the OP requires an explanation of why this date was chosen shows that it is inherently confusing for someone to understand the usage of 01/01/70 as a reference date! – Chris Halcrow Sep 10 '19 at 00:25
  • @ChrisHalcrow: dmr = Dennis Ritchie. Did you calculate the number of seconds from your chosen origin to today? How many bits does it require? The PDP-11 had 16-bit registers and words, but it allowed you to group together two registers and two words to make 32-bit registers and double-words for some operations. That gives you +/- 68 years from your 0 time (or +136 years if your time was unsigned - but dmr chose signed). His choice may be a bit mystifying the first time you see it, but it's a pretty obvious decision given the above... – NickD Sep 10 '19 at 01:29
  • @NickD - great explanation! This should be part of the accepted answer - why not move your comment there, and we can delete ours from here? – Chris Halcrow Sep 10 '19 at 04:08
  • @NickD: I would have chosen a day which immediately follows February 29, so that date calculations would only need to special-case dates which were congruent (mod 1461) to either zero or 1460 and could otherwise ignore leap years. – supercat Jul 31 '23 at 21:46

I like the question :-)

Let me attempt to answer it (source: the internet, of course).

Unix Time is represented by a 32-bit whole number (an integer) that can be positive or negative (signed). Unix was originally developed in the 60s and 70s, so the "start" of Unix Time was set to January 1st 1970 at midnight GMT (Greenwich Mean Time) - this date/time was assigned the Unix Time value of 0. This is what is known as the Unix Epoch.

A 32 bit signed integer can represent whole numbers between -2147483648 and 2147483647. Since Unix Time starts at 0, negative Unix Time values go back in time from the Epoch and positive numbers go forward in time. This means that Unix Time spans from Unix Time value of -2147483648 or 20:45:52 GMT on December 13th 1901 to Unix Time value of 2147483647 or 3:14:07 GMT on January 19 in 2038. These dates represent the beginning, the pre-history and the end of Unix Time.
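These boundary values can be checked with a short Python sketch (an illustrative helper, not production code), using plain timedelta arithmetic from the epoch so the result does not depend on the platform's own time_t limits:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def unix_to_utc(t):
    """Map a Unix time value to a UTC datetime by pure arithmetic."""
    return EPOCH + timedelta(seconds=t)

INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

print(unix_to_utc(INT32_MIN))  # 1901-12-13 20:45:52+00:00
print(unix_to_utc(INT32_MAX))  # 2038-01-19 03:14:07+00:00
```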

The end of Unix Time will occur on January 19, 2038 03:14:07 GMT. On January 19, 2038 03:14:08 GMT all computers that still use 32 bit Unix Time will overflow. This is known as the "Year 2038 problem". Some believe this will be a more significant problem than the "Year 2000 problem". The fix for the Year 2038 problem is to store Unix Time in a 64 bit integer. This is already underway in most 64 bit Operating Systems but many systems may not be updated by 2038.
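The wraparound itself can be illustrated by forcing Python's arbitrary-precision integers down to 32 bits (a hypothetical `as_int32` helper for demonstration; real systems overflow in hardware):

```python
def as_int32(n):
    """Reinterpret an integer as a signed 32-bit two's-complement value."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

last_tick = 2**31 - 1            # 2038-01-19 03:14:07 GMT
print(as_int32(last_tick + 1))   # -2147483648, i.e. back to December 1901
```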

  • Only one paragraph of this actually addresses the question, and it's somewhat inaccurate (the epoch was originally in 1971; it was moved later) – Michael Mrozek Dec 06 '11 at 20:33
  • Also see http://en.wikipedia.org/wiki/Unix_time – Nikhil Mulley Dec 06 '11 at 20:34
  • That's right, Michael. From Wikipedia: The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer's Manual dated 3 November 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second". – Nikhil Mulley Dec 06 '11 at 20:35
  • @Nikhil I still don't get why 1970 - only because Unix was developed around that time? Why not 1960? Or a different month, a different day? – Templar Dec 06 '11 at 20:41
  • @Nikhil or does it not really matter? The first day of the first month just looks better, and since it was made in 1971, 1970 would look better too? – Templar Dec 06 '11 at 20:44
  • @Templar, I guess the authority/consensus led to the opinion from Dennis that he would like to have dates/years recorded in the computer ranging over his lifetime (his birthday is in Sep 1941). He probably wanted his 100 years of life to be coverable in recorded Unix time, and it could have been from 01/01/1941 if it had to be, but then it would not cover Sep 2041. So, I guess his take was to go for a signed 32-bit integer that could as well record times prior to 1971. – Nikhil Mulley Dec 06 '11 at 20:50
  • @Templar, there were technical limitations with the clock frequency incrementing at 60 Hz. The User Manual also commented that "the chronologically-minded user will note that 2³² sixtieths of a second is only about 2.5 years". Because of this limited range, the epoch was redefined more than once, before the rate was changed to 1 Hz and the epoch was set to its present value. This yielded a range in excess of 130 years, though with more than half the range in the past (using a signed data type). – Nikhil Mulley Dec 06 '11 at 20:52
  • I am sure this is understood in bits and pieces and needs some reconstruction to make it understandable as one flow. – Nikhil Mulley Dec 06 '11 at 20:55
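As a sanity check on the comments above, a 60 Hz tick rate really does exhaust a 32-bit counter in a little over two years - the third edition manual's "crisis every 2.26 years" is exactly this calculation:

```python
TICKS_PER_SECOND = 60
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year

# How long until a 32-bit tick counter wraps at 60 Hz?
wrap_years = 2**32 / TICKS_PER_SECOND / SECONDS_PER_YEAR
print(f"{wrap_years:.2f} years per wrap")  # about 2.27
```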

Quoting my answer from SE.Retrocomputing, Why is the Unix epoch January 1st 1970?:


This comment from JdeBP piqued my interest:

Psst! Dennis Ritchie is on the record about this, to Poul-Henning Kamp, Warren Toomey, and Wired. Warner Losh has also reported on this... Find out what dmr actually told people about this.

Therefore, here is Dennis Ritchie's comment about this, as well as a brief explanation of the overflow that he mentions.

From Wired - Unix Tick Tocks to a Billion,

The Unix epoch is midnight on January 1, 1970. It's important to remember that this isn't Unix's "birthday" -- rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early 70s only because it was convenient to do so, according to Dennis Ritchie, one [of] the engineers who worked on Unix at Bell Labs at its inception.

"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any."

There are approximately 32 million seconds in a year, which means that it takes about 31 years for a billion seconds to pass. Apparently, earlier this year, some mathematically inclined provocateurs discovered that the year 2001 marked 31 years since 1970, and some of them assumed that this might represent an "overflow" -- the date buffer filling with digits, causing the computer to go wacky.
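The arithmetic in that paragraph is easy to verify: a billion seconds is about 31.7 years, which lands the billionth Unix second in September 2001. A quick check using plain epoch arithmetic:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

billionth = EPOCH + timedelta(seconds=10**9)
print(billionth)                 # 2001-09-09 01:46:40+00:00
print(10**9 / (365.25 * 86400))  # roughly 31.7 years
```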

In addition, and with a bit more historical detail, Warner Losh stated in an email, Re: [TUHS] The 2038 bug..., on 4 Jan 2021:

My understanding is that it's been 1st Jan 1970 since at least Ed5, if not Ed6.

It's been that way since the 4th edition.

In the 3rd edition it was the number of 60Hz ticks since 1972, along with this note: "This guarantees a crisis every 2.26 years."

Rebasing the epoch would be... tricky... lots of math is done assuming an origin of 1970, and not all of it is obvious to even concerted analysis.

Less ugly would be to declare time_t to be unsigned instead of signed... It would break less code... Making time_t 64 bits also breaks code, even if you declare you don't care about binary compat since many older apps know time_t is 32-bits.

Warner
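For scale on Warner's unsigned suggestion: reinterpreting the same 32 bits as unsigned gives up the pre-1970 half of the range but pushes the wrap out to early 2106. A quick check with epoch arithmetic:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
UINT32_MAX = 2**32 - 1  # largest value an unsigned 32-bit time_t could hold

print(EPOCH + timedelta(seconds=UINT32_MAX))  # 2106-02-07 06:28:15+00:00
```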

Notable dates

  • V1 released November 1971
  • V2 released June 1972
  • V3 released February 1973
  • V4 released November 1973
  • V5 released June 1974
  • V6 released May 1975
  • V7 released January 1979
  • V8 released 1985
  • ...
Greenonline