How To Work With Unix Timestamps
The Year 2038 problem
A Unix timestamp counts seconds elapsed since 1970-01-01 00:00:00 UTC, the Unix epoch. On January 19, 2038 at 03:14:07 UTC, that count reaches 2,147,483,647, the maximum value of a signed 32-bit integer. One second later, a 32-bit signed timestamp overflows to −2,147,483,648, which represents December 13, 1901. Any system still storing time in a signed 32-bit integer will suddenly read the date as 1901.
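The overflow can be simulated directly. A minimal sketch in Python, faking the two's-complement wraparound by hand since Python integers don't overflow:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

# The last second representable in signed 32-bit Unix time:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, two's-complement arithmetic wraps to the most
# negative 32-bit value (simulated here, since Python ints are unbounded):
wrapped = (INT32_MAX + 1) - 2**32  # -2,147,483,648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```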
Signed vs unsigned
A signed 32-bit integer can represent dates from 1901-12-13 to 2038-01-19. An unsigned 32-bit integer can represent 1970-01-01 to 2106-02-07 but cannot represent any pre-1970 date. Most languages default to signed, which is why you see 2038 mentioned more than 2106. Some older systems (certain C APIs, some databases) use unsigned — meaning a historical date like a birth date in 1955 simply cannot be stored.
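The four boundary values above are easy to check yourself. A quick sketch using only the standard library:

```python
from datetime import datetime, timezone

def ts_utc(ts: int) -> datetime:
    """Interpret a Unix timestamp as a UTC datetime."""
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# Signed 32-bit range:
print(ts_utc(-(2**31)))    # 1901-12-13 20:45:52+00:00
print(ts_utc(2**31 - 1))   # 2038-01-19 03:14:07+00:00

# Unsigned 32-bit range:
print(ts_utc(0))           # 1970-01-01 00:00:00+00:00
print(ts_utc(2**32 - 1))   # 2106-02-07 06:28:15+00:00

# A pre-1970 date needs a NEGATIVE timestamp, which unsigned cannot hold:
birth = datetime(1955, 6, 1, tzinfo=timezone.utc)
print(birth.timestamp())   # negative number
```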
Timezones and offsets
To display a Unix timestamp to a human, apply a timezone offset. UTC has offset +00:00. New York is −05:00 in winter and −04:00 in summer (daylight saving); Tokyo is +09:00 year-round. The stored timestamp itself is always in UTC; the offset is applied only when converting to local time for display.
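A minimal sketch of that conversion using Python's standard `zoneinfo` module (3.9+); the timestamp value here is an arbitrary example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+; needs the tzdata package on Windows

ts = 1_700_000_000  # arbitrary example: 2023-11-14 22:13:20 UTC

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
ny = utc.astimezone(ZoneInfo("America/New_York"))   # November = winter, so -05:00
tokyo = utc.astimezone(ZoneInfo("Asia/Tokyo"))      # always +09:00

print(utc)    # 2023-11-14 22:13:20+00:00
print(ny)     # 2023-11-14 17:13:20-05:00
print(tokyo)  # 2023-11-15 07:13:20+09:00  (already the next day)
```

Note that named zones like `America/New_York` are preferable to raw offsets, because the zone database knows when daylight saving applies.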
Leap seconds
Between 1972 and 2024, 27 leap seconds were added to UTC, the most recent at the end of 2016. The General Conference on Weights and Measures voted in 2022 to effectively abandon leap seconds by 2035.
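Unix time itself simply ignores leap seconds: the inserted second 2016-12-31 23:59:60 UTC has no timestamp of its own, so the instants on either side of it are only one Unix second apart. A small demonstration:

```python
from datetime import datetime, timezone

# A leap second (23:59:60) was inserted at the end of 2016-12-31,
# but Unix time pretends it never happened: these two instants
# differ by exactly one Unix second, not two.
before = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
after = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)

print(after.timestamp() - before.timestamp())  # 1.0
```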
Excel's 1900 leap year bug
Excel’s date serial has a famous bug: it incorrectly treats 1900 as a leap year, so dates before March 1900 are off by one. This was preserved for Lotus 1-2-3 compatibility and can never be fixed.
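Any code converting Excel serials must work around the phantom day. A hypothetical converter (the function name and the two-epoch trick are this sketch's own, not an Excel API) for the 1900 date system:

```python
from datetime import date, timedelta

def excel_serial_to_date(serial: int) -> date:
    """Convert an Excel 1900-system date serial to a real date,
    compensating for the phantom 1900-02-29 (serial 60)."""
    if serial == 60:
        raise ValueError("serial 60 is the nonexistent 1900-02-29")
    # Before the phantom day, serial 1 = 1900-01-01; after it,
    # every serial is inflated by one, so shift the epoch back a day.
    epoch = date(1899, 12, 31) if serial < 60 else date(1899, 12, 30)
    return epoch + timedelta(days=serial)

print(excel_serial_to_date(59))  # 1900-02-28
print(excel_serial_to_date(61))  # 1900-03-01, one day after Feb 28
```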