Convert any Unix timestamp to a readable date and time across every time zone. Originated at Bell Labs in New Jersey — the birthplace of Unix.
Unix time is the number of seconds elapsed since 00:00:00 UTC on January 1, 1970. It was created at Bell Labs in Murray Hill, New Jersey — the same lab that gave the world the transistor, C language, and Unix itself.
Today Unix timestamps power everything from POSIX-compliant operating systems to REST APIs and real-time financial feeds on Wall Street.
Other timestamp units are powers-of-ten multiples of the base second value: milliseconds = seconds × 10³, microseconds = seconds × 10⁶, nanoseconds = seconds × 10⁹. Converting to local time shifts the UTC value by the zone offset: local time = UTC + Δ_UTC.
Here Δ_UTC is the offset in hours (e.g. −5 for EST, −8 for PST). DST entries show fixed offsets — actual transitions depend on region and year.
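A minimal Python sketch of both conversions — the example timestamp and the fixed −5 h offset are arbitrary choices for illustration:

```python
from datetime import datetime, timezone, timedelta

ts_seconds = 1_700_000_000  # arbitrary example timestamp

# Other units are powers-of-ten multiples of the base second value.
ts_millis = ts_seconds * 10**3
ts_micros = ts_seconds * 10**6
ts_nanos = ts_seconds * 10**9

# local time = UTC + delta_utc, e.g. delta_utc = -5 hours for EST.
delta_utc = timedelta(hours=-5)
utc_dt = datetime.fromtimestamp(ts_seconds, tz=timezone.utc)
est_dt = utc_dt.astimezone(timezone(delta_utc))
```

Note that a fixed offset ignores DST; production code should use a full timezone database (e.g. the standard-library `zoneinfo` module) for named zones.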
Unix time was conceived around 1969 at Bell Labs, NJ. The epoch (timestamp 0) = 1970-01-01 00:00:00 UTC = 1969-12-31 07:00 PM EST — it was still New Year's Eve in New York.
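The epoch arithmetic above can be checked directly in Python, using a fixed −5 h offset to stand in for EST:

```python
from datetime import datetime, timezone, timedelta

# Timestamp 0 is the epoch itself: 1970-01-01 00:00:00 UTC.
epoch_utc = datetime.fromtimestamp(0, tz=timezone.utc)

# Shift to EST (fixed UTC-5, no DST): 7:00 PM on 1969-12-31.
est = timezone(timedelta(hours=-5))
epoch_est = epoch_utc.astimezone(est)
```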
Embedded systems in US power grids, water treatment plants, and older SCADA systems that store time as 32-bit integers will overflow on 2038-01-19 at 03:14:07 UTC (10:14 PM EST the night before).
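The exact overflow instant follows from the 32-bit signed integer limit, and can be verified in a few lines of Python:

```python
from datetime import datetime, timezone

# Largest value a 32-bit signed integer can hold: 2,147,483,647.
INT32_MAX = 2**31 - 1

# The last second representable in a 32-bit time_t.
overflow_utc = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
# 2038-01-19 03:14:07 UTC; one second later, a 32-bit counter wraps.
```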
US financial regulations (FINRA, SEC Rule 613) require equity trade timestamps accurate to at least 1 millisecond. High-frequency trading firms in New York use nanosecond-precision Unix clocks.
Most US-based API platforms (GitHub, Stripe, Twilio) return rate-limit reset times as Unix timestamps in the X-RateLimit-Reset HTTP header — always in seconds.
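A sketch of how a client might turn that header into a wait time — the header value here is a made-up example, and real responses set it per request:

```python
from datetime import datetime, timezone

# Hypothetical response headers; real APIs fill this in per response.
headers = {"X-RateLimit-Reset": "1700000000"}

# The header value is always epoch seconds, never milliseconds.
reset_ts = int(headers["X-RateLimit-Reset"])
reset_at = datetime.fromtimestamp(reset_ts, tz=timezone.utc)

# Seconds remaining until the rate-limit window resets.
seconds_left = reset_ts - int(datetime.now(tz=timezone.utc).timestamp())
```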
Type a Unix timestamp into the input field, or click "Use current time" to load the current epoch second.
Select the unit that matches your timestamp: seconds, milliseconds, microseconds, or nanoseconds.
The reference time box shows your timestamp expressed in all four units simultaneously.
Scroll through the timezone groups to find any US time zone (EST, CST, MST, PST, AKST, HST) and read the local date and time.
Copy any value by selecting the text in the table cell.
Bell Labs engineers chose a round date that was recent during development (1969–71). The midnight UTC boundary made integer arithmetic simple.
32-bit signed integers overflow at 2,147,483,647 — corresponding to 2038-01-19 03:14:07 UTC. All modern 64-bit systems extend the range by hundreds of billions of years.
Use new Date(timestamp * 1000) for seconds, or new Date(timestamp) for milliseconds. Date.now() always returns milliseconds.
No. Unix time treats every day as exactly 86,400 seconds and ignores leap seconds entirely. Implementations typically absorb a leap second by repeating or smearing it, so during one Unix time can disagree with true UTC by up to a second.
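This is easy to demonstrate: even across 2016-12-31, which ended with a real leap second, consecutive UTC midnights are exactly 86,400 Unix seconds apart:

```python
from datetime import datetime, timezone

# Unix timestamps of two consecutive UTC midnights spanning the
# 2016-12-31 leap second.
d1 = datetime(2016, 12, 31, tzinfo=timezone.utc).timestamp()
d2 = datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp()

# Unix time pretends the leap second never happened.
assert d2 - d1 == 86_400
```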
Use time.time() for a float, int(time.time()) for an integer, or datetime_obj.timestamp() for a datetime object.
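The three Python approaches side by side, as a runnable sketch — all should agree to within clock resolution:

```python
import time
from datetime import datetime, timezone

now_float = time.time()        # float seconds since the epoch
now_int = int(time.time())     # truncated to whole seconds

dt = datetime.now(tz=timezone.utc)
now_from_dt = dt.timestamp()   # any aware datetime -> Unix seconds
```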
Practically the same offset. GMT is the historical meridian-based zone; UTC is the atomic-clock standard used in computing.