Convert Unix timestamps to human-readable dates and times, or convert any date and time to a Unix timestamp. Supports UTC, local time, and ISO 8601 formats.
Unix Timestamp:
A Unix timestamp counts seconds elapsed since January 1, 1970 00:00:00 UTC (Unix Epoch).
Millisecond timestamps (13 digits) are automatically detected.
A Unix timestamp (also called POSIX time or Epoch time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC. It is a common way to represent time in computing because it is a single integer independent of time zones.
Standard Unix timestamps are in seconds (10 digits for dates after 2001). JavaScript and many modern systems use millisecond timestamps (13 digits), which are 1000 times larger. This converter detects 13-digit values automatically.
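The seconds-versus-milliseconds detection described above can be sketched in a few lines of Python. The function name `to_datetime` and the 13-digit heuristic are illustrative, not the converter's actual code:

```python
from datetime import datetime, timezone

def to_datetime(ts: int) -> datetime:
    """Convert a Unix timestamp to an aware UTC datetime.

    Heuristic: 13-digit values are treated as milliseconds,
    anything else as seconds.
    """
    if len(str(abs(ts))) == 13:
        ts = ts / 1000  # scale milliseconds down to seconds
    return datetime.fromtimestamp(ts, tz=timezone.utc)

print(to_datetime(1710513000))     # a seconds timestamp
print(to_datetime(1710513000000))  # the same instant in milliseconds
```

Both calls resolve to the same UTC instant, 2024-03-15 14:30:00.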
ISO 8601 is an international standard for representing dates and times. A typical ISO 8601 string looks like 2024-03-15T14:30:00.000Z, where Z indicates UTC. It is unambiguous and widely used in APIs and data interchange.
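A Unix timestamp can be rendered as the ISO 8601 string shown above with Python's standard library; this sketch pads milliseconds to three digits and appends the literal Z for UTC:

```python
from datetime import datetime, timezone

dt = datetime.fromtimestamp(1710513000, tz=timezone.utc)
# strftime handles the date and time; milliseconds and the UTC
# designator "Z" are appended manually
iso = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"
print(iso)  # 2024-03-15T14:30:00.000Z
```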
The maximum 32-bit signed Unix timestamp is 2,147,483,647, which corresponds to January 19, 2038. This is known as the Year 2038 problem. Most modern systems use 64-bit integers, which extend the range billions of years into the future.
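The 32-bit rollover date is easy to verify directly: convert the maximum signed 32-bit value and read off the moment the Year 2038 problem begins.

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

# The last instant representable as a 32-bit signed Unix timestamp
overflow = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(overflow.isoformat())  # 2038-01-19T03:14:07+00:00
```

One second later, a 32-bit signed counter wraps around to a negative value.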
Unix time counts every day as exactly 86,400 seconds, ignoring the occasional leap second inserted to keep UTC in sync with Earth's rotation. This simplification makes date arithmetic easy, but it means Unix time briefly diverges from UTC around a leap second: the inserted second has no unique timestamp, so Unix time is ambiguous at that moment before realigning with UTC.
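The fixed 86,400-second day is what makes the arithmetic easy: whole days and time of day fall out of a single division, with no calendar tables involved.

```python
SECONDS_PER_DAY = 86_400

ts = 1710513000  # 2024-03-15 14:30:00 UTC
# Split the timestamp into complete days since the epoch
# and the remaining seconds within the current day
days_since_epoch, seconds_into_day = divmod(ts, SECONDS_PER_DAY)
print(days_since_epoch)   # 19797
print(seconds_into_day)   # 52200  (14 h 30 min)
```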
JavaScript: Math.floor(Date.now() / 1000)
Python: import time; int(time.time())
PHP: time()
MySQL: UNIX_TIMESTAMP()
Go: time.Now().Unix()
All return seconds since the Unix Epoch.