Unix Timestamp Converter
Convert between Unix timestamps and human-readable dates.
Seconds or milliseconds — auto-detected
How It Works
A Unix timestamp is the number of seconds elapsed since the Unix epoch — midnight on January 1, 1970 UTC. It is the standard way to represent time in most programming languages, APIs, databases, and log files because it is timezone-independent, always increasing, and trivially comparable.
Many modern systems (JavaScript, Java) use milliseconds instead of seconds. This tool auto-detects the unit: inputs with 13 or more digits are treated as milliseconds, shorter inputs as seconds.
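The digit-count heuristic above can be sketched in a few lines of Python. The function name and the 13-digit threshold match the rule described here; everything else is an illustrative assumption, not this tool's actual implementation.

```python
from datetime import datetime, timezone

def parse_timestamp(raw: str) -> datetime:
    """Convert a Unix timestamp string to a UTC datetime.

    Auto-detects the unit by digit count: 13 or more digits
    are treated as milliseconds, fewer as seconds.
    """
    digits = raw.lstrip("-")          # ignore a leading sign when counting
    value = float(raw)
    if len(digits) >= 13:
        value /= 1000                 # milliseconds -> seconds
    return datetime.fromtimestamp(value, tz=timezone.utc)

# The same instant, expressed in seconds and in milliseconds:
print(parse_timestamp("1700000000"))
print(parse_timestamp("1700000000000"))
```

Both calls print the same UTC date, since the second input is just the first scaled by 1000.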
Note: 32-bit signed integers overflow at timestamp 2147483647 (January 19, 2038), known as the Year 2038 problem (Y2038). Systems that store timestamps as 32-bit integers will fail at that point. 64-bit integers (used by most modern systems) are safe for roughly 292 billion years.
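The overflow can be demonstrated directly. This sketch uses Python's ctypes to emulate a 32-bit signed counter; the wraparound shown is exactly what a C `int32_t` timestamp would do one second after the limit.

```python
import ctypes
from datetime import datetime, timezone

MAX_32BIT = 2**31 - 1  # 2147483647, the largest 32-bit signed value

# The last moment a 32-bit signed timestamp can represent:
print(datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, the counter wraps around to a large negative number,
# which a naive conversion would interpret as a date in 1901:
wrapped = ctypes.c_int32(MAX_32BIT + 1).value
print(wrapped)  # -2147483648
```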