Unix Timestamp Converter

Convert unix timestamps to human-readable dates and back.

How This Tool Works

The Unix Timestamp Converter translates between human-readable dates and Unix timestamps. A Unix timestamp is the number of seconds (or milliseconds) elapsed since the Unix epoch: January 1, 1970, 00:00:00 UTC. The format is used throughout programming, APIs, databases, and log files because it is timezone-independent, trivially comparable, and easy to do arithmetic on. Unix time treats every day as exactly 86,400 seconds (leap seconds are ignored) and always increases. JavaScript uses milliseconds (multiply or divide by 1000 to convert to or from seconds); most other systems use seconds.
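As a quick illustration of the seconds/milliseconds relationship (in Python, since the converter's own code isn't shown here):

```python
from datetime import datetime, timezone

# A Unix timestamp counts seconds since 1970-01-01T00:00:00Z.
ts = 1711843200  # seconds
print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
# 2024-03-31T00:00:00+00:00

# A JavaScript-style millisecond timestamp: divide by 1000 first.
ts_ms = 1711843200000
print(datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc).isoformat())
# 2024-03-31T00:00:00+00:00
```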

How to Use

  1. To convert a Unix timestamp to a date: enter the timestamp in field A (e.g. 1711843200) and click Run.
  2. To convert a date to Unix timestamp: enter a date string in field A (e.g. 2024-03-31) and click Run.
  3. If you get a date in 1970, your timestamp is in milliseconds — divide by 1000 first.
  4. All output is in UTC. Add your timezone offset manually for local time.
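The conversions the tool performs can be sketched in Python (the helper names are illustrative; the tool's actual implementation isn't shown):

```python
from datetime import datetime, timezone

def ts_to_date(ts: float) -> str:
    """Unix timestamp (seconds) -> ISO-8601 string in UTC."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

def date_to_ts(date: str) -> int:
    """'YYYY-MM-DD' -> Unix timestamp for midnight UTC of that day."""
    dt = datetime.strptime(date, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

print(ts_to_date(1711843200))    # 2024-03-31T00:00:00+00:00
print(date_to_ts("2024-03-31"))  # 1711843200

# Step 3's milliseconds check: a seconds timestamp stays below ~1e11
# until roughly the year 5000, so larger values are almost certainly ms.
ts = 1711843200000
if ts > 1e11:
    ts /= 1000
print(ts_to_date(ts))            # 2024-03-31T00:00:00+00:00
```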

Common Questions

Why does JavaScript use milliseconds while most systems use seconds?

JavaScript's Date object uses milliseconds since the epoch (Date.now() returns ms), while most Unix systems, APIs, and databases use seconds. Always check which unit an API expects: sending milliseconds to a seconds-based API produces dates tens of thousands of years in the future, and sending seconds to a milliseconds-based API produces dates in early 1970.
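A short Python sketch of the mixup, reusing the example value from above (a millisecond timestamp misread as seconds lands so far out that it exceeds Python's datetime range, so the year is estimated arithmetically):

```python
from datetime import datetime, timezone

ms = 1711843200000  # a Date.now()-style millisecond timestamp

# Correct handling: convert ms -> s before decoding.
print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc))
# 2024-03-31 00:00:00+00:00

# Misread as seconds, the same number is ~54,000 years past 1970,
# beyond datetime's year-9999 limit, so estimate the year directly.
SECONDS_PER_YEAR = 365.2425 * 86400  # mean Gregorian year
approx_year = 1970 + ms / SECONDS_PER_YEAR
print(round(approx_year))  # roughly 56,000
```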

What is the Year 2038 problem?

Signed 32-bit integers can store values up to 2,147,483,647. As a Unix timestamp, this corresponds to January 19, 2038, 03:14:07 UTC. Systems storing timestamps in signed 32-bit integers overflow one second later, wrapping around to December 1901. Modern systems use 64-bit integers, which won't overflow for roughly 292 billion years.
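The boundary is easy to verify (Python sketch):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647: the last representable second
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, a signed 32-bit counter wraps to -2**31,
# which decodes to a date in December 1901.
print(datetime.fromtimestamp(-2**31, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```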

What does timestamp 0 represent?

Unix timestamp 0 is exactly January 1, 1970, 00:00:00 UTC — the Unix epoch. Negative timestamps represent dates before 1970.
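In Python, for example:

```python
from datetime import datetime, timezone

# Timestamp 0 is the Unix epoch itself.
print(datetime.fromtimestamp(0, tz=timezone.utc))
# 1970-01-01 00:00:00+00:00

# Negative timestamps count backward from the epoch.
print(datetime.fromtimestamp(-86400, tz=timezone.utc))  # one day earlier
# 1969-12-31 00:00:00+00:00
```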