Timekit

Developer tools for timestamps and dates

Unix Timestamp: Seconds vs Milliseconds

Quick Reference

Seconds (10 digits)

1704067200

Standard Unix timestamp: the number of seconds elapsed since 00:00:00 UTC on January 1, 1970 (the Unix epoch).

Milliseconds (13 digits)

1704067200000

JavaScript/Java style: the number of milliseconds elapsed since the same epoch.

How to Identify the Format

The easiest way to identify a timestamp's format is to count its digits:

  • 10 digits = seconds (values from ~1,000,000,000 to ~2,000,000,000)
  • 13 digits = milliseconds (values from ~1,000,000,000,000 to ~2,000,000,000,000)

Timestamps for the years 2020-2030 run from about 1.58 billion seconds (1,577,836,800 for 2020-01-01 UTC) to about 1.89 billion (1,893,456,000 for 2030-01-01 UTC). In milliseconds, the same range is roughly 1.58 trillion to 1.89 trillion.
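The digit-count rule above can be sketched as a small helper. This is an illustrative heuristic, not a library API; the name detectUnit is our own, and the magnitude thresholds are only valid for dates between roughly 2001 and 2286.

```javascript
// Heuristic: classify a Unix timestamp by magnitude.
// >= 1e12 means 13+ digits (milliseconds); >= 1e9 means 10 digits (seconds).
// Values below 1e9 are ambiguous (pre-2001 in seconds, or early 1970 in ms).
function detectUnit(ts) {
  if (ts >= 1e12) return "milliseconds";
  if (ts >= 1e9) return "seconds";
  return "unknown";
}

console.log(detectUnit(1704067200));    // "seconds"
console.log(detectUnit(1704067200000)); // "milliseconds"
```

Note the gap: a 10-digit value could in principle be a millisecond timestamp from early 1970, so this check assumes you are working with contemporary dates.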

Common Mistakes

Treating milliseconds as seconds

If you interpret 1704067200000 as seconds, you get a date around the year 55,970, tens of thousands of years in the future. Always check the digit count!

Treating seconds as milliseconds

If you interpret 1704067200 as milliseconds, you get January 20, 1970, just under 20 days after the epoch.
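Both mistakes are easy to reproduce with JavaScript's Date constructor, which expects milliseconds. Feeding it a seconds value silently produces a date in early 1970:

```javascript
// new Date(n) interprets n as MILLISECONDS since the epoch.
const correct = new Date(1704067200000);  // milliseconds, as intended
const tooEarly = new Date(1704067200);    // seconds fed as milliseconds

console.log(correct.toISOString());   // "2024-01-01T00:00:00.000Z"
console.log(tooEarly.toISOString());  // "1970-01-20T17:21:07.200Z"
```

No error is thrown in either case, which is why this bug often surfaces only when someone notices dates clustered around January 1970 in the UI or database.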

Converting Between Formats

Seconds to Milliseconds

milliseconds = seconds * 1000

Milliseconds to Seconds

seconds = Math.floor(milliseconds / 1000)
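The two formulas above can be wrapped as one-liners. The function names are our own choice; the key detail is that the millisecond-to-second direction must discard sub-second precision, here via Math.floor as in the formula above:

```javascript
// Seconds -> milliseconds: exact, just a scale factor.
const secondsToMillis = (s) => s * 1000;

// Milliseconds -> seconds: lossy; sub-second precision is dropped.
const millisToSeconds = (ms) => Math.floor(ms / 1000);

console.log(secondsToMillis(1704067200));    // 1704067200000
console.log(millisToSeconds(1704067200123)); // 1704067200
```

One edge case worth knowing: for negative (pre-1970) timestamps, Math.floor rounds toward negative infinity while Math.trunc rounds toward zero, so the two give different results there.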

Language Defaults

Language/Platform    Default Unit    Example
JavaScript           Milliseconds    Date.now()
Python               Seconds         time.time()
Java                 Milliseconds    System.currentTimeMillis()
Unix shell           Seconds         date +%s
PostgreSQL           Seconds         EXTRACT(EPOCH FROM ...)
Go                   Seconds         time.Now().Unix()
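In JavaScript, for example, the default is milliseconds, so a seconds value has to be derived explicitly, as a quick sketch shows:

```javascript
// Date.now() returns milliseconds (a 13-digit number for current dates).
const nowMs = Date.now();
const nowS = Math.floor(nowMs / 1000); // 10-digit seconds

console.log(nowMs, nowS);
```

When exchanging timestamps between systems from this table (e.g. a JavaScript frontend and a Python backend), agree on one unit at the API boundary rather than guessing per field.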

Unix timestamps can be represented in seconds or milliseconds, which often leads to confusion and bugs. Misinterpreting the unit can result in incorrect dates far in the future or past.

This page explains the difference between second-based and millisecond-based timestamps, how to identify each format, and how to convert between them correctly.

Use this guide when debugging timestamp issues, validating API responses, or working with systems that use different timestamp precisions.