Timestamp Converter

Convert Unix timestamps (seconds or milliseconds since the Unix Epoch) to human-readable date-time strings, and convert dates back to Unix timestamps. Supports UTC and local time zones.


Unix time started counting at January 1, 1970 00:00:00 UTC (the Unix Epoch). Every Unix timestamp is the number of seconds (or milliseconds in many modern systems) elapsed since that moment. Timestamps are used throughout programming, databases, and APIs because they are time-zone-independent, sortable as integers, and trivial to compare arithmetically. Reading 1711843200 as a raw number tells you nothing; this converter translates it to 2024-03-31T00:00:00.000Z instantly, using ISO 8601, the international date standard. All conversions run locally in your browser.
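The core conversion is a one-liner in most languages. A JavaScript sketch of what the tool does under the hood:

```javascript
// Convert a Unix timestamp in seconds to an ISO 8601 UTC string.
// JavaScript's Date works in milliseconds, so multiply by 1000 first.
const seconds = 1711843200;
const iso = new Date(seconds * 1000).toISOString();
console.log(iso); // "2024-03-31T00:00:00.000Z"
```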

How to Use This Tool

  1. Timestamp to Date: Enter a Unix timestamp in seconds or milliseconds; the tool shows the date in UTC and your local time zone.
  2. Date to Timestamp: Enter a date and time using the date picker, then copy the resulting Unix timestamp in seconds and milliseconds.
  3. Click Now to fill in the current timestamp for reference.
  4. Use the format selector to display dates in ISO 8601, RFC 2822, or human-readable format.

Understanding Unix Timestamps

Seconds vs Milliseconds

Two conventions are in widespread use. Unix time in seconds (10 digits as of 2024, e.g., 1711843200) is used by POSIX APIs, C/C++ time(), Python's time.time(), and most server-side systems. Unix time in milliseconds (13 digits, e.g., 1711843200000) is used by JavaScript's Date.now(), Java's System.currentTimeMillis(), and many web APIs. Quick rule: 10 digits means seconds; 13 digits means milliseconds. The largest valid 32-bit signed integer timestamp is 2147483647 (seconds), corresponding to January 19, 2038: the Year 2038 problem for legacy systems.
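The digit-count rule can be turned into a small normalizer. A sketch that assumes any value below 10^11 is in seconds (an assumption that holds for dates up to roughly the year 5100):

```javascript
// Normalize an ambiguous Unix timestamp to milliseconds using the
// digit-count heuristic: ~10 digits means seconds, ~13 means milliseconds.
function toMillis(ts) {
  // 1e11 seconds is roughly the year 5138, so any smaller value
  // is treated as a seconds-precision timestamp.
  return ts < 1e11 ? ts * 1000 : ts;
}

console.log(toMillis(1711843200));    // 1711843200000 (input was seconds)
console.log(toMillis(1711843200000)); // 1711843200000 (already milliseconds)
```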

ISO 8601 Format

ISO 8601 is the international standard for date and time representations. The full UTC format is YYYY-MM-DDTHH:mm:ss.sssZ, where T separates date and time and Z denotes UTC (Zulu time). ISO 8601 is the recommended format for API responses and data interchange because it is unambiguous, lexicographically sortable (string order matches chronological order), and parseable in every major programming language.
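The sortability claim is easy to verify: sorting ISO 8601 UTC strings as plain text yields chronological order, because the fields run from most to least significant (year first). A quick sketch:

```javascript
// Lexicographic sort of ISO 8601 UTC strings equals chronological sort.
const stamps = [
  "2024-03-31T00:00:00.000Z",
  "2023-12-01T08:30:00.000Z",
  "2024-01-15T23:59:59.000Z",
];
console.log([...stamps].sort());
// ["2023-12-01T08:30:00.000Z", "2024-01-15T23:59:59.000Z", "2024-03-31T00:00:00.000Z"]
```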

Example

Timestamp to Date

Input (seconds):       1711843200
UTC date (ISO 8601):   2024-03-31T00:00:00.000Z
RFC 2822:              Sun, 31 Mar 2024 00:00:00 +0000
Human-readable (UTC):  Sunday, March 31, 2024 at 12:00:00 AM UTC

Date to Timestamp

Input:                 2024-03-31 00:00:00 UTC
Unix (seconds):        1711843200
Unix (milliseconds):   1711843200000
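Both directions of the example above can be reproduced with the built-in Date API:

```javascript
// Date to timestamp: parse an ISO 8601 UTC string, or build from fields.
const secondsFromIso = Date.parse("2024-03-31T00:00:00Z") / 1000;
console.log(secondsFromIso); // 1711843200

// Date.UTC takes a 0-based month (2 = March) and returns milliseconds.
const millis = Date.UTC(2024, 2, 31, 0, 0, 0);
console.log(millis); // 1711843200000

// Timestamp to date: back to the ISO 8601 string.
console.log(new Date(millis).toISOString()); // "2024-03-31T00:00:00.000Z"
```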

Common Timestamp Formats

  • Unix seconds (POSIX): 1711843200 (10 digits; used in C, Python, Go, most system APIs)
  • Unix milliseconds: 1711843200000 (13 digits; used in JavaScript, Java, many REST APIs)
  • ISO 8601 UTC: 2024-03-31T00:00:00Z (recommended for API responses and data storage)
  • ISO 8601 with offset: 2024-03-31T03:00:00+03:00 (includes the time zone offset for unambiguous local time)
  • RFC 2822: Sun, 31 Mar 2024 00:00:00 +0000 (used in email headers; the HTTP Date header uses a near-identical fixed-GMT form defined in RFC 9110)
  • Unix Epoch reference: 0 = 1970-01-01T00:00:00Z
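JavaScript's toISOString always renders UTC, so producing the with-offset variant takes a little arithmetic. A minimal sketch (the +03:00 offset is just the example from the list above):

```javascript
// Format a Unix-seconds timestamp as ISO 8601 with a fixed UTC offset.
function toIsoWithOffset(seconds, offsetMinutes) {
  // Shift the instant so the UTC wall-clock fields match the target
  // offset, then swap the trailing "Z" for the "+HH:MM" suffix.
  const shifted = new Date((seconds + offsetMinutes * 60) * 1000);
  const sign = offsetMinutes < 0 ? "-" : "+";
  const abs = Math.abs(offsetMinutes);
  const hh = String(Math.floor(abs / 60)).padStart(2, "0");
  const mm = String(abs % 60).padStart(2, "0");
  return shifted.toISOString().slice(0, 19) + sign + hh + ":" + mm;
}

console.log(toIsoWithOffset(1711843200, 180)); // "2024-03-31T03:00:00+03:00"
```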

Best Practices

  • Store timestamps in UTC in databases; convert to local time only at the display layer
  • Use ISO 8601 for API responses; it is unambiguous and parseable in all languages
  • Prefer integer milliseconds over fractional seconds for new systems that need sub-second precision, avoiding floating-point arithmetic
  • Migrate 32-bit timestamp storage to 64-bit to avoid the Year 2038 overflow problem
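The first two practices combine naturally: store the UTC instant, and format per time zone only when rendering. A sketch using Intl.DateTimeFormat (the "America/New_York" zone is an arbitrary example, not part of the tool):

```javascript
// Stored value: a UTC instant in milliseconds (2024-03-31T00:00:00Z).
const storedMs = 1711843200000;

// Display layer: convert to the viewer's zone only at render time.
const formatter = new Intl.DateTimeFormat("en-US", {
  timeZone: "America/New_York", // assumed example zone
  dateStyle: "long",
  timeStyle: "long",
});
console.log(formatter.format(new Date(storedMs)));
// Renders the same instant as a local New York time on March 30
// (UTC-4 during daylight saving).
```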

For unique identifiers with embedded timestamps, UUID v7 (RFC 9562) combines Unix millisecond timestamps with random bits for sortable, unique IDs; use the UUID Generator for UUID v4 when sort order is not required. For debugging JSON responses containing timestamps, the JSON Formatter makes nested timestamp fields readable.

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp is the number of seconds elapsed since the Unix Epoch (midnight, 00:00:00 UTC, on January 1, 1970), excluding leap seconds. It is time-zone-independent, a simple integer, and easy to compare and sort. Unix timestamps are the universal internal representation for time in operating systems, databases, and programming languages.

How do I tell if a timestamp is in seconds or milliseconds?

Count the digits. As of 2024, Unix timestamps in seconds are 10 digits (e.g., 1700000000). Timestamps in milliseconds are 13 digits (e.g., 1700000000000). If the number is less than about 10 billion, it is likely seconds. Nanosecond-precision timestamps (used in some high-frequency systems) are 19 digits.

What is the Year 2038 problem?

Systems storing Unix timestamps as 32-bit signed integers can only represent values up to 2,147,483,647. That maximum corresponds to January 19, 2038 at 03:14:07 UTC. Adding one second after that causes integer overflow to a large negative number, breaking date calculations. Modern 64-bit systems are safe until approximately 292 billion years from now.
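The overflow can be demonstrated directly; in JavaScript, `| 0` truncates to a 32-bit signed integer, mimicking legacy time_t storage:

```javascript
// The largest 32-bit signed timestamp and the moment it represents.
const max32 = 2147483647;
console.log(new Date(max32 * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"

// One second later, 32-bit signed arithmetic wraps to a negative value,
// which a naive system would interpret as a date in 1901.
const wrapped = (max32 + 1) | 0;
console.log(wrapped); // -2147483648
console.log(new Date(wrapped * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```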

Does the converter use my local time zone?

Yes. The tool displays both UTC and your local time. Local time is determined by your browser's time zone setting, which reflects your operating system's configured time zone. If your system clock is misconfigured or you are traveling, the local time display reflects that setting.

How do I get the current Unix timestamp in code?

  • JavaScript: Math.floor(Date.now() / 1000) for seconds, or Date.now() for milliseconds
  • Python: import time; int(time.time())
  • Go: time.Now().Unix() for seconds, time.Now().UnixMilli() for milliseconds
  • PostgreSQL: EXTRACT(EPOCH FROM NOW())
  • Bash: date +%s