
Unix Timestamp Converter — Epoch to Date, Date to Timestamp (2026)

Convert Unix timestamps to readable dates instantly — and dates back to epoch. Covers seconds vs milliseconds, JavaScript, Python, SQL code, timezone handling, and the year 2038 problem.

NextUtils Team
9 min read
📚Tutorials
unix-timestamp · epoch · datetime · developer-tools · converters
Developer tools and productivity experts

Quick Answer

  1. Paste your Unix timestamp (10-digit seconds value) into the converter and click Convert.
  2. Read the output: UTC time, local time, ISO 8601 string, and relative time (e.g. “3 hours ago”).
  3. Reverse it: pick a date and time in the date field and click Convert to get the epoch value.

Unix Timestamp Converter — Free

Paste a timestamp or pick a date — get UTC, local, ISO 8601, and relative time instantly. Runs in your browser, nothing sent to a server.

Open Timestamp Converter →

You're reading an API response, a database log, or a JWT token, and you see a number like 1778803200. That's a Unix timestamp — a count of seconds since midnight on January 1, 1970 UTC. Perfectly unambiguous for machines, completely opaque to humans.

This guide covers everything you need: how to convert timestamps in your browser, in code (JavaScript, Python, SQL), how to tell seconds from milliseconds, timezone handling, and the Year 2038 overflow problem that still affects legacy systems.

What Is a Unix Timestamp?

A Unix timestamp — also called epoch time, POSIX time, or Unix time — is a single integer that represents a specific moment in time as the number of seconds elapsed since the Unix epoch: January 1, 1970, 00:00:00 UTC.

For example, the timestamp 0 is exactly the epoch itself. The timestamp 86400 is exactly 24 hours later (86,400 seconds in a day). The timestamp 1778803200 is May 2026.
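These reference values are easy to verify in a few lines of Python (shown here as a quick sanity check, not part of any converter):

```python
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself
print(datetime.fromtimestamp(0, tz=timezone.utc))
# → 1970-01-01 00:00:00+00:00

# 86,400 seconds is exactly one day later
print(datetime.fromtimestamp(86400, tz=timezone.utc))
# → 1970-01-02 00:00:00+00:00

# The timestamp from the introduction
print(datetime.fromtimestamp(1778803200, tz=timezone.utc))
# → 2026-05-15 00:00:00+00:00
```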

The reason this format exists is elegance: a single integer is universal. There's no ambiguity about timezones, no locale-specific formatting, no month/day ordering confusion. Every system that understands Unix timestamps agrees on what any given number means.

Why 1970? Unix engineers chose January 1, 1970 in the early 1970s because it was close to when Unix was being developed — recent enough to avoid wasting bits on ancient history, far enough back to cover pre-Unix dates. The choice was somewhat arbitrary but has become a universal computing standard.

Seconds vs. Milliseconds: 10-Digit vs. 13-Digit

The single most common source of timestamp confusion is the seconds/milliseconds split. Two conventions exist side-by-side, and mixing them up produces wildly wrong dates.

| Format | Digits | Example | Used by |
|---|---|---|---|
| Seconds | 10 | 1778803200 | POSIX, Unix, Python time.time(), most databases, JWT |
| Milliseconds | 13 | 1778803200000 | JavaScript Date.now(), Java, many REST APIs |

The fix is simple: if you have a 13-digit timestamp and need seconds, divide by 1,000. If you're unsure which you have, paste it into the converter above — it auto-detects the format.

Classic bug: Passing a 13-digit milliseconds timestamp to a function that expects seconds gives a date in the year 57,000+. Passing a 10-digit seconds timestamp to a function expecting milliseconds gives a date in January 1970. Both are unmistakable once you know what to look for.
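The converter's auto-detection can be approximated with a simple digit-count heuristic. The helper below, normalize_to_seconds, is a hypothetical sketch, not the tool's actual code:

```python
def normalize_to_seconds(ts: int) -> int:
    """Guess whether ts is seconds or milliseconds by digit count
    and return a seconds value. Heuristic only: 10-digit values are
    seconds (years 2001-2286), 13-digit values are milliseconds."""
    digits = len(str(abs(ts)))
    if digits >= 13:       # milliseconds (or finer): divide down
        return ts // 1000
    return ts              # assume seconds

print(normalize_to_seconds(1778803200))     # → 1778803200 (already seconds)
print(normalize_to_seconds(1778803200000))  # → 1778803200 (ms divided by 1,000)
```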

Converting Timestamps in Code

The online converter handles one-off lookups. For programmatic conversion in your codebase, here are the correct patterns for the most common languages.

JavaScript / Node.js

// Current timestamp in seconds
const seconds = Math.floor(Date.now() / 1000);
// → 1778803200

// Current timestamp in milliseconds (JS default)
const millis = Date.now();
// → 1778803200000

// Seconds timestamp → readable date
const ts = 1778803200;
const date = new Date(ts * 1000);          // multiply by 1000 — JS uses ms
console.log(date.toISOString());           // "2026-05-15T00:00:00.000Z"
console.log(date.toUTCString());           // "Fri, 15 May 2026 00:00:00 GMT"

// Milliseconds timestamp → readable date
const tsMs = 1778803200000;
const date2 = new Date(tsMs);              // no multiplication needed
console.log(date2.toISOString());

// Date → seconds timestamp
const timestamp = Math.floor(new Date('2026-05-15').getTime() / 1000);
// → 1778803200 (a date-only string parses as UTC midnight)

Python

import datetime, time

# Current timestamp in seconds
timestamp = int(time.time())
# → 1778803200

# Seconds timestamp → datetime (local timezone)
ts = 1778803200
dt_local = datetime.datetime.fromtimestamp(ts)
print(dt_local)  # 2026-05-15 00:00:00 if your local timezone is UTC

# Seconds timestamp → datetime (UTC; utcfromtimestamp() is deprecated since 3.12)
dt_utc = datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc)
print(dt_utc)    # 2026-05-15 00:00:00+00:00

# ISO 8601 format
print(dt_utc.isoformat().replace('+00:00', 'Z'))  # "2026-05-15T00:00:00Z"

# Datetime → seconds timestamp (attach UTC explicitly; .timestamp()
# treats naive datetimes as local time)
dt = datetime.datetime(2026, 5, 15, tzinfo=datetime.timezone.utc)
ts_back = int(dt.timestamp())  # → 1778803200

SQL

-- MySQL: timestamp → readable date (uses the session time zone; UTC shown)
SELECT FROM_UNIXTIME(1778803200);
-- → 2026-05-15 00:00:00

-- MySQL: date → timestamp
SELECT UNIX_TIMESTAMP('2026-05-15 00:00:00');
-- → 1778803200

-- PostgreSQL: timestamp → timestamptz
SELECT to_timestamp(1778803200);
-- → 2026-05-15 00:00:00+00

-- PostgreSQL: now → epoch
SELECT EXTRACT(epoch FROM now())::bigint;

-- SQLite: timestamp → readable (SQLite stores as integer)
SELECT datetime(1778803200, 'unixepoch');
-- → 2026-05-15 00:00:00

Excel / Google Sheets

Spreadsheets don't have a native epoch function, but the math is straightforward. Excel's date system counts days from January 1, 1900; Unix counts seconds from January 1, 1970 — a difference of 25,569 days.

# Google Sheets (seconds in A1)
=(A1/86400)+DATE(1970,1,1)
# → format the result cell as a date

# For milliseconds in A1
=(A1/1000/86400)+DATE(1970,1,1)

# Google Sheets also has a native function:
=EPOCHTODATE(A1)       # seconds (unit 1, the default)
=EPOCHTODATE(A1, 2)    # milliseconds (unit 2; unit 3 is microseconds)

Timezone Handling

Unix timestamps are always UTC. The number 1778803200 represents the same instant whether you read it in New York, London, or Tokyo. There is no timezone embedded in the value itself.

Timezone only enters the picture at the display layer. When you convert a timestamp to a human-readable string for a user, you apply their local timezone offset. The timestamp stored in your database never changes — only the presentation does.

| Pattern | Verdict |
|---|---|
| Store all timestamps as UTC integers in the database | ✓ Correct |
| Convert to local time only when rendering for the user | ✓ Correct |
| Store a local time as a number and treat it as UTC later | ✗ Bug |
| Apply a timezone offset twice (once on write, once on read) | ✗ Double-conversion bug |
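The two correct patterns can be sketched in Python. zoneinfo has been in the standard library since Python 3.9, and the timezone names are standard IANA identifiers:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1778803200  # stored as a UTC integer; this value never changes

# Display layer: the same instant, rendered three ways
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
print(utc.isoformat())                               # 2026-05-15T00:00:00+00:00
print(utc.astimezone(ZoneInfo("America/New_York")))  # 2026-05-14 20:00:00-04:00
print(utc.astimezone(ZoneInfo("Asia/Tokyo")))        # 2026-05-15 09:00:00+09:00
```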

The Year 2038 Problem

On January 19, 2038 at 03:14:07 UTC, 32-bit systems storing Unix timestamps as signed integers will overflow. The maximum value of a signed 32-bit integer is 2,147,483,647. One second later, the counter wraps to a large negative number, which, when interpreted as a timestamp, corresponds to December 13, 1901.

Modern operating systems, databases, and languages have migrated to 64-bit integers, which extends the valid range to hundreds of billions of years in either direction. But older embedded systems — industrial controllers, some medical devices, legacy automotive firmware, older IoT sensors — may still use 32-bit time storage and will need patching before 2038.

Check your system: If you're writing code that stores timestamps in a 32-bit integer type (e.g., int in C on a 32-bit platform, or MySQL's old INT timestamp column), migrate to BIGINT or int64_t before 2038. PostgreSQL's TIMESTAMP type is already 64-bit and is not affected.
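The wraparound itself is easy to demonstrate. Python integers never overflow, so the sketch below emulates signed 32-bit arithmetic with a small hypothetical helper, to_int32:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

def to_int32(x: int) -> int:
    """Emulate signed 32-bit wraparound."""
    return (x + 2**31) % 2**32 - 2**31

# Last representable second on a signed 32-bit clock
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# → 2038-01-19 03:14:07+00:00

# One second later, the counter wraps negative
wrapped = to_int32(INT32_MAX + 1)
print(wrapped)  # → -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# → 1901-12-13 20:45:52+00:00
```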

Common Debugging Scenarios

My converted date shows January 1, 1970

A result of January 1, 1970 means the timestamp value is 0 — or null/undefined being coerced to 0. Check that the timestamp is being read correctly from the API response or database column before converting.

My converted date shows year 57,000+

You passed a milliseconds timestamp to a function expecting seconds. Divide by 1,000 and try again.

My date is off by exactly 1, 2, or N hours

This is a timezone offset error. The timestamp itself is correct; the rendering is applying the wrong timezone. Verify you're converting to UTC at the display layer and not using a local system clock offset where UTC is expected.

Negative timestamps — dates before 1970

Unix timestamps can be negative, counting backward from the epoch. The value -86400 is December 31, 1969. The Apollo 11 moon landing on July 20, 1969 is approximately -14182940. On 64-bit systems, negative timestamps work correctly — 32-bit systems are limited to December 13, 1901 as the earliest representable date.
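Negative values convert with the same machinery as positive ones; a quick Python check reproduces the two figures quoted above:

```python
from datetime import datetime, timezone

# One day before the epoch
print(datetime.fromtimestamp(-86400, tz=timezone.utc))
# → 1969-12-31 00:00:00+00:00

# Apollo 11 landing
print(datetime.fromtimestamp(-14182940, tz=timezone.utc))
# → 1969-07-20 20:17:40+00:00
```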

Useful Timestamp Reference Values

| Timestamp | Human-readable date (UTC) | Notes |
|---|---|---|
| 0 | 1970-01-01 00:00:00 | Unix epoch origin |
| 86400 | 1970-01-02 00:00:00 | Exactly 1 day (86,400 seconds) |
| 1000000000 | 2001-09-09 01:46:40 | 1 billion seconds milestone |
| 1778803200 | 2026-05-15 00:00:00 | Publication date of this article |
| 2000000000 | 2033-05-18 03:33:20 | 2 billion seconds milestone |
| 2147483647 | 2038-01-19 03:14:07 | 32-bit signed integer max (Year 2038 problem) |

Decode Any Timestamp in Seconds — Free

Paste any 10-digit or 13-digit timestamp and get UTC, local time, ISO 8601, and relative time instantly. Convert dates to epoch values too — all in your browser.

Open Timestamp Converter →
