The Unix epochalypse — or "Y2038" — is the moment when a 32-bit signed integer counting seconds since 1970 overflows. Systems still using 32-bit time will wrap to a negative value and interpret the date as 1901. Here's what you actually need to know.
2147483647 seconds since 1970-01-01 UTC
= 2038-01-19 03:14:07 UTC
At 2038-01-19 03:14:08 UTC, a signed 32-bit integer can no longer represent the elapsed seconds. It overflows from 2,147,483,647 to -2,147,483,648, which (interpreted as time) is 1901-12-13 20:45:52 UTC.
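The wraparound is easy to reproduce by simulating two's-complement 32-bit arithmetic. A minimal sketch in Python (`wrap32` is a helper defined here for illustration, not a library function):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the last representable second

def wrap32(seconds: int) -> int:
    """Simulate signed 32-bit two's-complement overflow."""
    return (seconds + 2**31) % 2**32 - 2**31

wrapped = wrap32(INT32_MAX + 1)  # one second past the boundary

print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00
print(wrapped)                                             # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))    # 1901-12-13 20:45:52+00:00
```

Passing an explicit `tz` keeps the negative-timestamp conversion portable, since it avoids the platform's `localtime`.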
For programs that pass timestamps to system calls, store them in 32-bit columns, or transmit them over wire protocols using 32-bit fields, the failure modes range from "shows the wrong date" to "complete crash" to "data corruption." Anywhere time is stored as a signed 32-bit integer, that storage will fail.
The bad news: it's not just things that exist in 2038. The problem already affects systems that calculate future dates more than 13 years out — anything computing a 25-year mortgage maturity, a long-term backup retention policy, or a multi-decade certificate expiration will hit Y2038 sooner than 2038 itself.
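To see how future-date math trips the boundary today, add a 25-year horizon to the current time and compare against the 32-bit limit. A sketch, where `wrap32` simulates storing the result in a signed 32-bit integer:

```python
import time

INT32_MAX = 2**31 - 1

def wrap32(seconds: int) -> int:
    """Simulate storing a value in a signed 32-bit integer."""
    return (seconds + 2**31) % 2**32 - 2**31

# e.g. a 25-year mortgage maturity computed from today's date
maturity = int(time.time()) + 25 * 365 * 86400

print(maturity > INT32_MAX)  # True: already past the 32-bit limit
print(wrap32(maturity))      # negative, i.e. a date in the early 1900s
```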
Major categories at risk:
- Embedded and legacy 32-bit systems with a 32-bit `time_t` type. Newer 64-bit systems are fine.
- Databases storing Unix timestamps in columns declared `INTEGER` or `INT` instead of `BIGINT`.

The good news: most modern infrastructure is already fine.
Use `BIGINT` or native `TIMESTAMP` types. In SQL, any column storing epoch seconds should be at least 64-bit:
```sql
-- Good
CREATE TABLE events (
    ts BIGINT NOT NULL  -- 64-bit, safe through year ~292 billion
);

-- Bad
CREATE TABLE events (
    ts INT NOT NULL     -- 32-bit, overflows 2038-01-19
);
```
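Schemas you can't migrate yet can at least be guarded at the application layer before an insert. A minimal sketch (the `fits_int32` helper is hypothetical, not a library API):

```python
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def fits_int32(epoch_seconds: int) -> bool:
    """True if the timestamp fits a signed 32-bit INT column."""
    return INT32_MIN <= epoch_seconds <= INT32_MAX

print(fits_int32(1_700_000_000))  # True  (a 2023 timestamp)
print(fits_int32(2_200_000_000))  # False (2039, would overflow an INT column)
```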
Most modern languages are fine, but check:
- C/C++: `time_t` width is platform-defined. On 64-bit Linux/macOS/Windows it's 64-bit; on 32-bit embedded targets it's often still 32-bit.
- Java: `System.currentTimeMillis()` returns `long` (64-bit). Safe.
- JavaScript: `Date.now()` returns a `Number` with 53-bit integer precision. Safe through roughly the year 285,616.
- Go: `time.Time` uses `int64`. Safe.
- Rust: `SystemTime` uses the platform's `time_t`; `chrono` uses `i64`. Mostly safe.

If you serialize timestamps over the wire or to disk, check the field width. Anything you control should use 64-bit fields. Anything you don't (third-party APIs, file formats, vendor protocols) needs explicit verification.
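For wire formats, the field width is checkable directly. A sketch using Python's `struct` module, where `>i` is a big-endian signed 32-bit field and `>q` is 64-bit:

```python
import struct

ts = 2**31  # one second past the boundary: 2038-01-19 03:14:08 UTC

# A signed 32-bit field rejects the value outright...
try:
    struct.pack(">i", ts)
    print("32-bit field: packed")
except struct.error:
    print("32-bit field: out of range")

# ...while a 64-bit field round-trips it safely.
(unpacked,) = struct.unpack(">q", struct.pack(">q", ts))
print("64-bit field round-trips:", unpacked == ts)
```

Note that not every 32-bit path fails this loudly: truncating casts and unsigned fields corrupt the value silently instead of raising.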
This converter actively warns when you input a timestamp within ~1 year of the 2038 boundary, so you spot accidental 32-bit math early. It doesn't fix the underlying problem — that's your code's job — but it does surface the issue while you're still debugging.
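A boundary check like that takes only a few lines. This sketch (with a hypothetical `near_y2038` helper) flags timestamps within a year of the limit:

```python
Y2038 = 2**31 - 1       # 2038-01-19 03:14:07 UTC as epoch seconds
ONE_YEAR = 365 * 86400  # window size, in seconds

def near_y2038(ts: int, window: int = ONE_YEAR) -> bool:
    """True if ts lies within `window` seconds of the 32-bit boundary."""
    return abs(ts - Y2038) <= window

print(near_y2038(2_147_000_000))  # True: close to the boundary
print(near_y2038(1_700_000_000))  # False: safely back in 2023
```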
Y2038 is far enough away that there's no panic, but close enough that ignoring it is irresponsible for any system that will still be running by then. If you maintain anything that handles timestamps, audit it.