We'd be better off with 9-bit bytes
A number of 70s computing systems had nine-bit bytes, most prominently the PDP-10, but today1 [1 Apparently, it was the System/360 that really set the standard here.] all systems use 8-bit bytes, and that now seems natural.2 [2 Though you still see RFCs use "octet", and the C standard has a CHAR_BIT macro to handle the possibility of a different-sized byte.] As a power of two, eight is definitely nicer. But I think a series of historical coincidences would actually go our way with 9-bit bytes.
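For the curious, this is the macro the footnote refers to: a minimal sketch of how portable C avoids hard-coding eight bits per byte by reading CHAR_BIT from <limits.h> (the standard only guarantees it is at least 8).

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    /* CHAR_BIT is the number of bits in a byte on this implementation. */
    printf("bits per byte: %d\n", CHAR_BIT);
    /* Widths of other types follow from sizeof * CHAR_BIT,
       so none of this needs to assume 8-bit bytes. */
    printf("bits per int:  %zu\n", sizeof(int) * CHAR_BIT);
    return 0;
}
```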
When address exhaustion does set in, it would plausibly be at a time when there's not a lot of growth left in penetration, population, or devices, and mild market mechanisms instead of NATs would be the solution.

Negative timestamps would represent any time since 882, so could cover the founding of Kievan Rus', the death of Alfred the Great, or the collapse of the Classic Maya.5 [5 The people stuck around, but they stopped building cool cities.]

Server-class machines would still need to address more memory than that, but they're usually running specialized software or virtualizing; databases and hypervisors are already tricky code, and segmentation wouldn't be the end of the world.
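A quick sketch of the arithmetic behind that 882 figure, assuming timestamps are a signed 36-bit count of seconds since the 1970 epoch (four 9-bit bytes); the forward bound it prints is my own back-of-the-envelope extension, not a claim from the article.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* A signed 36-bit timestamp spans +/- 2^35 seconds around 1970. */
    int64_t half_range = INT64_C(1) << 35;              /* 34,359,738,368 s */
    double years = (double)half_range / (365.25 * 86400.0);
    printf("range: +/- %.0f years around 1970\n", years); /* ~1089 years */
    printf("earliest year: ~%.0f\n", 1970.0 - years);      /* ~881-882 */
    printf("latest year:   ~%.0f\n", 1970.0 + years);      /* ~3059 */
    return 0;
}
```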