It’s not just a proposal, it’s already fully defined and almost completely implemented - I believe they’re just waiting on a standards update from ISO for time zone stuff.
Fun fact: infinities can be different sizes, such that one infinity can be larger than another.
They’re still infinities, with no end. Just of different absolute sizes. Fun stuff to rabbithole down into if you want to melt your brain on a lazy afternoon.
He said I smelled like farts, then I said he did times 10, he replied times a hundred, I pulled out the infinity card, then he replied with times infinity plus one, activating my trap card. I sat him down and for 90 minutes, starting with binary finger counting and Cantor’s diagonalisation argument, I rigorously walked him through infinities and Aleph numbers (only the first 2 in detail, I’m not a monster).
Now he knows the proper retort (not infinity plus one, use Aleph 1). Unfortunately now he’s not sure if numbers are “real” or not because I taught him that the natural numbers are just the finite cardinal numbers.
Even more fun: nobody can agree on how many there are (some people say none!), and mathematics stays self-consistent regardless of whether you assume certain ones definitely do or definitely don't exist.
You can derive the date by taking the largest unit first, checking whether it fits, then moving iteratively to smaller and smaller units until the date comes out right.
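Something like this rough sketch of the idea, assuming you're starting from a plain day count (the epoch, the function names and the lack of error handling are all just for illustration):

```javascript
// Turn a count of days since 1 Jan 2000 into a calendar date by peeling
// off the largest unit first (years), then months; what's left is the day.
function isLeapYear(y) {
  return (y % 4 === 0 && y % 100 !== 0) || y % 400 === 0;
}

function dateFromDayCount(days) {
  let year = 2000;
  // Largest unit: keep subtracting whole years while a whole year still fits.
  while (days >= (isLeapYear(year) ? 366 : 365)) {
    days -= isLeapYear(year) ? 366 : 365;
    year += 1;
  }
  // Next unit down: whole months.
  const monthLengths = [31, isLeapYear(year) ? 29 : 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31];
  let month = 0;
  while (days >= monthLengths[month]) {
    days -= monthLengths[month];
    month += 1;
  }
  // Whatever is left over is the day of the month (zero-based, so add 1).
  return { year, month: month + 1, day: days + 1 };
}

console.log(dateFromDayCount(0));   // { year: 2000, month: 1, day: 1 }
console.log(dateFromDayCount(366)); // { year: 2001, month: 1, day: 1 } (2000 was a leap year)
```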
It’s a little out of the ordinary for now, but for thousands of years dates counted upwards from a negative number, which this new method easily avoids.
The definition of the Date object explicitly states that any attempt to set the internal timestamp to a value outside the representable range must result in it being set to NaN. If there's an implementation out there that doesn't do that, then the issue is with that implementation, not the standard.
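You can poke at that boundary from any console; per the spec the limit is ±8,640,000,000,000,000 ms (±100,000,000 days) from the epoch:

```javascript
const MAX_TIME_VALUE = 8.64e15; // spec'd maximum time value in milliseconds

const edge = new Date(MAX_TIME_VALUE);
console.log(edge.toISOString());  // "+275760-09-13T00:00:00.000Z"

const beyond = new Date(MAX_TIME_VALUE + 1);
console.log(beyond.getTime());    // NaN
console.log(beyond.toString());   // "Invalid Date"
```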
That's one thing that really bugs me about Javascript (weirdly enough I'm okay with eg prototypal inheritance and how this works, or at least how it worked before the bolted-on classes were added, because apparently I'm one of the dozen or so people who had no problems with those concepts). The fact that all numbers are floats can lead to a lot of fun and exciting bugs that people might not even realize are there until they suddenly get a weird decimal where they expected an integer.
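A couple of the classics, for anyone who hasn't been bitten yet:

```javascript
// All numbers are IEEE 754 doubles, so integers are only exact up to 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
console.log(9007199254740992 === 9007199254740993);   // true - both round to the same double
console.log(0.1 + 0.2);                               // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);                       // false
// BigInt exists these days for when you genuinely need exact integers:
console.log(9007199254740992n + 1n);                  // 9007199254740993n
```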
it may or may not be a monday - probably won't be. it will be a monday based on the (4000 | year) => !(leap year) rule (i.e. years divisible by 4000 stop being leap years), but by the year 275000 the accumulated drift will be so big that i am pretty sure people will have made more rules to solve it by then.
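for anyone curious, the check with that proposed rule bolted on would look roughly like this (the 4000-year rule is not part of the actual Gregorian calendar, it's just a long-floated proposal):

```javascript
// Standard Gregorian leap year rules plus the *proposed* "divisible by 4000
// means not a leap year" correction. The extra rule is hypothetical.
function isLeapYearWith4000Rule(year) {
  if (year % 4000 === 0) return false; // the proposed extra rule
  if (year % 400 === 0) return true;
  if (year % 100 === 0) return false;
  return year % 4 === 0;
}

console.log(isLeapYearWith4000Rule(2000)); // true
console.log(isLeapYearWith4000Rule(1900)); // false
console.log(isLeapYearWith4000Rule(4000)); // false here, true under the real Gregorian rules
```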
This will be a tough one to fix. There must be millions upon millions of embedded systems out there with a signed 32-bit seconds-since-the-epoch counter burned in.
They'll all be much tougher to find than "YEAR PIC(99)" in COBOL was.
Y2K wasn't a problem because thousands upon thousands of programmers (myself included) worked on it well in advance; we had source code and plenty of static analysis tools, often homegrown.
The 2038 bugs are already out there...in the wild...their source code nothing but a distant dream.
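For anyone who wants to see the cliff edge, the rollover moment is easy to compute from JS itself (which isn't affected, since its time values are 64-bit floats of milliseconds, not 32-bit seconds):

```javascript
const INT32_MAX = 2 ** 31 - 1; // 2147483647, the largest signed 32-bit value

console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z" - one second later, a signed 32-bit time_t wraps negative

console.log(new Date(-(INT32_MAX + 1) * 1000).toISOString());
// "1901-12-13T20:45:52.000Z" - where the wrapped value lands
```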
I honestly don't quite get why it's so common to hate Javascript.
I mean, it's not my favorite language to put it mildly (I prefer type systems that beat me into submission), but as far as popular dynamically typed languages go, it's not nearly the worst offender out there. Yes, lol, weird things equal weird things when you use ==, but that's not exactly unique among dynamic languages. Some people just couldn't come to terms with it not being like Java despite the name, so they never bothered learning how prototypal inheritance works. And sure, who the fuck needed both null and undefined when either of those by itself is already a mistake, and introducing them to a language should be grounds for a nice, solid kick to the groin.
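For anyone keeping score at home, the usual exhibits:

```javascript
console.log(null == undefined);  // true - the one coercion that's arguably handy
console.log(null === undefined); // false
console.log(typeof null);        // "object" - a wart kept forever for compatibility
console.log([] == false);        // true - [] coerces to "" and then to 0
console.log([] == ![]);          // true - ![] is false, see the previous line
console.log(NaN == NaN);         // false - NaN never equals anything, itself included
```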
But, warts and all, the implementations are generally reasonably performant as far as these things go, the syntax is recognizable (eg. braces are common whether we like them or not) and notably survives copy-pasting from eg. the internet or anything that doesn't use the same whitespace you do, and it'll happily let you write code in a quite multiparadigm way, leading some people to insist Javascript is kind of like Scheme and other people to insist Javascript is nothing like Scheme.
So, shit could be worse. And by "shit" and "worse" I mean eg. Python, notable for achievements such as: being one of the first if not the first language with a designer who huffed enough solvents to think that semantically significant whitespace is a great idea especially combined with no real standardization on whether you need to use tabs or spaces, and which often doesn't survive being copy-pasted from the web and is a nightmare to format; being unable to actually run anything in parallel up until very recently because lol why bother with granular locking in the runtime when you can just have one global interpreter lock and be done with it; or being popular in part due to the fact that its FFI makes it easy to write modules for it in languages that aren't a crime against common sense and can run faster and more parallel than an 80's BASIC interpreter. And let's not even go into the whole "virtual environment" thing.
So while Python's not quite INTERCAL-bad, at least INTERCAL doesn't have significant whitespace and its manuals are really damn funny.
And then there's eg. Ruby, with 9999 ways to do everything and all of them so slow that it aspires to one day be as fast as INTERCAL, and PHP, which is a practical joke that went too far and somehow managed to eventually convince people it's actually a real language.
edit: oh and if you don't know about INTERCAL, I can highly recommend checking out the C-INTERCAL revision's manual, which includes eg. a very helpful circuitous diagram and a logical table to explain one of its odder operators. There's also a resource page that's maintained by one of the perpetrators of the C-INTERCAL revision.