JS is hated, but if you compile to browser JS, that code will run in 2100. If you mainly deal with files and blobs rather than databases, you'll still have those things in 2100 too. I think a lot of apps could be JS plus Dropbox integration to sync files. Dropbox may rot, but make it a plugin (a separate .js file) and offer local read/write too, and I think you'd be pretty future-proof.
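A minimal sketch of what that plugin split could look like: the app only talks to a tiny read/write interface, and each backend lives in its own .js file. All the names here (`localPlugin`, `saveDocument`, the Map standing in for real file I/O) are made up for illustration, not from any real library.

```javascript
// One storage plugin implementing the hypothetical interface: read/write by path.
// A real "local" plugin would hit the filesystem; a Dropbox plugin would hit its API.
const localPlugin = {
  name: "local",
  store: new Map(), // stand-in for actual local file I/O
  async read(path) {
    return this.store.get(path) ?? null;
  },
  async write(path, blob) {
    this.store.set(path, blob);
  },
};

// The app only ever calls the interface, so swapping Dropbox for
// local files (or whatever storage exists in 2100) is a one-line change.
async function saveDocument(plugin, path, contents) {
  await plugin.write(path, contents);
  return plugin.read(path);
}

saveDocument(localPlugin, "notes/todo.txt", "buy milk").then((result) => {
  console.log(result); // "buy milk"
});
```

The point is that only the plugin file rots when Dropbox does; the app and its data format stay untouched.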
Except, of course, that software rot and JavaScript code bases go hand in hand.
You seem to assume browsers have stopped changing and will be more or less the same 75 years from now.
I think you're right that that code might run, but probably in some kind of emulator, the same way we deal with IBM mainframes right now. The hardware and OS have long since gone the way of the dodo, but you can get stuff running on generic Linux machines via emulation.
I think we'll start seeing a lot of AI-driven code rot management pretty soon. As the original software developers die off (they've long since retired), that might be the only way to keep these code bases alive. It's also a potential path to migrating and modernizing them.
Maybe that will salvage a few still-relevant but rotten-to-the-core JavaScript code bases.
If they want to introduce HTML6, they'll add a new doctype. If they want more modern JS, they'll add another 'use strict'-style opt-in, or a new file type, or what have you. The fact that existing code will continue to run with zero changes is good enough.
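The existing `"use strict"` directive already works exactly this way: new, stricter semantics apply only where you opt in, and old code keeps its old behavior. A small sketch (using the `Function` constructor, whose function bodies carry their own strictness regardless of the surrounding module):

```javascript
// Non-strict body: assigning to an undeclared variable silently creates
// a global, which old pages may rely on — so it keeps working.
const sloppyFn = new Function("sloppyVar = 1; return sloppyVar;");

// Opted-in strict body: the same assignment throws a ReferenceError.
const strictFn = new Function(
  '"use strict"; try { strictVar = 2; return "no error"; } catch (e) { return e.constructor.name; }'
);

console.log(sloppyFn()); // 1
console.log(strictFn()); // "ReferenceError"
```

A hypothetical future 'stricter' mode could follow the same opt-in pattern without breaking a single existing page.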
Things like E4X, sharp variables, and array comprehensions have already been removed; it's just that the mass of newer developers means the average developer doesn't know about them. Unfortunately, it's not like they never remove things.
As far as I'm aware, none of those were ever supported in more than one engine (specifically Firefox), so they can't reasonably be considered to have been part of JavaScript. JS really does make a point of not removing things.
There are some _very_ rare exceptions, but they're things like "support for subclassing TypedArrays", and even then this is only considered after careful analysis to ensure it's not breaking anyone.
I didn't know about either of those, and I've been using JS for 20-ish years. I could have used those sharp variables in a previous project! Fortunately, it's not terribly hard to do the same thing with an IIFE or a getter.
You're right that your JavaScript bundle will probably run forever as-is. However, three years later your toolchain will be totally broken and you'll be in NPM Hell trying to fix it. Ten years later, good luck.
(S3 is a better example than Dropbox. That will mostly be around forever.)
I've been debating how to go about installing JS packages directly into my project. When I update a dependency, show me the diff for the whole package, not just my package.json bump. If it's good, I'll just accept the merge as-is. If it's bad, it can stay pinned to that version forever. No downloads. I guess this doesn't solve the peer-dependency issue, where two different things depend on one thing in a way such that I can't have two copies of the lib (e.g. React 18 and 19 together is a no-no, but two lodashes is fine). Hmm... a man can dream.
Don't bundle your JS, then; problem solved. Vanilla JS and CSS written 20 years ago still work, except maybe for a couple of obscure deprecated things.
Rot is directly proportional to the number of dependencies. Software made responsibly, with long-term thinking in mind, has dramatically fewer issues over time.
Postgres will get disassembled into independent, composable parts, and some other "distribution" of it will be used for the narrower set of use cases that actually require running the database as a standalone process.