Exactly! Then you write each password down in your notebook of passwords and pat yourself on the back for how hard it would be to compromise all your accounts in one go ;)
Part of the problem is the maintenance cost of a fork just in terms of merging upstream commits.
It won't be long at all before this becomes a huge amount of work for a relatively small divergence of the code. But we could build tools that would make it much less awful!
I write JS, and I have never directly observed the IRs or assembly code that my code becomes. Yet I certainly assume that the compiler author has looked at the compiled output in the process of writing a compiler!
For me the difference is prognosis. Gas Town has no ratchet of quality: its fate has been written on the wall since the day Steve decided he didn't want to know what the code says. It will grow to a moderate but unimpressive size before it collapses under its own weight. Even if someone tried to prop it up with stable infra, Steve would surely vibe the stable infra out of existence, since he doesn't care about that.
Or he will find a way to get the AI to create harnesses so it becomes stable. The lack of imagination and willingness to experiment in the HN crowd amazes me and worries me at the same time. I never thought a group of engineers would be the most conservative and closed-minded people I could discuss with.
It's a paradox, huh? If the AI harness became so stable that it wrote good code, he wouldn't be afraid to look at the code; he'd be eager to look at it, right? But if it mattered whether the AI wrote good code or not, he couldn't defend his position that the way to create value with code is quantity over quality. He needs to sell the idea of something only AI can do, which means he needs the system to be made up of a lot of bad or low-quality code that no person would ever want to be forced to look at.
Wait till you meet engineers other than software engineers. I'm not even sure most software people should be called engineers, since there are no real accredited standards.
I specifically trained as EE in physical electronics because other disciplines at the time seemed really rigid.
There's a saying that you don't want optimists building bridges.
Getting to live by the rules of decency is a privilege now denied us. I can accept that but I don't have to like it or like the people who would abuse my trust for their personal gain.
It is well supported that TFT with a delayed mirroring component, and Generous Tit for Tat (where you sometimes still cooperate after a defection), are pretty successful.
What is written in the Ghostty AI policy lacks any nuance or generosity. It's more like a Grim Trigger strategy than Tit for Tat.
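To make the contrast concrete, here is a minimal sketch of the two strategies being compared, assuming a standard iterated prisoner's dilemma; the function names and the forgiveness rate are illustrative, not from any particular paper:

```typescript
type Move = "C" | "D"; // cooperate / defect

// Generous Tit for Tat: mirror the opponent's last move, but after a
// defection still cooperate with some probability (forgiveness).
function generousTitForTat(opponentLast: Move | null, forgiveness = 0.1): Move {
  if (opponentLast === null || opponentLast === "C") return "C";
  return Math.random() < forgiveness ? "C" : "D";
}

// Grim Trigger: cooperate until the first defection, then defect forever.
// One offense is never forgiven, for the rest of the game.
function grimTrigger(opponentHistory: Move[]): Move {
  return opponentHistory.includes("D") ? "D" : "C";
}
```

The difference in temperament is visible in the code: GTFT's punishment is proportional and decays, while Grim Trigger's single `includes("D")` check is a permanent ban.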
You can't have 1,000,000 abusers and be nuanced and generous to all of them all the time. At some point you either choose to knowingly enable the abuse or you draw a line in the sand, drop the hammer, send a message, whatever you want to call the process of setting boundaries in anger. Getting a hammer dropped on them isn't going to feel fair to the individuals it falls on, but it's also unrealistic to expect that a mob-like group can trample with impunity because of the fear of being rude or unjust to an individual member of that mob.
It is an understanding of these dynamics that led us to our current system of law: punitive justice, but forgiveness through pardons.
I wrote several of TypeScript's initial compilers. We didn't use red/green trees for a few reasons:
• The JS engines of the time were not efficient with that design. This was primarily tested against V8 and Chakra (IE/Edge's prior engine).
• Red/green takes advantage of many things .NET provides to be extremely efficient, for example structs. These are absent in JS, making things much more costly. See the document on red/green trees I wrote here for more detail: https://github.com/dotnet/roslyn/blob/main/docs/compilers/De...
• The problem domains are a bit different. Roslyn's design serves a highly concurrent, multi-threaded feature set that wants to share immutable data. TS/JS, being single-threaded, doesn't have the same concerns, so there's less need to efficiently create an immutable data structure. Making it mutable meant working well with the engines of the time without sacrificing too much.
• The TS parser is incremental, and operates very similarly to what I describe for Roslyn in https://github.com/dotnet/roslyn/blob/main/docs/compilers/De.... However, because it operates on the equivalent of a red tree, it does need to do extra work to update positions and parent pointers.
TL;DR: different engine performance and different consumption patterns pushed us to a different model.
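For readers unfamiliar with the red/green split being discussed, here is a toy sketch of the idea; this is an illustration of the pattern, not the actual Roslyn or TypeScript implementation:

```typescript
// Green nodes: immutable and position-independent, so they can be
// shared freely between tree versions. They store only widths.
interface GreenNode {
  readonly kind: string;
  readonly width: number; // total text width of this subtree
  readonly children: readonly GreenNode[];
}

function green(kind: string, width: number, children: GreenNode[] = []): GreenNode {
  const total = children.length
    ? children.reduce((w, c) => w + c.width, 0)
    : width;
  return { kind, width: total, children };
}

// Red nodes: thin wrappers built lazily on demand, carrying the
// absolute position and parent pointer the green layer omits.
class RedNode {
  constructor(
    readonly green: GreenNode,
    readonly position: number,
    readonly parent: RedNode | null,
  ) {}

  child(i: number): RedNode {
    // A child's absolute position is this node's position plus the
    // widths of the siblings before it.
    let offset = this.position;
    for (let j = 0; j < i; j++) offset += this.green.children[j].width;
    return new RedNode(this.green.children[i], offset, this);
  }
}
```

Because positions are derived rather than stored, editing the source only requires rebuilding the green spine above the change; every unshifted or shifted-but-unchanged green node is reused as-is, and fresh red nodes with correct positions are recreated on demand.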
I ask because I picked up where the TS and Roslyn teams left off. I actually brought red green trees into JS.
My finding is that the historical reasons against this no longer seem to apply today. With a monomorphic code style, JS has something close enough to structs, and multithreading is now essential for performance.
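By "monomorphic code style" I mean the discipline sketched below, under the usual assumption about how engines like V8 optimize: objects constructed with the same fields in the same order share one hidden class, so property loads compile to fixed-offset, struct-like reads. The names here are made up for illustration:

```typescript
interface Span {
  readonly start: number;
  readonly length: number;
}

// Always construct with the same fields in the same order, and never
// add or delete properties afterwards, so every Span has one shape.
function span(start: number, length: number): Span {
  return { start, length };
}

function spanEnd(s: Span): number {
  // With a single shape flowing through this call site, the property
  // loads stay monomorphic and the engine can inline them.
  return s.start + s.length;
}
```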
I don't even think multithreading is the strongest argument for immutability: it's not only parallelization that immutability unlocks, but also safe concurrency, and the ability to hand trusted data to an untrusted plugin without risking corruption.
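The plugin-safety point can be sketched with plain `Object.freeze` (a simplistic stand-in for real immutable data structures; `deepFreeze` and `config` are hypothetical names for illustration):

```typescript
// Recursively freeze an object graph so a consumer can read it but
// cannot mutate it: writes throw in strict mode and are silently
// rejected otherwise, so the host's copy stays intact either way.
function deepFreeze<T>(value: T): T {
  if (value !== null && typeof value === "object") {
    for (const key of Object.keys(value as object)) {
      deepFreeze((value as Record<string, unknown>)[key]);
    }
    Object.freeze(value);
  }
  return value;
}

const config = deepFreeze({ limits: { maxNodes: 1000 } });

// Simulate an untrusted plugin attempting to corrupt shared state:
let corrupted = false;
try {
  (config.limits as { maxNodes: number }).maxNodes = 0;
  corrupted = config.limits.maxNodes === 0;
} catch {
  // mutation rejected
}
```

Real systems would use structurally-shared immutable trees rather than freezing, but the guarantee being bought is the same: data handed out cannot be changed behind your back.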
For that, we have to give consumers control over clients. In the model of the past, the company provides the client, so the client is accountable to the company, not the consumer. Only the web browser has ever come close to changing that, and there aren't many of us left still fighting for third-party clients, even on the web.
I don't think it's fair to compare against people who are already seemingly at the peak of their careers, a place they got to by building skill in coding. And in fact what they have now that's valuable isn't mostly skill but capital: they've built famous software that's widely used.
Also they didn't adopt the your-career-is-ruined-if-you-don't-get-on-board tone that is sickeningly pervasive on LinkedIn. If you believe that advice and give up on being someone who understands code, you sure aren't gonna write Redis or Django.