I know why TypeScript "succeeded", but I always wonder what kind of web we would have if Haxe had in fact become more popular for the web in the early days. My guesstimate is we would have had bundlers in native code much, much earlier, and generally much faster and more robust tooling. It's only now we see projects like esbuild, TS itself being rewritten in a compiled language (Go), and other efforts in Rust.
Also, it would have been interesting to see what influence Haxe would have had on JavaScript itself.
That's true, but it comes with a cost. TS has become an incredibly complex language, even though it only provides types. That being said, it will always lack features, as it is only "JavaScript".
Haxe has a more robust but simpler type system, inherited from OCaml.
Haxe also compiles to C++, so making native tools would have been a breeze.
TS sits in the top chair, but there are many "better" options out there, like Elm (even if it's kind of a dead language) and ReScript/ReasonML, etc.
TS is good, but it will never be a perfect language for JavaScript. It will always be a mediocre option, one that grows more and more complex with every new release.
Yes, an amazing language - I recall having a look at it in 2013 when I was scrambling for a replacement for Flash (also an amazing platform). Shame it doesn't solve the problem at hand.
Hardly anyone cares that TypeScript isn't perfect when they can migrate their (or anyone else's) terrible but business-critical JS library to TS and continue development without skipping a beat (a sketch of what that looks like is below).
For the same reason ReasonML took years to overtake fartscroll.js in the number of stars on GitHub.
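To make "without skipping a beat" concrete, here's a minimal sketch of the usual incremental route, assuming a hypothetical legacy file (none of these names are from the thread): rename a `.js` file to `.ts`, lean on loose types to keep everything compiling, and tighten later.

```typescript
// utils.ts, formerly utils.js: a hypothetical legacy function, renamed as-is.
// `any` keeps every existing call site compiling on day one.
export function total(items: any[]): number {
  return items.reduce((sum, item) => sum + item.price, 0);
}

// Later, the types are tightened without touching any callers:
interface LineItem {
  price: number;
}

export function totalTyped(items: LineItem[]): number {
  return items.reduce((sum, item) => sum + item.price, 0);
}
```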
A huge part of TS's complexity is there so that library authors can describe some exotic behaviours seen normally only in dynamically-typed languages. When you're writing an app you don't need the vast majority of those features. Personally I regret every usage of `infer` in application code.
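For instance, here's a minimal sketch of the kind of `infer`-based conditional type that complexity exists to support (the type names are illustrative, not from any particular library):

```typescript
// `infer` lets library types deconstruct other types, e.g. to recover
// an array's element type or a function's return type.
type ElementOf<T> = T extends (infer U)[] ? U : never;
type Returns<T> = T extends (...args: any[]) => infer R ? R : never;

type A = ElementOf<string[]>;   // string
type B = Returns<() => number>; // number

// In application code, a plain annotation usually says the same thing
// with far less machinery:
const first: string | undefined = ["a", "b"][0];
```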
> For the same reason ReasonML took years to overtake fartscroll.js in the number of stars on GitHub.
Wow, that's a fascinating statistic! Certainly puts the popularity delta into a different light.
On a separate note, the fartscroll.js demo page[0] no longer works since code.onion.com is no longer accessible. Truly disappointing. Luckily their release zip contains an example.html!
This is pretty cool. The only issue (a huge one imho) is the fact that the MacBook screen does "not go all the way", meaning you can't use it as you normally would to draw or write (from a 90-degree angle).
I feel it depends on whether you inspect and edit the code as part of the workflow, or just test what the AI produced and give feedback without participating in the coding yourself.
Most of the slop I witness is the latter. This is evident in huge multi-10K-line pull requests. The code is just an artifact, while the prompting is the "new" coding.
I almost always try to cap my API results at 10,001 items, so it's known whether there's "more", depending on the interface(s) used. And when there are more, there are usually other UX pattern problems that I'm just kicking down the road at that point.
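For what it's worth, a minimal sketch of that "limit plus one" pattern (the `fetchPage` helper and its names are hypothetical, not from the thread):

```typescript
// Hypothetical sketch: ask the backend for one more row than the page size,
// so the presence of an extra row signals that more results exist.
const PAGE_SIZE = 10_000;

interface Page<T> {
  items: T[];
  hasMore: boolean;
}

async function fetchPage<T>(query: (limit: number) => Promise<T[]>): Promise<Page<T>> {
  const rows = await query(PAGE_SIZE + 1); // cap at 10,001
  const hasMore = rows.length > PAGE_SIZE;
  return { items: hasMore ? rows.slice(0, PAGE_SIZE) : rows, hasMore };
}
```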
Even then, that's far less than an amount one needs to be concerned about, for "performance".
Other community members are flagging many of your posts and emailing us to complain because you're repeatedly breaking the guidelines. Whether or not I recognize a decade-old meme is irrelevant. The guidelines have been in place for most of HN's history and apply equally to everyone.
Low-quality comments and projects are unwanted on HN whether they're generated by LLMs or humans. If your reaction to seeing bad stuff is to post more bad stuff, you're just making the problem worse. Anyone who is earnest about wanting to reduce low-quality content on HN can do so by flagging bad posts and emailing us (hn@ycombinator.com) so we can investigate further. Several users do that routinely.
HN is only a place where people want to participate because others make an effort to raise the standards rather than drag them down. Please do your part to make HN better, not worse.
I can assure you I'm not using LLMs to generate any post, comment, or reply. But 7/10 posts here on HN are in fact just that: endless AI-generated posts, or links to some prompted mess of a project. None of the above are moderated and/or removed.
I'm not aware of the flagging system, but it doesn't seem to be working, because the amount of slop increases each month, and that's a big shame, as HN is one of the last "oldschool" forums we have left.
This kind of sermonizing and hyperbole is no better than LLM-generated content, as it takes no more effort to produce and contributes no more of value to conversations.
The Internet has always been an arms race between producers of generic dreck and those who put effort into crafting original thoughts and prose. Yes, it's changing, and that poses new challenges for us, which we're developing ways of addressing.
If you’re not going to bother to use the tools we’ve always provided to enable the community to alert us to low-quality content, and you’re not going to bother conversing curiously as has always been core to the site’s ethos, you’re not in a position to white-knight on this issue.
This site is only a place you think needs defending as one of the last "oldschool" forums we have left because others do their part to help.
We will keep working to raise the standards of content on HN. You need to decide if you want your role to be one of making things worse or better.
Let’s not forget, this whole exchange started when you introduced foul rage bait from a 15-year-old meme, so you’re not starting on solid ground here.
The web's downfall started with AI. Soon everything will be AI-generated: text posts, code shared with "look what i made, its cOoL", videos, podcasts, etc. The human touch will be gone, and new models will then be trained on AI-generated content, making the feedback loop worse and worse.
It is time for a new web. A new standard, a new everything. A new start without the AI bloat. Either something like this will emerge, or we will lose the web we have.
Yeah, I tend to agree, just trying to check my assumptions. There was always some newfangled thing killing the web every other time too.
I would also say that those pronouncements of doom were not necessarily wrong, but that it's been more of a long decline than a fall. I think my expectation is for that to continue.
Not an Odin user, but iirc Odin also has Go-like zero values. There is no practical option unless you have null. A string, for example, can't be null; it's "at least" an empty string. But what's the base value for a pointer? A function? An interface? Either you wrap (OCaml style) or use null. It's pragmatism vs. correctness, a balance as old as computing.
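As a rough illustration of that wrap-vs-null choice (a TypeScript analogy, not Odin or Go; the `Option` type here is hypothetical):

```typescript
// Option 1: null as the "base value" for a reference-like type.
let handler: ((x: number) => void) | null = null;

// Option 2: wrap it, OCaml-style, so "absent" is an explicit case.
type Option<T> = { kind: "some"; value: T } | { kind: "none" };

function runHandler(opt: Option<(x: number) => void>): void {
  if (opt.kind === "some") {
    opt.value(42); // the type system forces us to handle "none" first
  }
}
```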
Odin's type system is just different from that of many other languages, and trying to compare it to others doesn't always work out very well.
`Maybe` does exist in Odin. So if you want a `nil` string, either use `Maybe(string)` or `cstring` (which has both `nil`, since it is a pointer, and `""`, the empty string, a non-nil pointer). Also, Odin doesn't have "interfaces", since it's a strictly imperative procedural language.
As for your question about base values, I am not sure what you mean by this. Odin hasn't got an "everything is a pointer internally" approach like many GC'd languages. Odin follows in the C tradition, so the "base value" is just whatever the type is.
LOL, no. 90% of the AI code I see is really bad. The poster then snakes around the first commenter who asks "how much of the code was generated by AI?"
Replies vary from silence to "I checked all the code" or "AI code is better than human code" or even "AI was not used at all", even when it is obvious it was 100% AI.