Insert here the New Yorker cartoon about the shabby business executive around a campfire with a bunch of kids crowing "Yes, the planet got destroyed, but for a beautiful moment in time we created a lot of value for shareholders."
I can generally re-find my place in books, but years ago I acquired a stack of orange punch cards from a university library that they were giving away as scrap paper. These make great bookmarks and also interesting historical conversation pieces if someone notices/recognizes them.
I think the previous use for the punch cards was to have one for each book and scan them at checkout/check-in (maybe this predated barcodes?).
Oxc is not a JavaScript runtime environment; it's a collection of build tools for JavaScript. The tools output JavaScript code, not native binaries. You separately need a runtime environment like Deno (or a browser, depending on what kind of code it is) to actually run that code.
Deno is a runtime with a native implementation of a standard library; it doesn't have a language implementation of its own, it just bundles V8 (the engine from Chrome).
This is a set of linting tools and a type stripper, a program that removes the type annotations from TypeScript to turn it into pure JavaScript (and turns JSX into document.whateverMakeElement calls). It still doesn't have anything to actually run the program.
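To make the type-stripping idea concrete, here is a minimal sketch (the names here are mine, not Oxc's actual API): the annotations below exist only for tooling, and erasing them leaves plain JavaScript with identical behavior.

```typescript
// TypeScript input: the ": string" annotations and the interface are
// erased by a type stripper, since they have no runtime meaning.
interface Greeting {
  message: string;
}

function greet(name: string): Greeting {
  return { message: `Hello, ${name}!` };
}

// After stripping, all that remains is roughly:
//   function greet(name) { return { message: `Hello, ${name}!` }; }
console.log(greet("world").message);
```

Note that this is pure erasure, not compilation: the stripper never type-checks anything, which is part of why it can be so fast.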
I'm going to call it: a Rust implementation of JavaScript runtime (and TypeScript compiler) will eventually overtake the official TypeScript compiler now being rewritten in Go.
Nothing, but it will happen anyway. Maybe improved memory safety and security, or at least a plausible excuse to get funding for it. Perhaps also improved developer enthusiasm, since people seem to enjoy the newness of Rust over working with an existing C++ codebase. There are probably many actual advantages to "rewrite it in Rust". I'm neither for nor against it; I'm just observing that the cultural trend seems to be moving that way.
If you want native binaries from typescript, check my project: https://tsonic.org/
Currently it uses .NET and NativeAOT, but I'm adding support for the Rust backend/ecosystem over the next couple of months. TypeScript for GPU kernels, soon. :)
No, it is a suite of tools to handle TypeScript (and JavaScript as its subset). So far it's a parser, a tool to strip TypeScript declarations and produce JS (like SWC), a linter, and a set of code transformation tools/interfaces, as far as I can tell.
Going a bit meta - this blog seems strange, as its only other story criticizes a member of the Go community. The OP has posted this story twice (the first time it was flagged) and has no other comments on HN.
There may also be a downvote brigade in this comment section.
I think this must be a bit. On the one hand you have this story about Bernstein, someone who has made a pastime out of weaponizing process in consensus organizations to drag progress to a halt when he's failed to coerce his preferred outcome; on the other hand you have a story villainizing Filippo Valsorda for not doing that, and avoiding standards organizations altogether.
I first encountered djb's work back in the 90's with qmail and djbdns, where he took a very different, compartmentalized approach compared to the more common monolithic tooling for running email and DNS. I'd even opine that the structure of these programs is a direct ancestor of modern microservice architectures, except using unix stdio and other unix isolation mechanisms.
He's definitely opinionated, and I can understand people being annoyed with someone who is vociferous in their disagreement and questioning the motives of others, but given the occasional bad faith and subversion we see by large organizations in the cryptography space, it's nice to have someone hypervigilant in that area.
I generally think that if djb thinks something is OK in terms of cryptography, it's passed a very high analytical bar.
The main problem with technology coverage is you have one of 3 types of writers in the space:
1. Prosumer/enthusiasts who are somewhat technical, but mostly excitement
2. People who have professional level skills and also enjoy writing about it
3. Companies who write things because they sell things
A lot of sites are in category 1 - mostly excitement/enthusiasm, and feels.
Anandtech, TechReport, and to some extent Ars Technica (especially John Siracusa's OS X reviews) are the rare category 2.
Category 3 are things like the Puget Systems blog, where they benchmark hardware but also sell it, and it functions more as buyer information.
The problem with category 2 is that they can fairly easily get jobs in industry that pay way more than writing for a website does. I'd imagine that when Anand joined Apple, this was likely the case, and if so that makes total sense.
When Andrei Frumusanu left Anandtech for Qualcomm, I'm sure he was paid much more for engineering chips than he was for writing about them, but his insight into the various core designs released for desktops and mobile was head and shoulders above anything I've seen since.
It's a shame that I can't even find a publication that runs and publishes the SPEC benchmarks on new core designs now that he is gone, despite SPEC having been the gold standard of performance comparison between dissimilar cores for decades.
There are still places that benchmark, but mostly for commercial apps, like Puget Systems in the earlier post. Phoronix can also be useful for benchmarking open source stuff.
I wouldn't put much trust in well-known benchmark suites: in many cases (proprietary compilers especially), a huge amount of effort has gone into Goodhart's-law optimizing for the exact needs of the benchmark.
> The problem with category 2 is that they can fairly easily get jobs in industry that pay way more than writing for a website
This is true, but those jobs are much worse than writing jobs. So it comes down to how much you value money and what it buys. Most people earning "way more" are spending "way more" to try to pay back the soul debt the job takes away. When you dig deep, it's not "way more" utility.