The situation is incredibly complex, and explaining it in full would need a book. The blog post is clear enough for people who have followed LibreOffice as a project, while everyone else will have to do some research to understand all the details.
>Next: the runtime itself. Bun has a bun build --compile flag that produces a single self-contained executable. No runtime, no node_modules, no source files needed in the container.
I didn't know that. So Bun is basically a whole runtime + framework all in one with little to no deployment headaches?
The bun build creates a large self-contained executable with no size optimisations. Almost like a large Electron build.
Deno also provides the same functionality, but with a smaller optimized binary.
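For concreteness, here is roughly what the two commands look like (a sketch; `index.ts` and the output name are placeholders, and flags are per each tool's current CLI):

```shell
# Bun: bundle source + runtime into one self-contained executable
bun build ./index.ts --compile --outfile server

# Deno: the equivalent single-binary compilation
deno compile --output server ./index.ts
```

Either way you ship one file to the container, with no node_modules or runtime install step.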
I appreciate Bun helping create healthy competition. I feel like Deno often flies under most people's radar: more security options, faster than Node, built on web standards.
Ideally we would still only use JavaScript in the browser. Personally I don't care about the healthy competition, rather that npm actually works when I am stuck writing server-side code I didn't ask for.
FE-BE standardization is efficient in terms of labor and code migration portability, but I really like the idea of static compilation and optimization of the BE in production. There's absolutely no need or reason for the BE to do anything dynamic in prod, as long as it retains profiling inspectability when things go wrong.
That doesn't align with my experience. It feels more like a Trojan horse. Client and server rarely share code (and rarely should), and people who are really good at one discipline aren't that good at the other. Maybe LLMs will change that.
Circa 2015 one of my friends was a Django dev but moved to Express/Node because that's where the cool kids went, it was one less language, and it allowed them to move logic FE->BE and BE->FE much more easily. Also, a bunch of Rails people left for Node/FE JS and Rust (BE). JS/TS is still an irreducible requirement for the FE. There is no law that grand unified frameworks must be used, nor that entirely separate FE and BE must be maintained separately as somehow mysterious, arcane arts. Not sharing code when and where it is possible and appropriate is duplicating effort, like client- and server-side input validation doing exactly the same thing.
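The validation point can be sketched in a few lines. Assuming a hypothetical shared module, the same function runs in the browser for instant feedback and on the server as the real gate (the username rule here is illustrative only):

```typescript
// validation.ts -- hypothetical shared module, imported by both FE and BE.
export function isValidUsername(name: string): boolean {
  // Illustrative rule: 3-16 characters, alphanumeric plus underscore.
  return /^[A-Za-z0-9_]{3,16}$/.test(name);
}

// Client side: immediate feedback before submitting.
//   form.onsubmit = () => isValidUsername(input.value) || showError();

// Server side: never trust the client; re-check with the same function.
//   if (!isValidUsername(req.body.name)) return res.status(400).end();
```

Writing the rule once means the two checks can never drift apart, which is exactly the duplication-of-effort argument.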
Except we have moved beyond that with SaaS products, agents, and AI workflows.
The only reasons I touch JavaScript on the backend instead of .NET, Java, Go, Rust, OCaml, Haskell,.... are SDKs and customers that don't allow any option other than JavaScript all over the place.
Thus my point of view is that I couldn't care less about competition between JavaScript runtimes.
SEA with node.js "works" for nearly arbitrarily general node code -- pretty much anything you can run with node. However you may have to put in substantial extra effort, e.g., using [1], and possibly more work (e.g., copying assets out or using a virtual file system).
>I have 25 Mbps up. 10 Mbps down. Have had it for years. It's fine.
Do you mean the other way around, 25Mbps Down and 10 Mbps up?
It is nice to have, especially when it doesn't cost much. That is why I am perfectly OK with PON rather than dedicated fibre. You only need the 1 or 10Gbps speed for maybe a 10-minute window per month.
I do think 25Mbps on a household basis is quite low. On a 5Mbps video file I want the first 10-second buffer; at 50Mbps that is filled instantly, while I am loading multiple pages in the background. Multiply that by a few more users in the family. It is perfectly usable as you said, if you don't mind waiting.
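The buffering arithmetic above can be checked directly: a 10-second buffer of a 5 Mbps stream is 50 megabits, and the fill time is just buffer size over link speed (idealized single-user arithmetic, ignoring protocol overhead):

```typescript
// Seconds to fill a video buffer: (stream rate * buffer length) / link rate.
// Idealized: one user, no protocol overhead or competing traffic.
function bufferFillSeconds(streamMbps: number, bufferSeconds: number, linkMbps: number): number {
  const bufferMegabits = streamMbps * bufferSeconds; // 5 Mbps * 10 s = 50 Mb
  return bufferMegabits / linkMbps;
}

console.log(bufferFillSeconds(5, 10, 25)); // 25 Mbps link: 2 s
console.log(bufferFillSeconds(5, 10, 50)); // 50 Mbps link: 1 s
```

At 25 Mbps the buffer takes 2 seconds for a single viewer, and any concurrent page loads or additional household members stretch that further.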
Otherwise I think 50 - 100Mbps per person is generally the point where we see diminishing returns.
Yes, I reversed the up/down bandwidths as you noticed, but didn't see the mistake until I could no longer edit.
> Otherwise I think 50 - 100Mbps per person is generally the point where we see diminishing returns.
Right. Whether we think the diminishing returns are at 10 or 20 or 50 or 100 Mbps per user, there are diminishing returns.
The vast, vast majority of residences simply do not need symmetric 25 Gbps bandwidth, and it would be a massive waste of resources to try to build out a residential network providing that level of bandwidth, rather than prioritizing universal accessibility of 50 or 100 Mbps.
I'd liken it to the overprovisioning of EV batteries, particularly in North America. Many, many car owners would be perfectly satisfied with a car with a range of only (say) 60 miles or 100 km, and overall EV cost and adoption rate are hurt by the fact that leading-edge manufacturers, especially Tesla, were only building EVs with 5x that range.
Even among wireless/mobile carriers there are companies like American Tower / China Tower that share infrastructure costs. So none of this is new; I always thought the reason it is not done is company interests and politics. Internet should be treated just like electricity and water.
There are other things we could do without completely changing the dynamics or policy. We could mandate that all home leasing and selling have the Internet speed labeled, giving consumers knowledge and choice. And all future new homes should have at least 1Gbps Internet, upgradable to 10Gbps or higher, as standard. The market will sort itself out, and this gives the government some room to further negotiate terms with companies.
Now the technical questions: why no sharing? Why point-to-point? Why 4 fibres and not 2 or 8? The "no sharing" is a bit of a gimmick, because at the end of the day everything is shared: the backbone has 100Gbps and you can't have 10 neighbours all using 25Gbps. I also don't think P2P makes sense in a metropolitan city like Tokyo, New York or Hong Kong, especially in high-rise, ultra-high-density buildings with limited space. When 50G-PON barely meets demand we are looking at 100G or 200G-PON. Individual fibre is simply not feasible in those settings.
>Now that the AV2 specification is publicly available
A draft of the AV2 spec. Not final. I think they just tagged the AVM 14 release from their research branch. But personally it feels nowhere near final/finished status.
>Trying to milk the last drop before the patents expire?
Old licenses are grandfathered in at previous pricing. So this isn't about milking, but likely a tactic specifically aimed at certain companies. But I wonder why they bother to do this at this stage of the game.
I am hoping we could further innovate on top of H.264 to have a better patent free video codec.
In relation to the quality of the comment: I thought it was fair. He just completely made up the part about false positives.
And in case people don't know, antirez has been complaining about the quality of HN comments for at least a year, especially after AI topics took over HN.
It is still better than Lobsters or other places, though.