
His last film was Spinal Tap II. I think if you could tell him that Spinal Tap would bookend his life, he'd be tickled by that.

The second installment isn't good... But he has more than enough decent work to be remembered by.

You have two applications you have to ensure are always in sync and consistent.

No, the point of the API is to loosely couple the frontend and backend with a contract. The frontend doesn't need to model the backend, and the backend doesn't need to know what's happening on the frontend, it just needs to respect the API output. Changes/additions in the API are handled by API versioning, allowing overlap between old and new.
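For illustration, a minimal sketch of what such a contract might look like in TypeScript (the types and endpoint here are hypothetical, not from any particular codebase):

    // Hypothetical versioned contract: the frontend codes against these
    // types, not against the backend's internal schema.
    interface OrderSummaryV1 {
      id: string;
      total: number;  // however the backend computes this is its business
      status: "open" | "shipped" | "cancelled";
    }

    // /api/v2 can reshape freely while /api/v1 keeps serving old clients.
    async function fetchOrder(id: string): Promise<OrderSummaryV1> {
      const res = await fetch(`/api/v1/orders/${id}`);
      if (!res.ok) throw new Error(`API error: ${res.status}`);
      return res.json() as Promise<OrderSummaryV1>;
    }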

Code is duplicated.

Not if the frontend isn't trying to model the internals of the backend.

Velocity decreases because in order to implement almost anything, you need buy-in from the backend AND frontend team(s).

Velocity increases because frontend works to a stable API, and backend doesn't need to co-ordinate changes that don't affect the API output. Also, changes involving both don't require simultaneous co-ordinated release: once the PM has approved a change, the backend implements, releases non-breaking API changes, and then frontend goes on its way.
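To make "non-breaking" concrete, a hypothetical example building on the sketch above: the backend adds an optional field under the same version, old frontend code ignores it, and the new frontend picks it up whenever it ships.

    // Additive change to the same v1 type: optional fields don't break
    // existing callers, so the two releases don't need to be coordinated.
    interface OrderSummaryV1 {
      id: string;
      total: number;
      status: "open" | "shipped" | "cancelled";
      estimatedDelivery?: string;  // new and optional; old clients unaffected
    }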


> No, the point of the API is to loosely couple the frontend and backend with a contract. The frontend doesn't need to model the backend, and the backend doesn't need to know what's happening on the frontend, it just needs to respect the API output. Changes/additions in the API are handled by API versioning, allowing overlap between old and new.

This is the idea, an idea which can never be fully realized.

The backend MUST understand what the frontend sees to some degree, for the sake of efficiency, performance, and user experience.

If we build the perfect RESTful API, where each object is an endpoint and their relationships are modeled by URLs, we have almost realized this vision. But it came at the cost of our server catching on fire. It trashed our user experience. Our application sucks ass; it's almost unusable. Things show up on the frontend but they're ghosts, everything takes forever to load, every button is a liar, and the quality of our application has reached new depths of hell.
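The failure mode being described is the familiar request waterfall. A rough sketch, with hypothetical endpoints, of what "one object per endpoint" does to a single screen:

    // Hypothetical sketch: one resource per endpoint means one screen
    // becomes a chain of dependent round trips.
    async function loadOrderScreen(id: string) {
      const order = await fetch(`/api/orders/${id}`).then(r => r.json());
      const customer = await fetch(order.links.customer).then(r => r.json());
      // ...and then one request *per line item*:
      const items = await Promise.all(
        order.links.items.map((url: string) => fetch(url).then(r => r.json()))
      );
      return { order, customer, items };  // N+2 requests, mostly sequential
    }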

And even then we haven't fully realized the vision. What about authentication? User access? Routing?

> Not if the frontend isn't trying to model the internals of the backend.

The frontend does not get a choice, because the model is the model. When you go against the grain of the model and you say "everything is abstract", then you open yourself up to the worst bugs imaginable.

No - things are linked, things are coupled. When we just pretend they are not, we haven't done anything but obscure the points where failure can happen.

> Velocity increases because frontend works to a stable API, and backend doesn't need to co-ordinate changes that don't affect the API output. Also, changes involving both don't require simultaneous co-ordinated release: once the PM has approved a change, the backend implements, releases non-breaking API changes, and then frontend goes on its way.

No, this is a stark decrease in velocity.

When I need to display a new form that, say, coordinates 10 database tables in a complex way, I can just do that if the application is SSR or Livewire-type. I can just do that. I don't need to wait 3 months for the backend team to implement it before I can build the form. I also don't need to wrangle together 15+ APIs and then recreate a database engine in JS to do it.
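For contrast, a rough sketch of what "I can just do that" looks like server-side (hypothetical Express + Postgres, all table and route names invented): one handler, one join, render, done.

    import express from "express";
    import { Pool } from "pg";

    const app = express();
    const db = new Pool();  // connection config assumed elsewhere

    // Hypothetical SSR route: query exactly what this one form needs,
    // joining across tables directly, and render the result. No API
    // contract to negotiate, no client-side re-joining of 15 responses.
    app.get("/orders/:id/edit", async (req, res) => {
      const { rows } = await db.query(
        `SELECT o.id, c.name, a.street, i.sku, i.qty
           FROM orders o
           JOIN customers c ON c.id = o.customer_id
           JOIN addresses a ON a.id = c.address_id
           JOIN line_items i ON i.order_id = o.id
          WHERE o.id = $1`,
        [req.params.id]
      );
      res.render("order-form", { rows });  // assumes a view engine is set up
    });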

Realistically, those are your two options. Either you have a performant backend API interface full of one-off implementations, what we might consider spaghetti, or you have a "clean" RESTful API that falls apart as soon as you even try to go against the grain of the data model.

There are, of course, in-betweens. RPC is a great example. We don't model data, we model operations. Maybe we have a "generateForm" method on the backend and the frontend just uses this. You might notice this looks a lot like SSR with extra steps...
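In code, that RPC flavor might look something like this ("generateForm" is the comment's own hypothetical method; the transport details are invented):

    // RPC style: model the operation, not the data. One call returns
    // everything the form needs, already shaped for display.
    async function generateForm(orderId: string) {
      const res = await fetch("/rpc/generateForm", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ orderId }),
      });
      if (!res.ok) throw new Error(`RPC error: ${res.status}`);
      return res.json();  // fields, labels, options: ready to render
    }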

But this all assumes the form is generated and then done. What if the data is changing? Maybe it's not a form, maybe it's a node editor? SSR will fall apart here, and so will the cleanly separated frontend/backend. It will be so hellish, so evil, so convoluted.

Bear in mind, this is something truly trivial for desktop applications to do. The models of modern web apps just can't do this in a scalable or reliable way. But decades-old technology like COM, D-Bus, and X can. We need to look at what the difference is and decide how we can utilize it.


Fuel oil deliveries to smaller communities that don't buy in tanker quantities. Those boats are basically the U-Hauls of the sea.

I keep thinking about how, 20 years ago, 3D printers became accessible in tech and price, and there was a lot of talk about how we'd all just print what we need (pretty much like in Stephenson's The Diamond Age), and how you'd get rich selling digital patterns for material goods.

Instead, we got the elevation of the handmade, the verifiably human created, typified by the rise of Etsy. The last 20 years have been a boom time for artists and craftspeople.

I keep seeing AI slop and thinking that all this will do is make verifiably human created content more valuable by comparison, while generative AI content will seem lowbrow and not worth the cost to make it.


Sturgeon's law: 90% of everything is crap. Ergo we shouldn't be surprised if AI can replace 90% of supposed art and media in the long run. Not sure I will particularly notice since I can't stand most media already.

But good luck with that remaining 10%, AI...

I think the reason 3D printers can't print everything is that most things are mixed media, and 3D printers aren't so great at that yet. There are also issues with topology and the structural quality of 3D-printed things compared to things made in a mold. And that's not entirely unlike people (who I once would have thought would know better) oversimplifying the engineering and scientific challenges of making AI human-equivalent or better.


Creative people are on the whole a lot better at keeping AI slop out of art and craft fairs than they have been with the lazier output of 3D printers, CNC, laser engravers and off-the-shelf resin mould art.

It is as if the relatively unspoken feelings about the downsides of technologies as a gateway to art have been rapidly refined to deal with AI (and of course, even the CNC and laser engraver people have common cause).

But I think it is fair to say that if they see success, there will be a growing pushback against the use of 3D printers, eufyMake resin printing and CNC in a niche where hand tools used to be the norm.

And speaking even as someone who has niche product ideas that will be entirely 3D printed/CNC cut/engraved, I don't really disagree with it. I am mostly not that kind of creative person (putting aside experimental photography techniques) and I see no reason why they shouldn't push back.

The reality is that "craft" fairs are an odd mix of people who spent a lot of time refining their art and selling things that are the work of hours of expressive creativity and effort, and table upon table of glittery resin mould art and vinyl cut stencil output stuck on off-the-shelf products. I think AI might help people refine their feelings about this stuff they once felt bad/incorrect/unkind about excluding.

It's a bit like the way the art photography market is rediscovering things like carbon printing, photosensitive etching and experimental cyanotype, and getting a lot more choosy about inkjet-printed DSLR output.


There's nothing wrong with technical debt per se. As with all debt, the problem is incurring it without a plan or means to pay it off. Debt-based financing is the engine of modern capitalism.

Gradual growth to large scale implies an ongoing refactoring cost--that's the price of paying off the technical debt that got you started and built initial success in small scale rollouts. As long as you keep "servicing" your debt (which can include throwing away an earlier chunk and building a more scalable replacement with the lessons learned), you're doing fine.

The magic words here for management/product owners are: "we built it that way the first time because it got us running quickly and taught us what we need to know to build the scalable version. If we'd tried to go for the scalable version first, we wouldn't have known foo, bar and baz, and we'd have failed and wouldn't have learned anything."


My wife was a teacher and sexual health educator for most of her career (grades 8-12).

When I was getting sex ed, part of the teacher's responsibility was grounding us in basic facts to dispel word-of-mouth myths that were patently absurd to anyone with any experience (like "sneezing after sex prevents pregnancy").

My wife's task was to explain that the hardcore porn they'd all seen was unrealistic in the same way that action movies completely misrepresent fights and stunts, and that the real world doesn't work that way. Her problem was that she was arguing against video evidence that it could. The kids aren't unhinged, but they're definitely misled in a completely different way than we were.


My ADHD became screamingly obvious when I quit smoking, about a year after seriously reducing my coffee drinking from a couple pots a day to a cup or two.


Peter Navarro is also a driving force behind the tariffs.


While US weapons aid has basically been cut off, then somewhat restored through European purchases, US intel sharing has been relatively consistent and continuous throughout, and Ukraine is very dependent on it. When intel sharing was suspended for several weeks, Ukraine lost almost half the ground it had taken in Kursk. At a minimum, satellite intel is key to monitoring Russian dispositions, and Ukraine has no way to replace that.


The US also authorized the use of its own ballistic missiles in Russia proper this past week, which was a big deal.

They also have another $1B budgeted in defense spending for Ukraine next year https://www.reuters.com/world/us/us-senate-committee-backs-m...


I agree, and the blame for its weirdness can be laid directly at Larry Wall's feet, because Wall wanted a language that allowed for cleverness, surprise, and ingenuity. He was never happier than when someone came up with a completely new way to do something. For Wall, programming was less about coding an outcome than it was about speaking a particular language (and ideally, writing poetry in it). And it was very successful in this way, and fit reasonably well with the high-knowledge users/environment of unix in the 90s.

It's just that Wall's vision was incompatible with a general-purpose language used widely by people with a wide range of knowledge and skill, and as unix/linux opened up to that wider range, better general-purpose alternatives were chosen. Having to learn to be a poet to be a good coder was too high a barrier.


> Wall wanted a language that allowed for cleverness, surprise, and ingenuity. [...] Having to learn to be a poet to be a good coder was too high a barrier.

To me this just sounds, umm, pathologically eclectic.


Now extrapolate to "let's do a Perl 6 that allows us to do all the things I couldn't work into Perl 5" and a lot more history makes sense.


But I bet you could really list some rubbish with it…

