Hacker News | 627467's comments

I bet if we studied the rate of "mind changing" over time since phones got smarter, we'd see a correlation. Same for the ability/willingness to commit to anything or anyone.

An example of how European "tech" reacts to threats: two European open source projects in litigation with each other, one of which engineered a license to prevent an obvious feature of open source software (forking), while the other throws shade at the first over opacity and geopolitical control.

> I was made redundant recently "due to AI" (questionable) and it feels like my works in some way contributed to my redundancy where my works contributed to the profits made by these AI megacorps while I am left a victim.

I think anyone here can understand and even share that feeling. And I agree with your "questionable": it's just the lame HR excuse du jour.

My 2c:

- AI megacorps aren't the only ones gaining; we all are. The leverage you have to build and ship today is higher than it was five years ago.

- It feels like megacorps own the keys right now, but that's temporary. In a world of autonomous agents and open-weight models, control is decentralized. Inference costs continue to drop, and you don't need to be running on megacorp stacks. Millions (billions?) of agents finding and sharing among themselves. How will megacorps stop that?

- I see the advent of LLMs like the spread of literacy. Scribes once held a monopoly on the written word, which felt like a "loss" to them when reading/writing became universal. But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."


> AI megacorps aren't the only ones gaining, we all are.

No, no we are not.

> the leverage you have to build and ship today is higher than it was five years ago.

I don’t want more “leverage to build and ship”, I want to live in a world where people aren’t so disconnected from reality and so lonely they have romantic relationships with a chat window; where they don’t turn off their brains and accept any wrong information because it comes from a machine; where propaganda, mass manipulation, and surveillance aren’t at the ready hands of any two-bit despot; where people aren’t so myopic that they only look at their own belly button and use case for a tool that they are incapable of recognising all the societal harms around them.

> We aren't losing code; we are making the ability to code a universal human "literacy."

No, no we are not. What we are doing, however, is making increasingly bad comparisons.

Literacy implies understanding. To be able to read and write, you need to be able to understand how to do both. LLMs just spit text which you don’t need to understand at all, and increasingly people are not even caring to try to understand it. LLM generated code in the hands of someone who doesn’t read it is the opposite of literacy.


>I don’t want more “leverage to build and ship”, I want to live in a world where people aren’t so disconnected from reality and so lonely they have romantic relationships with a chat window; where they don’t turn off their brains and accept any wrong information because it comes from a machine; where propaganda, mass manipulation, and surveillance aren’t at the ready hands of any two-bit despot; where people aren’t so myopic that they only look at their own belly button and use case for a tool that they are incapable of recognising all the societal harms around them.

Preach. Every time I read people doing this weird LARP on this website of "you have so much more leverage, great time to be a founder" I want to put my head through the drywall.


> literacy implies understanding

Agree. Do we not understand how LLMs work? Some of us understand better than others, just like literacy is also not guaranteed just because you learned the alphabet.

Accepting the output of an LLM is really materially not different from accepting books, newspapers, opinion makers, academics at face value. Maybe different only in speed of access?

> LLM generated code in the hands of someone who doesn’t read it is the opposite of literacy.

"A pop-sci article title or paper abstract/conclusion in the mind of someone who doesn't read is the opposite of literacy."


I’m not sure I understand your point. Mind clarifying? It seems you might be trying to contradict what I said but are in fact only adding to it.

> just like literacy is also not guaranteed just because you learned the alphabet.

I didn’t claim learning the alphabet equals literacy, you did. Your argument comes down to “you’re not literate if you’re not literate”. Which, yes, of course.

> Accepting the output of an LLM is really materially not different from (…)

Multiple things can be true at once. If someone says "angry stupid people with machine guns are dangerous", responding "angry stupid people with explosives are dangerous" does nothing to the original point. The angry stupid people are part of the problem, sure, but so are the tools enabling them to be dangerous. If poison is being dumped in a river and slowly killing the ecosystem, and then someone else comes along wanting to dump even more of a different poison, the correct response is to stop both, not shrug and stop neither.


> I am left a victim

> I want to live in a world where people aren’t so disconnected from reality

It looks like you are the problem, not the world. Hope you find happiness!


What the bloody heck are you on about? That first quote is completely fabricated. I’d also like to live in a world where people don’t argue in bad faith, but since I have no pretence that will happen, at least I’m thankful when bad faith actors do such a poor job of concealing it.

But LLMs can also explain code, in fact they're fantastic at that. They can also be used to build anti-censorship, surveillance-avoidance and fact-checking tools. We are all empowered by them, it's just up to us to employ them so as to nudge society towards where we'd like it to go. Instead of giving up prematurely.

I’m not sure if the analogy is yours, but the scribe note really struck a chord with me.

I’m not a professionally trained SWE (I’m a scientist who does engineering work). LLMs have really accelerated my ability to build, ideate, and understand systems in a way that I could only loosely gain from sometimes grumpy but mostly kind senior engineers in overcrowded chat rooms.

The legality of all of this is dubious, though, per the parent. I GPL licensed my FOSS scientific software because I wanted it to help advance biomedical research. Not because I wanted it to help a big corp get rich.

But then again, maybe code like mine is what is holding these models back lol.


Sharing for advancing humanity / benefit of society, and megacorps getting rich off it, is not either-or. On the contrary, megacorps are in part how the benefit to society materializes. After all, it's megacorps that make and distribute the equipment and the software stacks I am using to write code on, that you are using to do your research on, etc.

I find the whole line of thinking, "I won't share my stuff because then a megacorp may use it without paying me the fractional picobuck I'm entitled to", to be a strong case of the Dog in the Manger mindset. And I held that view even before LLMs exploded, back when people were wringing their hands about Elasticsearch being used by Amazon, back in 2021 or so.

Sharing is sharing. One can't say "oh I'm sharing this for anyone to benefit", and then upon seeing someone using it to make money, say "oh but not like that!!". Or rather, one can say it, but then they're just lying about having shared the thing. "OSS but not for megacorps/aicorps" is just proprietary software. Which is a perfectly fine thing to work on; what's not fine is lying about it being open.


> "OSS but not for megacorps/aicorps" is just proprietary software

Why? It's not like it's binary. It could well be open source but unusable by a company over size X. I'm not a lawyer, but why couldn't a license have that clause? I would still class that as open, for some definition of open.


LLMs are one thing, but when you bring up the ES on AWS example, as outlined in the article, the problem is not the software being used; it's the software being _made proprietary_. It's about free and open software remaining free and open, especially to the end user.

> On the contrary, megacorps are in part how the benefit to society materializes.

That would be true if they were the product of a genuine competitive market.

In fact their strength is in eliminating competition, erecting barriers to entry, manipulating regulation, and maintaining the status quo.

> "OSS but not for megacorps/aicorps"

Who is advocating that? People just want everyone to stick to the terms of the licences.


> We aren't losing code; we are making the ability to code a universal human "literacy."

The same way that DoorDash makes kitchen skills universal.


You say it like it's a bad thing.

I say that like it's a thing. LLMs have the goal of replacing intellectual work with passive consumption. People seem to like that.

Basically, the selling point of LLMs is that you no longer need to think about problems, you can skip directly to results. Anything that you have to think about while using them today is somewhere on the product roadmap, or will be.

Many people think this is a form of utopia.


Just like computer is no longer a job description, yes.

No, they are saying it like the comparison doesn’t hold. Which it doesn’t.

> It feels like megacorps own the keys right now, but that's temporary.

Remains to be seen. Hardware prices are increasing. Manufacturers are abandoning the consumer sector to serve the all consuming AI demands. Not to mention the constant attempts to lock down the computers so that we don't own them.

What does the future hold for us? Unknown. It's not looking too good though. What good is hardware if we're priced out? What good are open models and free software if we're unable to run them?


The trend I see is older hardware being able to run models that are increasingly miniaturized.

The real (but not new) danger is us giving in to the idea that we can't do it ourselves, or that we must use megacorps' latest shiny toy in order to "succeed".


Welcome to late capitalism. Please enjoy the ride while people tell you that LLMs are the only future (you have no future), while SOTA models can barely do shit on their own consistently outside of carefully designed benchmarks and have to be made available at a loss, otherwise no one would use them.

On your right you can see the CEOs justifying longer hours and lower pay because AI will replace your job one day anyway, and then asking why you aren't 10x more productive with Claude. On your left you can see the AI companies deciding who will be in charge of the fascist regime once they no longer need workers other than for the coal mines. They reckon they can get 120 good years before the biosphere is uninhabitable, which worries them because what if the next LLM figures out immortality for them; maybe they will have to close the coal mines too after all.


Can't say I disagree with you. I do recognize that we seem to be heading towards a technofeudalist cyberpunk dystopia. The only way out for humanity is to automate everything to the point we transcend capitalism into a post-scarcity society where the very concept of an economy has been abolished. If we can't do that, we'll become soylent.

>But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."

Literacy requires training, though. Being able to voice a text aloud, understanding what the text is about, having a critical-analysis toolbox for texts, and habitually situating them within a broader inferred context are not the same thing.

Just throwing LLMs into people's hands won't automatically make them able to use them in a relevant manner, as far as global social benefits are concerned.

The literacy issue is actually quite independent of whether the LLMs used are distributed or centralised.


> We aren't losing code; we are making the ability to code a universal human "literacy."

LLMs making the ability to code a universal human "literacy" is like saying Markov chains make the ability to write a universal human "literacy".


Comparing LLMs to Markov chains was funny in 2023.

I’m not comparing LLMs to Markov chains, read again.

Coding through LLMs is like writing through Markov chains.


Once creation is commoditized, controlling eyeballs is king. Look up aggregators: Apple, Facebook, Microsoft, Amazon, etc.

If anything, in Extremistan we're all useless. Platforms and whales are all that matters.


The literacy analogy makes sense in terms of access.

But the tools back then were cheap and local. Now most of the leverage sits behind large models and infra.

So more people can “write”, but not necessarily on their own terms.


Cheap books took hundreds of years to become accessible. Already we have models that run on "legacy" hardware. Just as large-scale publishing never disappeared, large-scale models and infra won't either. But does that mean distributing simple pen and paper was pointless?

This response sounds an awful lot like what ChatGPT would say ...

> the leverage you have to build and ship today is higher than it was five years ago

Wake me up when you do.


Is having a fig leaf of an excuse really better? It's not like the acts or methods now being clearly observed are anything new. They're just no longer disguised.

It's galling that people support this without even pretending it's about something other than their hatred for you.

In the past we had to pretend to be civil, and that illusion let us get back to work between elections. The utter disregard for the law forces us to grapple with the fact that they have so little respect for us that they will not care if there isn't another election.


That's great for you (and I guess for most of us here on HN).

But was access to that outlet really that free for you? I remember our main computer being in the middle of the living room, where everyone in the family could potentially see what I was doing. I remember dial-up being extremely expensive (or "broadband" having really low monthly caps), or the connection dropping the moment anyone picked up the phone at home. Or use of computers/internet in schools being in public. I also remember all I had to learn, and the choices (cost/benefit analysis) I had to make, to overcome those barriers. Those barriers not only provided learning opportunities but also the necessary friction to reevaluate patterns and decisions.

Do you think the current state of access really replicates that? Are barriers really only "bad"?


Oh we definitely are in need of more 15s "journalism"

> typical GOS user generally doesnt want to do that

How do you know this? Is there an official (or even unofficial) source of GOS preinstalled devices that a substantial amount of "typical GOS user" has acquired?

Or maybe you are talking about "potential user of GOS"?

In any case: if you installed it yourself, you mostly have to trust the source of the installer. If you purchase a pre-installed device, you're basically back to the Android/iOS model: you have to trust the manufacturer AND the maker of the OS.


I have helped a significant number of GOS users install GOS on their devices. If you perform the post-install steps correctly, then you do not need to trust where you got it from, as those steps are there to verify your install is genuine. If GOS gets greenboot support for Motorola devices, then not getting a yellow boot screen will show it is genuine and you won't need to trust anything.


Curious about your setup of qwen on m1 pro. Care to share the toolkit?


Hey, it's not suing me, so I guess I shouldn't care?

But what is the argument here? "OpenCode facilitates its users' misuse of another app they have installed"?

I guess anything goes with IP law, really. It's all about flexing lawyer power and a willingness to drown opponents in legal costs.

Maybe if you don't want people to misuse your sub, don't ship the ability to do so in the app users actually install on their machines?

This is the same as all the alternative YouTube clients. Just play the cat-and-mouse game, Anthropic.


You forgot to mention the part where the EU model is to export its demographic deficit, which is "purchased" with the young generations of other countries, stifling their growth while replacing the dwindling younger European generations.

"Don't buy cheap t-shirts from Bangladesh, just bring in its youth" is apparently the morally superior model.

