I think it's worse than that: it's a bubble. There, I've said it. So much money is being poured into AI right now, things are changing so fast, and products are being deployed, then matched, then overrun weekly, all with no regard for the law, for safety, for understanding the tech that's just been built, or even for building a real business around it, that the whole thing has become absolute nonsense.
Two anecdotes:
1. I saw a posting for a prompt engineer which had virtually no requirements beyond some passing familiarity with LLMs, whose job it was to think up clever prompts and archive them in a library. Salary: $350k+.
2. I heard a real conversation between two highly trained technical folks about using an LLM to do a simple data transform from one wire format to another. Yes, let's use a cluster of GPUs and some faulty, hacked-together prompt to transform one well-defined structured format into another at speeds that approach molasses. Nobody had a clue what the runtime costs of this would be. The solution to it being slow? Add more clusters. Absolute idiocy.
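For contrast, the transform in anecdote 2 is the kind of thing a few lines of deterministic code handle. This is a hypothetical sketch (the field names and both formats are invented for illustration, not the actual formats from that conversation): it maps a structured JSON record to a pipe-delimited line, with no GPUs, no prompt, and no hallucination risk.

```python
import json

def to_pipe_delimited(record_json: str, fields: list[str]) -> str:
    """Deterministically transform a JSON record into a pipe-delimited
    line containing the given fields, in order. Missing fields become
    empty strings rather than invented values."""
    record = json.loads(record_json)
    return "|".join(str(record.get(f, "")) for f in fields)

# Hypothetical payment-ish message, purely for illustration.
msg = '{"amount": "100.00", "currency": "USD", "beneficiary": "ACME Corp"}'
print(to_pipe_delimited(msg, ["amount", "currency", "beneficiary"]))
# 100.00|USD|ACME Corp
```

Runs in microseconds on one CPU core, and the output is the same every time, which is rather the point.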
We're burning money on this stuff like it's mid-80s Japan spending money on slightly different variations of pocket calculators and American real estate. Meanwhile we're exposing Kenyan workers to some of the worst filth humanity can produce in an effort to keep one of these AIs from producing child gore porn, because it's illegal to pay first-world people $2/hr to do the same job -- and there's not a psychotherapist to be found anywhere in the chain.
And then it's being pushed at the regular consumer as if it's some kind of knowledge oracle to replace the "horrors" of the search box, one that supposedly:
a) isn't limited to knowing what the state of the world was when it was trained two years ago
b) won't produce a worthless hallucinated answer that could send somebody off to take poison for a cold
This shit is terrible, and yet a few weeks ago I used it to get advice on updating my resume for a job in the field.
<< heard a real conversation between two highly trained technical folks around using an LLM to do a simple data transform from one wire format to another. Yes, let's use a cluster of GPUs and some faulty hacked together prompt to transform a well written structured format to another at speeds that approach molasses. Nobody had a clue as to how much the runtime costs of this would be. The solution to it being slow? Add more clusters. -- absolute idiocy.
I... can totally hear it in my mind... including the ISO 20022 requirements happy talk.
My initial take on the posed question is that it isn't just tech. Business orgs appear to be jumping in with both feet (thankfully, my little corner of the universe seems more conservative for now). Still, that doesn't disprove your statement that we're on a heavy hype train right now.
edit: I do take issue with the wire format being well written (or structured) at this point. There is a reason SWIFT had to back down a little on the aggressive timeline, for example.
Fuck it.