Overall, a bad sign for Stack Overflow. Executives are writing gobbledygook paragraphs such as:
“Since my last quarterly update, companies across nearly every sector have experienced significant transformation—whether it’s a more aggressive focus on profitability or a shift in product strategy due to the acceleration of generative AI (GenAI). Thematically, however, one thing has remained the same: companies are committed to driving productivity and efficiency throughout their organizations. At Stack Overflow, we continue to help our customers and community deliver both.”
The unclear and muddy writing reveals that their thinking is likewise unclear.
Generalities instead of specifics. Vague platitudes. This is poor writing.
StackExchange is facing an existential threat, and there is nothing in this letter that names it, identifies the steps they’ll take to address it, or explains how they will thrive in the new future.
It is not unexpected, but a confirmation of what we all might fear.
A lot of people have gotten very high up on the corporate totem pole for their ability to string together vague muddy bullshit that sounds pleasantly corporate but says nothing. The ability to write this way has become this weird corporate class signifier. I’ve seen normal people promoted into executive management and then all their emails start sounding like this. It’s almost like someone prompted ChatGPT to write a company strategy email without telling the tool what the company does.
I guarantee nobody at the company dares to critique this guy’s writing like you did. His sycophants at the SVP level nod along, pretending the emperor has clothes, and that they all understand what to do now, as they share similar, vague thoughts with their own teams.
There's an existential threat, but I think it's more than just GenAI. SO is such an easy target for many reasons:

- Creation of an extremely hostile culture to users (how do you change it now?)
- Old information that's no longer relevant, yet the "close" culture prevents anyone from posting "duplicate" questions even when they're needed.
- A reputation among the newer generation that SO is "read only". High schoolers learning to program would never dare to ask a question there.

As a business, SO has no value.
> Creation of an extremely hostile culture to users (how do you change it now?)
By changing the community guidelines to prohibit such behavior, and then aggressively banning users who engage in it.
Of all the problems Stack Overflow faces, this one would be the easiest to solve, but the people who run the site have made it abundantly clear long ago that they just don't give a fuck.
My favorites are executives whose products are paying customers of generative AI (i.e., no moat), yet they parrot platitudes as if they were in the driver's seat.
This assumes AI will be able to offer support outside the trained corpus of human knowledge and generate code with post-training-date APIs. These will probably still require human support and feedback so that subsequent AIs can support them too.
This means companies like Stack Overflow will require licensing and compensation before AI corporations can use new data - it won't be free anymore.
Data hoarding is the future, not open data sharing (because everyone will fear being replaced). We will see a more closed future behind paywalls. It will be very expensive for subsequent versions of ChatGPT etc. to be trained with up to date knowledge outside of Wikipedia etc.
> Data hoarding is the future, not open data sharing (because everyone will fear being replaced). We will see a more closed future behind paywalls. It will be very expensive for subsequent versions of ChatGPT etc. to be trained with up to date knowledge outside of Wikipedia etc.
Well, StackExchange seems pretty poorly positioned then, because their data is under a Creative Commons license.
(Although notably, I only spend time writing answers on StackExchange because I feel like I'm contributing to an open corpus of knowledge; the license is a major part of that.)
> StackExchange seems pretty poorly positioned then because their data is under the Creative Commons
(NAL) The license legally allows others to reproduce the data, so they can't be sued for copyright infringement. But it wouldn't stop SO from implementing rate limits, a registration wall, or a paywall. They never undertook a mission to freely and openly spread your answers to as many people (or bots) as possible — they only promised not to sue for copyright.
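To make the point concrete: access throttling is entirely orthogonal to content licensing. Below is a minimal sketch (not SO's actual infrastructure, just a standard technique) of a token-bucket rate limiter, the kind of gate a site can put in front of CC-licensed content without touching the license at all.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allows `rate` requests per second
    on average, with bursts of up to `capacity` requests."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to the time elapsed since last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A scraper firing a quick burst of 10 requests: the first `capacity`
# pass, the rest are rejected until tokens refill.
bucket = TokenBucket(rate=1.0, capacity=5)
results = [bucket.allow() for _ in range(10)]
```

The content served past this gate can still be CC-licensed; the license governs what recipients may do with the data, not how fast (or whether) the server hands it out.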
> “They are creating this culture on Stack Overflow where it’s become a really safe place to ask a question, a safe place to provide an answer, and try to get people closer to their solution, and I think that’s part of what’s made it really successful and where we found a lot of value.”
Is this satire? Stack Overflow is infamous for how hostile its culture is towards people asking questions. And I'm sure that the CEO is well aware of that. The fact that they're dropping a quote like this one into the post without a single word even attempting to address SO's well-known culture problems is a slap in the face to anyone familiar with the platform.
Anyone care to explain to me what is meant by "safe place"? Was it not safe before? Are Q&A platforms generally known to be unsafe to post on? Genuinely curious. It's such a peculiar choice of words that it has certainly piqued my curiosity.
Why SO thinks this is the biggest problem they should be solving, or why it took four years to roll out, is beyond me; for one, I can't ever recall discussing or caring about anybody's genitals or lack thereof when posting coding Q&As.
I dislike the proliferation of this word. "Safe" has started to mean "free from critical commentary" in a lot of contexts. It's hampering objective discourse all over the place.
It used to mean physically safe - e.g., a gay bar - at least safe from outsiders. Once you move to “emotionally safe”, it becomes so vague as to be meaningless.
I believe it's called 'virtue signalling'; it's a hot phrase (though perhaps past its peak by a few years), and associating yourself with it is only really a positive thing.
What does it actually mean? Probably nothing. In fact, as others say, it might ironically not even be true in SO's case (an 'unsafe place', 'hostile to newcomers').
The answers from SO users on that last thread show how much they lack self-awareness. Nowadays the only way I end up on SO is through a Google link; I 100% consider it a read-only site.
That may be true, but there is usually a limit to how much hypocrisy such posts contain. I'm pretty sure that after the Deepwater Horizon oil spill, BP didn't put out press releases containing quotes that praised the company's stellar environmental record. The above is pretty much the equivalent of that for Stack Overflow.
They’re saying that it’s a safe space because you and your gender/pronouns/etc will be highly respected as your question is closed as redundant to an unrelated question from fifteen years ago.
They have no idea how to solve the real underlying problems, so they’re rearranging the deck chairs on the Titanic.
I can just imagine their internal mod training materials:
Problematic: "user64738's answer is an obvious duplicate of [URL], his answer is stupid, and he is stupid."
Consider changing to: "user64738's answer is an obvious duplicate of [URL], *their* answer is stupid, and *they* are stupid."
Real life is becoming satire. I can't wait to read business books 40 years from now. "In the beginning of the 21st century, corporations erased trillions of dollars of shareholder value. But don't forget: While they were incinerating money and laying off workers, they all changed their internal technical processes to rename 'master' to 'main' and changed their codes of conduct to stop misgendering people."
I'm mostly active in a corner of SO where questions are rarely closed. They are just mostly left without answers. Looking at most of them, they describe extremely convoluted setups involving so many parts that likely nobody except the author has them.
> We want to be able to continue to invest back into our community, and that’s why we’re also exploring what data monetization looks like in the future. LLMs are trained off Stack Overflow data, which our massive community has contributed to for nearly 15 years. We should be compensated for that data so we can continue to invest back in our community.
How exactly are you planning on investing that back into the community? It sounds like you just want to be the ones who capture the value of 15 years of community investment. It's fine, SO is a business, and people agreed to the TOS, but this is laughably blatant bs.
The blog post is entirely marketing and tries to obfuscate or ignore the trouble AI-generated content is causing on SO. The company has enacted a deeply unpopular policy this week that essentially makes it impossible to moderate most AI-generated posts on Stack Overflow.
Laying off 10% of the company after putting another 10% of it on developing AI-anything looks more like panic to me. AI-generated content is threatening half of their business model, and is also causing moderation issues on the public sites.
Nothing here directly addresses the issue of users migrating away from the flow of `Google search -> SO` to `ChatGPT/Phind`.
Maybe SO should just acquihire Phind. Use Phind as the gatekeeper to filter out questions with well-established answers, before another doe-eyed user makes the fatal mistake of asking a poorly formed question and gets excoriated by power user #44956.
It is funny to read all this "SO is doomed and AI is good enough". It shows that people have basically no clue how LLMs work, ascribing them superpowers (which is of course congruent with the hype).
If the free human labor solving diverse coding problems in public repositories stops, any LLM trained on it will gradually degrade; e.g., it will have no material on new libraries, new languages, etc.
In my opinion, SO doesn’t really have much of a choice here. Not only has the quality of questions and answers been on the decline, the entire SO experience is notoriously hostile to newcomers. I find that ChatGPT/GitHub Copilot tend to give much better answers for the vast majority of programming-related questions. It’s much faster, easier, and less intimidating to ask your tools questions, knowing that you’ll most likely get a high-quality answer and not get admonished for having asked the question in the first place.
> I find that ChatGPT/GitHub Copilot tend to give much better answers for the vast majority of programming related questions
I'm not so sure about this.
Firstly, ChatGPT seems to hallucinate quite a lot, which can result in me wasting a lot of time, especially when the hallucinations aren't immediately apparent.
Secondly, presumably the content of Stackoverflow has been used to train ChatGPT... so if Stackoverflow and all of its human-generated content disappeared, to be replaced by AI-generated content, that could be a bad thing for ChatGPT in the long run.
Anyone else think Stackoverflow has been in a decline the past decade?
Stackoverflow hardly surfaces when I'm searching for help, and when I find answers it's usually on a blog hosted somewhere else. Most Stackoverflow answers are 4-5 years old and outdated.
Perhaps it's simply that my engineering has improved in the last decade and I no longer search for help very often, and when I do, it's outside the scope of Stackoverflow.
Imagine if every SO question automatically had a ChatGPT-generated answer applied to it, but in wiki fashion, so it could be modified to remove errors, add additional context, point out hallucinations etc.
At the same time, humans could still answer too, providing alternative/better solutions, which ultimately are fed back into ChatGPT.
Humans mostly wouldn't care about things so complex or remote as e.g. robots replacing them.
E.g. truck drivers wouldn't actively destroy cameras attached to their trucks, even if you tell them that the video feeds from these cameras could be fed to models that might allow driverless trucks, which might eventually replace them. The link between a camera and elimination of their work is too remote and too vague.