I've only been programming since ~2010, but in that time I've only ever seen the majority prefer Macs because of the hardware (late Intel Macs being the exception) and Linux on regular PCs.
With the exception of game devs, I've never seen a person who _happily_ defaults to Windows, as opposed to doing so because of company policy or because the company is too cheap for an Apple device.
Yes, developers used to like Microsoft. That was where all the money was, and Visual Studio was an extremely good IDE in the late 90s and early 2000s. At the time, Microsoft's documentation was also the best. C++, VB, and then .NET development combined with SQL Server (then a budget option) was a very enticing stack. Using ASP instead of Perl, ColdFusion, or PHP was also attractive.
At the time the Mac was still largely PowerPC and the Classic Mac OS, and Linux was still seen as an OS for hobbyists and universities; it was not taken seriously until well into the 00s and the 2.4 kernel. Sun was struggling with Java, and the commercial Unices were well into their decline from their 80s peak.
I would say what drove the transition was how much better Apache was than IIS when it came to operational and security issues.
Most companies forbid it, though, since you're not covered by any legal protection - for example, Anthropic can use your data or code to train new models, among other things.
Any references on this? I hear this argument a lot. In fact, in a talk on AI last week I heard someone say:
"If you click the thumbs up button to rate a chat, the AI provider will use the contents for training, so our company's policy is never to click the thumbs up button"
That seemed so farcical I had a hard time taking this person seriously. Enterprise plans must give some strong guarantees around data usage, right?
Obviously I can only speak from my personal experience, but I alone have five examples of companies that were “no AI, IP concerns and all that” and are now full-on “every developer must use CC, Cursor…”
How many companies today don't have an “AI strategy” and aren't afraid of being left behind? In my small circle we went from “most are not using AI” to “none are not using AI” in a fairly short period of time.
Microsoft already has all their business data by handling their document storage and email. Trusting another of their services to also not use that data for Microsoft's own purposes is reasonable.
The capitalists realized that if they literally starve the working class there will be a revolution. But if they produce enough to (barely) sustain the rest of the people with 1% of the output while they consume the other 99%, it will be okay.
So don't worry, you'll have basic ~~income~~ soylent green.
They are. And we have processes to minimize them - tests, code review, staging/preprod environments - but they are nowhere close to guaranteeing that code is bug-free; that's just way too high a bar for both AI and purely human workflows outside of a few pretty niche fields.
It's out to get you whether you have a credit-card-sized piece of plastic or not. Dying on that hill just creates so much wasted time and money for everyone.
I mean, did they ever?