Well, whose fault is it that everything has .NET in its name? I meant the .NET frameworks in my original comment (even though the runtime environment changed a lot too, causing plenty of deployment pain - read some old Paint.net blogs to find out more).
My point was that it doesn't matter whether C# or .NET continues to exist; all that matters to me is whether one can count on one's skills still being valuable. My Visual C++ and WinAPI skills (though WinRT is replacing the latter) are still fine, while the C# and .NET Framework skills I learned back in 2006 are nearly useless, because so much has changed.
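To make that concrete, here is a minimal sketch (the fetch-a-page task and the URL are placeholders I picked for illustration, not anything specific I worked on). The 2006-era idiom still compiles, but modern .NET has deprecated it and idiomatic code looks nothing like it:

    // Circa 2006 (.NET Framework 2.0): explicit Main, synchronous I/O via WebClient.
    // WebClient is marked obsolete in modern .NET - exactly the decay I'm describing.
    using System;
    using System.Net;

    class Fetcher
    {
        static void Main()
        {
            using (WebClient client = new WebClient())
            {
                string body = client.DownloadString("http://example.com");
                Console.WriteLine(body.Length);
            }
        }
    }

    // Idiomatic modern C# (.NET 6+), shown as comments since it belongs in its
    // own file: top-level statements, async/await, HttpClient.
    //
    //     using var client = new System.Net.Http.HttpClient();
    //     string body = await client.GetStringAsync("http://example.com");
    //     Console.WriteLine(body.Length);

Neither version is hard, but almost nothing I knew about the first one transfers untouched to the second.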
Why do companies expect developers to always gobble up whatever they throw over the wall and say "thank you sir, may I have some more, sir?", as if it's our duty to relearn how to do X for the Nth time, this time with bells and whistles?
More importantly, why do devs like investing in such ephemeral knowledge?
What about community-driven languages that are rapidly evolving? In that case, would you say that the community expects developers to always gobble up whatever is thrown over the wall? (See: web frontend development)
> "as if it's our duty to relearn how to do X for the Nth time, this time with bells and whistles?"
There are a few areas of computing that are fairly stable. If stability of knowledge is your priority, you should seek one of those out.
But many other areas of computing are young and rapidly evolving. Some of that evolution is driven by a corporate need to sell the next version, but plenty is driven by genuine progress in languages and paradigms. And why is that? Because we all know that programming can still be (much) easier, more reliable, more predictable, cheaper, and more fun. That is why many devs, myself included, like investing in "ephemeral" knowledge.
Honestly, I understand that the decay of hard-earned skills is distressing for many people. Not all fields have this characteristic. But in our field, it just comes with the territory.
The web front-end is shooting itself in the foot in a spectacular way and is a pathological example of software entropy. :) Personally, I am seeking that stability out - the stuff I work on has existed since before I was born and is evolving in quite a rational manner.
I don't mind at all when a language evolves; that's to be expected and is manageable, because one can leverage existing knowledge and augment it (see the sketch below). But a lot of change is either driven by corporate interests, as you mentioned, or by fashion, and is disruptive, as in "replace X with Y". And it's rarely clear that Y is more robust or faster; usually it's a mixed bag of pluses and minuses.
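As a sketch of what additive evolution looks like (a toy example I'm making up, not anything from this thread): the imperative filtering style I learned long ago still compiles and runs on modern .NET, and LINQ merely sits alongside it.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Evolution
    {
        static void Main()
        {
            List<int> numbers = new List<int> { 1, 2, 3, 4, 5 };

            // The old imperative knowledge still works unchanged...
            List<int> evens = new List<int>();
            foreach (int n in numbers)
            {
                if (n % 2 == 0)
                {
                    evens.Add(n);
                }
            }

            // ...and LINQ (added in C# 3.0) only augments it; nothing was taken away.
            List<int> evensLinq = numbers.Where(n => n % 2 == 0).ToList();

            Console.WriteLine(string.Join(",", evens));     // prints 2,4
            Console.WriteLine(string.Join(",", evensLinq)); // prints 2,4
        }
    }

That is the kind of change I can live with; "replace X with Y" churn is the kind I can't.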