Which is why I question the usefulness of a computer science program/degree for the vast majority of dev positions. Learning how to learn is, I imagine, the best use of such a degree; but anyone with at least half a brain, given a few years in the field, starts to pick that up independently anyway.
I chose to avoid computer science degrees because in those days, back in the late 90s, they were pretty much 80-90% math classes. I don't regret getting a degree in a totally different discipline: it turns out learning philosophy, history, linguistics, writing, etc. was far more useful to me as a person (and, arguably, as a worker) than some boring math classes ever would have been.
Have comp sci curricula improved since then? Given that job interviews these days last 4-8 hours and require group approval and whiteboarding in front of an audience, I would argue no. Obviously, no one trusts the degree.
The hours of hands-on programming in a degree (probably only a couple of hundred a year) are too few for a graduate to learn anything but the basics.
Most professional programmers put in more hours of hands-on programming in a month's work.