It's hard not to be cynical when you read stuff like "Envy is like drinking poison expecting the other person to die," which reads like two different inspirational quotes taped together. I've heard this one about hatred rather than envy.
The positive side of wanting what others have is called inspiration or something like that: "you inspire me to work hard so I can be like you," etc. Nobody calls that envy.
Envy is purely negative in a social sense, it makes you fight people just because they have stuff you want. It is beneficial to your own genes but not beneficial to humanity.
My understanding is that a good diet program means you don't actually want to eat more, whereas the modern diet seems to do the exact opposite.
Honestly I've been using Firefox for more than 10 years and never felt the need to switch. Every time I try a new browser there's features from Firefox that I end up missing.
And seeing the shitshow with the ad blockers in Chrome there's no way I'm trying that lol.
But openness about mental health issues and lower stigma around it is definitely a more modern thing, actually. I don't doubt that some subsets of the population got more depressed while others feel better, but it's very possible that rates would have been the same 30 years ago, had people felt okay with talking about suicidal thoughts, depression or a variety of other things that are at least a bit less stigmatised now. I'm not that old (or at least I like to think I'm not) but I can say with confidence nobody in my circle of friends 25 years ago would even think of saying they're depressed or suicidal. That would get you labelled a weirdo.
I'll try to apply this as I've been battling the problem of my job being too easy and never applying what I learnt in school.
There were so many interesting problems and theories in school that you never see as a regular developer. Often I think that I have a job that any person could do with a few months of training (does this make me a narcissist? Is the complaint here just that my job is unglamorous, or is wasting your skills a legit concern?). I guess I haven't matured enough, or maybe I'm not really sold on the impact my company is making.
Maybe not a few months, but students spend years, and I imagine 70%+ of them can train up just fine for any entry- or mid-level work. But companies are inherently risk averse, and no one really wants to risk hitting that 30% snag.
I train software engineers with the apprentice model, and it takes about 500 hours to train a person to be able to get hired as an entry level full stack engineer. Maybe not in today's market where even talented engineers are struggling. But over the last sixteen years I've trained sixteen engineers who all still work in the industry.
I use TDD, a list of projects that gradually increase in difficulty, multiple languages, and solo practice on tools like CodeWars. After they are comfortable with basics I'll have them build a personal project of their own design. Between CodeWars and TDD they've now gotten used to getting dopamine hits from programming. Add in intrinsic motivation on a personal project with a full stack website and they are glued to the editor. Time flies as they rapidly achieve a decent level of skill at reading and writing code. By the time they are wrapping up that personal project, as long as they've kept up with algorithm practice they're ready to interview.
Some have taken longer, usually about a third take a break and circle back around later. Some do take more like 1000 hours to feel confident. Sometimes I end up hiring them myself for contract work to help them build a resume and help me with a project. But on average it's only about 500 hours to teach the fundamentals of unit testing, algorithms, data structures, functional programming, SQL, Typescript, React, and a backend language like Python or C#. They'll not be a master of any of them, but know enough to build a website with persistence and use tools like Google and ChatGPT to get answers when stuck.
The "simple trick" is just making them write absolute tons of code, every day, for months. Then making them read others' code to learn new tricks. It's not easy, I'll work them like a rented mule, but they'll come out way overqualified for an entry level position. By the time they're interviewing most will have written 10-20k lines of logic, often grinding a problem a half dozen ways to really see the pros and cons of various solutions.
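To make the "grind a problem a half dozen ways" loop concrete, here's a minimal sketch of the kind of TDD-style kata practice described above, in Python (one of the languages mentioned). The kata itself and the function name are my own illustration, not part of the original program:

```python
# TDD-style kata: the tests below would be written first,
# then the smallest implementation that makes them pass.

def run_length_encode(s: str) -> str:
    """Collapse runs of repeated characters, e.g. 'aaab' -> 'a3b1'."""
    if not s:
        return ""
    out = []
    count = 1
    # Walk adjacent pairs; emit a (char, count) chunk at each run boundary.
    for prev, cur in zip(s, s[1:]):
        if cur == prev:
            count += 1
        else:
            out.append(f"{prev}{count}")
            count = 1
    out.append(f"{s[-1]}{count}")  # flush the final run
    return "".join(out)

# Tests written before the implementation, CodeWars-style:
assert run_length_encode("") == ""
assert run_length_encode("a") == "a1"
assert run_length_encode("aaab") == "a3b1"
assert run_length_encode("aabbb") == "a2b3"
```

Re-solving the same kata with, say, itertools.groupby, recursion, or a regex is one way to see the pros and cons of different approaches firsthand.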
I think most people greatly overestimate how much code the average CS graduate actually writes for their degree. I've interviewed hundreds of new grads who've written maybe a few hundred lines of code total. Sure they know the theory of computer science, but aren't prepared at all to be software engineers.
I'm not saying the bootcamp model is better; generally those students only know the basics of copy-paste. This is entirely the fault of the bootcamp instructors, who are just trying to churn out graduates.
There just isn't much effort in actually teaching software engineering. Bootcamps have students do some basics and stop way before they're ready. Universities just have them study theory. Rarely is anyone teaching how to read and write code so fluently the students are dreaming in code. But this is extremely effective.
I’m quite intrigued by the number of SWEs you’ve trained through an apprenticeship. I think there’s often this tension between those who have been trained at some university and now have a CS degree and those who have “learned on the job” or by some other means. I think that tension also exists elsewhere in other careers - it is not unique to software.
I’d like to think the reality is more nuanced and complicated. Everyone has their preferences for learning. But I wish apprenticeships weren’t so undervalued and under appreciated. This is how we often trained specialized experts for thousands of years. I don’t mean to suggest it doesn’t have its own problems, but I think if it became more normalized it would be an overall positive change.
There definitely is that kind of tension between the two camps. Generally though it's safe to assume a CS degree is optimized to prepare a student to become a CS professor or researcher. A CS graduate typically seems to graduate about 2/5ths of the way through my program, needing 300 extra hours to be at the same level as my graduates. This is not true for all of them, just most of the ones I've interviewed. There's definitely been some though who are way ahead of my students, usually because they went to an extraordinary university and also put in a lot of extra time outside of their coursework.
It's not cheap though; I do it as a charity. I've recently started my own business, so as of this summer I'm able to actually pay them, which is epic. Still, the program as a whole operates at a loss. So I do it because I want to pay it forward.
What is the application for the training? I don't need payment or job placement but damn if my software engineering abilities don't need a boost (EE/physics background)
I don't understand what are these immense benefits, to catch more criminals? Since when has throwing more people in jail reduced crime?
I guess this would stop someone who is plotting a murder but a school shooting is gonna happen either way.
I don't know about "throwing people in jail", but putting criminals in jail certainly reduces crime rates. As opposed to not putting convicted criminals in jail.
But we hardly need advanced DNA profiling to catch 99.9% of criminals (versus just standard DNA matching).
> I don't know about "throwing people in jail", but putting criminals in jail certainly reduces crime rates.
That's the sort of statement that seems plausible, and even intuitive, but probably needs a citation. It wouldn't wholly surprise me if it were true, though at moral and economic cost; but it would surprise me even less if it were false.
It’s literally self evident that a person in jail can’t commit crimes on the outside. The statement itself contains all the axioms you need, citations are not something required here. It’s like saying a dead baker reduces the amount of bread in a town for that day and you asking for a source
> It’s literally self evident that a person in jail can’t commit crimes on the outside. The statement itself contains all the axioms you need, citations are not something required here. It’s like saying a dead baker reduces the amount of bread in a town for that day and you asking for a source
You didn't say "reduces the crime rate outside of prison." I assumed that's what you meant, but it's not clear that ignoring the crime rate inside prison is a reasonable statistic.
People in prison also, presumably, eventually get out, and a claim that prison officials can accurately deduce the likelihood of recidivism, and whether it has been decreased rather than increased by time in prison, is far from clear.
Finally, putting lots of people in prison has an effect on people outside of prison. For example, it is possible—though, again, I don't know; citations are needed—that high incarceration rates lead to more crime outside, since, if a member of a community has a good chance of going to prison whether or not they commit a crime, then prison can cease to have a meaningful deterrent effect in that community.
Well, if the data is being sold to drug companies, one massive benefit that is glaringly obvious is that drug companies may now have an enormously valuable dataset for developing new medical technologies.
I'm as anti-DNA-info-sharing as anybody, and I won't be giving 23andMe a sample ever, but this is admittedly probably a pretty good thing. Even if it does ultimately serve to enrich some megacorps, consumers will probably get some amazing new treatments/therapies/medicines out of the deal.