
if you consider something like Intel copying the RISC microarchitectures of the 90s and essentially driving the likes of Compaq, DEC and MIPS out of business, allowing those ideas to be copied was almost definitely bad for consumers.

I don't understand how you can argue this was bad for consumers when it made fast CPUs so much cheaper. Okay, it stuck us with the x86 architecture, which wasn't so great for programmers, you could argue. But how was this a problem for consumers?

I think the reality is quite the contrary: if Intel hadn't done that (and BTW it wasn't just Intel), modern consumer-grade machines would be 1/10 as powerful, or less. The kind of power I now have in my laptop would be confined to $20k workstations.



One reason it was bad is that all these other guys died.

If you remember back to 2000 or so, Intel had killed all their competitors, bought Alpha and then thrown it away, and was trying two rather braindead architectures in the P4 and the Itanium. They were able to get away with it because they essentially had no competition. AMD saved us from this with the Opteron and x86-64, and there's no guarantee that Intel would've done anything similar if not for them.

I think the reality is quite the contrary: if Intel hadn't done that (and BTW it wasn't just Intel), modern consumer-grade machines would be 1/10 as powerful, or less. The kind of power I now have in my laptop would be confined to $20k workstations.

The reason you have that kind of performance increase over the last 30 years or so is CMOS scaling, and let's not kid ourselves into thinking that this was driven by Intel. CMOS scaling was predicted by IBM's Bob Dennard back in the 70s, and all we've done since then is pretty much just follow his roadmap.
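
To make the Dennard point concrete, here's a rough sketch of the constant-field scaling rules from his 1974 paper. The function name and the k = 1.4 example are mine, purely for illustration:

    # Dennard's constant-field scaling rules (Dennard et al., 1974):
    # shrink dimensions and voltage by 1/k and, to first order, you get
    # k-times-faster circuits at constant power density "for free".

    def dennard_scale(k: float) -> dict[str, float]:
        """Relative change in key quantities when feature size shrinks by 1/k."""
        return {
            "feature_size":     1 / k,       # L, W, oxide thickness
            "supply_voltage":   1 / k,       # keeps the electric field constant
            "gate_delay":       1 / k,       # so clock frequency rises ~k
            "power_per_device": 1 / k**2,    # ~ C * V^2 * f = (1/k)(1/k^2)(k)
            "device_density":   k**2,        # same die area, k^2 more devices
            "power_density":    (1 / k**2) * k**2,  # constant: the key result
        }

    # One process generation is roughly k = 1.4 (about a 2x density step):
    for name, factor in dennard_scale(1.4).items():
        print(f"{name:>16}: x{factor:.2f}")

The point being: the "free" speedups fell out of the physics of scaling, not out of any one vendor's microarchitecture.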

It's not like computers weren't getting cheaper and faster before Intel became the dominant monopoly.



