
I beg to disagree: for example, shrinking gate sizes in microchips requires vastly more investment with each iteration. See: http://en.wikipedia.org/wiki/Rock%27s_law
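The doubling implied by Rock's law (fab cost roughly doubles every four years) can be sketched in a few lines. The starting cost and horizon below are made-up illustrative numbers, not figures from the article:

```python
def fab_cost(base_cost_billion: float, years: float) -> float:
    """Projected fab cost (in $B), assuming Rock's law:
    cost doubles roughly every 4 years."""
    return base_cost_billion * 2 ** (years / 4)

# Hypothetical: a $10B fab today, projected 12 years out (three doublings)
print(fab_cost(10, 12))  # -> 80.0
```

Even a modest horizon shows why each new process node demands vastly more capital than the last.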


If you look at the semiconductor industry as a whole, it will eventually start to become less capital intensive because (at some point in the future) the vast majority of chips produced will be old generation technology that is "good enough".

There will still be companies trying to push minuscule improvements, but eventually those incremental gains won't be enough to justify using them over older, cheaper technology (for most use cases). We're already getting there, by the way -- just look at the number of cheap embedded CPUs with relatively large transistors produced each year.

Say we get to the point, call it x nanometers, where we can't really get any smaller because we've reached physical limits. What will it cost to produce an x nm chip 50 years after that point, compared to 50 years before it? Even if we've moved beyond silicon by then, there will still be countless applications that older, cheaper technology can serve.


You are singling out the one thing that allows everything else to shift to software (i.e., non-capital). Rock's law is currently true, but will it always be? A measly mortal like me can still get chips made at said plants.



