Code inflation nicely cancels out gains from Moore's law. But there's also another interesting phenomenon seen in embedded hardware - it seems that each generation of devices has more (exponentially more?) computing power on board, while the user-facing functionality stays the same or even slightly degrades. For example, the functionality of today's fridges, kettles and washing machines is equivalent to that of models made 20 years ago, but today's versions break down faster and in ways nigh impossible to fix yourself.
We're about to have Android running on toasters. And I can't stop myself from asking the question - why?
> while the user-facing functionality stays the same or even slightly degrades
I would disagree with that on so many points. Today, more than ever we have massive differences. "Hey Siri/Google", the camera functionality is just incomparable, the maps, ...
In the case of consumer white goods, the business case is that expensive mechanical components and security mechanisms are replaced by electronic ones that are cheaper. And indeed, adjusting for inflation, today's white goods are far cheaper than they ever were. This is happening in power adapters, but also in washing machines and kettles. It means that half the components only exist in a virtual sense, and you'd need half the design, a plastic molding factory, and a master's degree to have any hope in hell of fixing them. But they're 1/10th to 1/5th of your monthly pay, and last 2-10 years, so why bother?
But the story is the same at a high level for everything from cell phone radios to motor controllers for washing machines. Virtual components, simulated in microcontrollers, are far cheaper (and far less repairable) than real components ever will be.
I bother, because I don't like throwing away a perfectly fine appliance just because a short blew out something on the motherboard. Nor do I want to pay for a new one every 2-5 years when the hardware could perfectly well last 10-20 years. In a perfect world with perfect recycling I wouldn't mind that much, but as it is today, it's just a way to make me spend more and trash the planet more.
Also, I don't feel like the price of appliances was dropping over my lifetime, so I have to ask - where do those apparent savings go? They're definitely not being passed on to consumers.
> I would disagree with that on so many points. Today, more than ever we have massive differences. "Hey Siri/Google", the camera functionality is just incomparable, the maps, ...
I'll grant you the camera, because chips and algorithms do get better. Siri/Google doesn't really feel like that much of an achievement over what was possible 10 years ago, except that nobody tried to build that product then, and smartphones weren't exactly popular. As for maps, I'll only point to the Google Maps application, which has been steadily degrading in quality and functionality for the past 5+ years...
> But they're 1/10th to 1/5th of your monthly pay, and last 2-10 years, so why bother?
2-10 years is an extremely short lifespan for white goods; 20-30 years is more like it, and 40-50 years is not uncommon.
> In the case of consumer white goods the business case is that expensive mechanical components and security mechanisms are replaced by electronic ones that are cheaper.
It's not just that: even mechanical components are deliberately made weaker to use less material and thus cost less. Sometimes manufacturers push that boundary a little too far; Samsung's "exploding" washing machines are one of the latest examples of this.
I think it's partly talent availability. There are lots of Android devs, but very few who can write microcontroller code in ASM or super minimal C.
If you can run Android on it, you increase the size of your hiring pool a hundredfold. That matters a lot for big companies trying to ship products at scale.
Of course I don't get why a toaster or a fridge really needs a CPU at all, but that's another matter.