Speaking of which... when people talk about "replacing" humans with AI, it makes me wonder if there's some kind of law we could push for that says: if you are part of the chain of command that signs off on AI making final determinations, and that causes legal issues, you are legally liable in place of the AI, since computers cannot be liable. Let a jury decide who in the chain bears what burden, case by case, but provide for prima facie liability for all parties in the chain when a valid suit is tried. I want to see how strong the push for AI is when it's the CEO's personal money on the line.
The chain of responsibility must include the AI vendor. If vendors aren't liable for malpractice, there's less incentive for them to do full due diligence when lives are on the line.
Honestly yes, you are 100% right that it should be a responsibility thing. I remember back in the day it was said that self-driving car companies would bear legal responsibility in case of an accident. That kind of put a damper on the rollout and took a lot of the hype and focus away from the whole industry.