How about this: those large swaths of code get written with LLMs, but instead of spending all that time reviewing the code (really reviewing it, not just a quick LGTM), which would make the time savings moot, you personally assume responsibility for that code sometimes being dead wrong and for whatever consequences follow. You cannot blame the AI; as far as anyone is concerned, you wrote the code and signed off on it. We're talking legal liability. Would you take the deal?