I think you'd find that most people doing large-number math in their heads are also off by a few percent, like this model.
Sure, with pen and paper we can manually follow specific algorithms to get a precise result, very slowly. If we just wanted a computer to follow instructions, I suspect there are better ways...
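For what it's worth, the "specific algorithm" in question is trivial to spell out. Here's a minimal Python sketch (the name `long_add` is just illustrative) of schoolbook long addition, which stays exact no matter how many digits are involved:

```python
def long_add(a: str, b: str) -> str:
    """Add two non-negative decimal integers (given as strings) digit by digit,
    carrying exactly the way the pen-and-paper algorithm does."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    carry, digits = 0, []
    # Walk from the least significant digit to the most significant one.
    for x, y in zip(reversed(a), reversed(b)):
        carry, d = divmod(int(x) + int(y) + carry, 10)
        digits.append(str(d))
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

# Following the algorithm never drifts by "a few percent":
result = long_add("987654321987654321", "123456789123456789")
assert result == str(987654321987654321 + 123456789123456789)
print(result)  # 1111111111111111110
```

The point being that exactness here comes from following the procedure, not from any cleverness, which is exactly what makes it cheap for a machine and slow for a human.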
You're really lowering the bar for success here. Is it now unreasonable to expect a computer to correctly add two numbers together? Give me a break. Incorrectly dividing two numbers wasn't even acceptable from a Pentium chip back in 1994.