I understand almost nothing of the technology behind ChatGPT, but even to me it seems obvious that a model designed for natural language processing should not be very good at simple calculations with large numbers – something that is rarely done in natural language.
Virtually all coverage of ChatGPT, including coverage by interested domain experts and educated fans, tends to treat ChatGPT as a person you can talk to through your computer rather than as a text engine.
Humans learn math through natural language and symbols.
Is there any indication that this is a blocker for models learning math?
I don't necessarily think pumping more data into ChatGPT will make it understand arithmetic. But I do think it's possible to teach a model to do math through natural language.
Perhaps GPT-like models are already capable enough to do math, but they would need to store what we call mathematical reasoning as one of many distinct processing pathways and tap into it whenever the context is appropriate.
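To make the "distinct pathway" idea concrete: one crude external version of it is a router that detects arithmetic in a prompt and hands it to an exact evaluator instead of the language model. This is a toy sketch, not how ChatGPT works internally; every name in it is invented for illustration, and the "language model" branch is just a stub.

```python
import ast
import operator
import re

# Hypothetical sketch of a "math pathway": arithmetic prompts are
# dispatched to an exact evaluator; everything else would go to the
# language model (stubbed out here).

# Supported integer operations, mapped from AST node types.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
}

def eval_arithmetic(expr: str) -> int:
    """Safely evaluate +, -, * over integers by walking the AST."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, int):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

def route(prompt: str) -> str:
    """Send pure-arithmetic prompts to the tool; stub the model path."""
    if re.fullmatch(r"[\d\s+\-*()]+", prompt.strip()):
        return str(eval_arithmetic(prompt.strip()))
    return "(handed to the language model)"

print(route("123456789 * 987654321"))  # exact: 121932631112635269
print(route("What is the capital of France?"))
```

The point of the sketch is the dispatch, not the evaluator: a calculator is trivially exact on large numbers precisely because it never goes through next-token prediction.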
Easy to say, obviously, but there's some promising work in this direction[1].