There's zero ability to actually compute anything. It's just statistical word salad. AI is a scam.
LLMs (I refuse to call them AI, as there's no intelligence to be found) are simply random word sequence generators based on a trained probability model. Of course they're going to suck at math, because they're not actually calculating anything, they're just dumping what their algorithm "thinks" is the most likely response to user input.
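For what it's worth, here's a minimal sketch of what that next-token loop looks like. The vocabulary and probabilities are completely made up for illustration; a real trained model learns distributions over tens of thousands of tokens, but the point is the same: nothing in the loop evaluates the arithmetic, it just draws a likely-looking token.

```python
import random

# Toy next-token distributions (invented numbers, not from any real model).
NEXT_TOKEN_PROBS = {
    "2 + 2 =": {"4": 0.80, "5": 0.15, "22": 0.05},
    "the cat sat on the": {"mat": 0.7, "couch": 0.2, "keyboard": 0.1},
}

def sample_next_token(prompt: str) -> str:
    """Pick the next token by sampling the learned distribution.
    Nothing here computes '2 + 2'; it just picks a probable continuation."""
    probs = NEXT_TOKEN_PROBS[prompt]
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

print("2 + 2 =", sample_next_token("2 + 2 ="))  # usually "4", but not guaranteed
```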
"The ability to speak does not make you intelligent" - Qui-Gon Jin
Why would someone working on their linguistics degree be able to do high-level math? It's not their specialty.
To be fair, even someone with a linguistics degree knows basic math. GPT can't even get that right. That's the biggest problem (and red flag).
GPT can’t get basic math right? That hasn’t been my experience whatsoever.
It can absolutely do basic math. Give us a ‘basic’ math question that it can’t solve.
Autocorrect that can program?
Programming languages are structured and have rigid syntax that fits well into an LLM, so it spitting out working code for simple things is like it producing a sentence that's structured the way a normal person would write it.
Even if the code runs, it might not do what you're actually trying to do, or it might work but be inefficient.
I've heard the ChatGPT math problem was fixed in the new one by having it write Python code to solve the math problem, then run the code and report the answer.
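The idea is roughly the pattern below. This is just a sketch: the model call is faked with a placeholder function, and in reality the generated snippet would come back from ChatGPT and you'd sandbox it rather than exec()-ing it blindly. The point is that the arithmetic is done by Python, not by the language model.

```python
# Rough sketch of the "let the model write code, then run the code" idea.

def fake_model_generate(question: str) -> str:
    # Placeholder: pretend the model turned the word problem into Python.
    return "result = sum(17 * n for n in range(1, 101))"

def answer_with_code(question: str) -> int:
    code = fake_model_generate(question)
    namespace = {}
    exec(code, namespace)          # run the generated program (sandbox this in real life)
    return namespace["result"]    # the actual computation happens here, in Python

print(answer_with_code("What is 17 + 34 + 51 + ... + 1700?"))  # 85850
```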