Attempting to badly quote someone on another post: « How can people honestly think a glorified word autocomplete function could be able to understand what is a logarithm? »
You can make external tools available to the LLM and then provide it with instructions for when/how to use them.
So, for example, you'd tell it that if someone asks it about math or chess, it should generate JSON text according to a given schema, and that JSON is then used to parametrize a script. The script can then e.g. make an API call to Wolfram Alpha or call into Stockfish or whatever.
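Roughly, the glue script on your side could look something like this — the JSON schema, the tool names, and the choice of python-chess/Stockfish here are all just illustrative assumptions, not how ChatGPT actually wires it up:

```python
# Toy dispatcher for LLM "tool calls". The LLM is prompted to either answer
# in plain text or emit JSON like {"tool": "...", ...}; the tool names and
# fields below are made up for this example.
import json
import math

import chess          # pip install python-chess
import chess.engine   # needs a stockfish binary on PATH


def handle_llm_output(raw: str) -> str:
    try:
        call = json.loads(raw)
    except json.JSONDecodeError:
        return raw  # not JSON -> treat it as a normal text answer

    if call.get("tool") == "chess":
        # The LLM describes the position as a FEN string; Stockfish picks the move.
        board = chess.Board(call["fen"])
        engine = chess.engine.SimpleEngine.popen_uci("stockfish")
        try:
            result = engine.play(board, chess.engine.Limit(time=0.5))
        finally:
            engine.quit()
        return f"Engine move: {result.move.uci()}"

    if call.get("tool") == "math":
        # Stand-in for a Wolfram Alpha call; here it only does logarithms.
        return str(math.log(call["x"], call.get("base", 10)))

    return f"Unknown tool: {call.get('tool')!r}"


# The model might emit, for instance:
# {"tool": "chess", "fen": "rnbqkbnr/pppppppp/8/8/4P3/8/PPPP1PPP/RNBQKBNR b KQkq - 0 1"}
# {"tool": "math", "x": 1000000, "base": 10}
```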
This isn't going to be 100% reliable. For example, there's a decent chance of the LLM fucking up when generating the relatively big JSON you need for describing the entire state of the chessboard, especially with general-purpose LLMs, which are usually configured to introduce some amount of randomness into their output.
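Which is why the script side wants to validate whatever the model hands it before acting on it — again just a sketch, assuming the FEN-style schema from the example above:

```python
import chess


def parse_board(fen: str) -> chess.Board | None:
    """Don't trust the LLM's board state blindly; reject garbage."""
    try:
        board = chess.Board(fen)   # raises ValueError on malformed FEN
    except ValueError:
        return None
    if not board.is_valid():       # e.g. missing king, pawns on the back rank
        return None
    return board


# If this returns None, you'd typically re-prompt the model rather than
# feeding nonsense to the engine.
```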
But in particular, ChatGPT just doesn't have instructions built in for calling a chess API/program, so for this particular case it's likely as dumb as autocomplete. It does likely have a math tool hooked up, though, so it should be able to calculate a logarithm through such an external tool. Of course, it might still not understand when to use a logarithm in the first place.
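As far as I know, ChatGPT's built-in "math tool" is basically a Python sandbox (the code interpreter), so "calculate a logarithm" reduces to the model writing a couple of lines like these and reading back the output. The tool does the arithmetic reliably; deciding that a logarithm is the right operation in the first place is still on the model.

```python
import math

# The kind of snippet the model would hand off to its sandbox:
print(math.log(12.5, 2))        # log base 2 of 12.5  -> ~3.64
print(math.log10(3_000_000))    # log base 10 of 3e6  -> ~6.48
```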
If ChatGPT were marketed as a toaster nobody would bat an eye. The reason so many are laughing is because ChatGPT is marketed as a general intelligence tool.
Do you have any OpenAI stuff (ad, interview, presentation...) that claims it's AGI? Because I've never seen such a thing, only people hyping it for clicks and ad revenue.
It depends. Have you used it? If not - Yes! It does do . . . all the things.
If you have used it, I’m sorry that was incorrect. You simply need to pay for the upgraded subscription. Oh, and as a trusted insider now we can let you in on a secret - the next version of this thing is gonna be, like, wow! Boom shanka! Everyone else will be so far behind!
If LLMs are statistics-based, wouldn't there be many, many more losing games than perfectly winning ones? It's like Dr. Strange saying 'this is the only way'.
It's not even that. It's not a chess AI or an AGI (which doesn't exist). It will speak and pretend to play, but has no memory of the exact position of the pieces nor the capability to plan several steps ahead. For all intents and purposes, it's like asking my toddler what time it is (she always says something that sounds like a time, but doesn't understand the concept of hours or what the time is).
The fact that somebody posted this on LinkedIn and not only wasn't shamed out of his job but actually got several articles written about it is truly infuriating.
It probably consumed as much energy as a family house uses in a day just to come up with that program. That's what happens.
In fact, I did a Google search and had no choice but to get an "AI" answer, even though I didn't want one. Here's what it says:
Each ChatGPT query is estimated to use around 10 times more electricity than a traditional Google search, with a single query consuming approximately 3 watt-hours, compared to 0.3 watt-hours for a Google search. This translates to a daily energy consumption of over half a million kilowatt-hours, equivalent to the power used by 180,000 US households.
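Taking those AI-generated figures at face value, a quick back-of-the-envelope check (my arithmetic, not part of the quote) suggests they don't even agree with each other:

```python
# Sanity check of the numbers in the AI answer quoted above.
wh_per_query = 3                      # claimed Wh per ChatGPT query
daily_kwh = 500_000                   # "over half a million kilowatt-hours" per day

implied_queries = daily_kwh * 1_000 / wh_per_query
print(f"implied queries/day: {implied_queries:,.0f}")   # ~167 million

# The household comparison looks off: a US household averages roughly
# 29 kWh/day (about 10,500 kWh/year), so 180,000 households would be
# around 5.2 million kWh/day -- roughly 10x the half-million figure.
print(f"180k households: {180_000 * 29:,} kWh/day")     # 5,220,000
```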