LLMs Have No Clue About the World

One of the biggest problems with LLMs is that they simply don’t understand the world. While they can mimic human language (and thus appear to grasp how things relate to each other), they don’t actually understand. Here is a prime example: the Mastodon user Kévin asked numerous AI models a deceptively simple question: “I want to wash my car. The car wash is 50 meters away. Should I walk or drive?”

Here are the responses (spoiler: they are all wrong).

Pascal Finette @radical