Only Morons Use ChatGPT As a Replacement for Google

Recently, in a Reddit Q&A thread, someone asked about the exact heel height of a discontinued weightlifting shoe. “I’ve looked everywhere, but even ChatGPT doesn’t know,” I remember them saying. It’s true the information was hard to come by, but that’s exactly why not to ask ChatGPT: if you can’t verify the bot’s answers against another source, those answers are useless.

As we explained earlier, ChatGPT is a text generator. It doesn’t “know” anything and makes no guarantee that anything it says is correct. In fact, many of the things it says are provably wrong. We’ve seen it compose exercises that are physically impossible, and it told Lifehacker writer Stephen Johnson that he wrote specific articles that weren’t his writing at all. The AI can “hallucinate” facts and double down on them when pressed.

For example, I asked it who Beth Skwarecki is. It correctly named my job title and beat (it knows I write for Lifehacker), but it keeps trying to award me a master’s degree in public health. (A plausible guess, but no.) If you regenerate the answer multiple times, it suggests different universities where I supposedly earned this degree. None of them is a school I’ve ever attended.

In other words, you can’t tell whether an AI-generated “fact” is true by the way the text looks; it’s designed to look believable and correct either way. You have to fact-check it. If you got ChatGPT to tell you that a certain weightlifting shoe has a standard 3/4″ heel, that might well sound correct, but if you can’t find that information anywhere else, you can’t verify it, so you’ve wasted your time.

ChatGPT is not a search engine

Now, AI-powered search engines do exist. That’s how Bing Chat works: it performs an actual search for information, then uses AI to format that information into readable text. For every factual claim it makes, you can click through to the source to see where the information actually came from. But other AI chatbots, including ChatGPT, are built differently.
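The search-first pattern described above can be sketched as a toy. Everything here is a hypothetical stand-in, not Bing’s actual pipeline: the tiny in-memory “index,” the word-matching `search` function, and the answer template are placeholders for a real search engine and a real language model. The point is the shape of the design: retrieve documents first, then build the reply from them, keeping a pointer back to each source instead of generating claims from nothing.

```python
# Toy sketch of search-then-generate with citations.
# CORPUS, search(), and answer_with_sources() are all hypothetical
# stand-ins for a real search index and language model.

CORPUS = [
    {"url": "https://example.com/shoes",
     "text": "The Model X lifting shoe has a 19 mm heel."},
    {"url": "https://example.com/plates",
     "text": "Olympic plates have a 50 mm center hole."},
]

def search(query):
    """Return documents whose text shares at least one word with the query."""
    words = set(query.lower().split())
    return [doc for doc in CORPUS
            if words & set(doc["text"].lower().split())]

def answer_with_sources(query):
    """Build a reply only from retrieved documents, citing each source."""
    hits = search(query)
    if not hits:
        # A grounded system can refuse; a bare text generator cannot.
        return "I couldn't find a source for that."
    return "\n".join(f'{doc["text"]} (source: {doc["url"]})' for doc in hits)

print(answer_with_sources("heel height of the Model X shoe"))
# → The Model X lifting shoe has a 19 mm heel. (source: https://example.com/shoes)
```

The key design choice is the empty-results branch: because the answer is assembled from retrieved documents, the system can say “I couldn’t find a source” rather than inventing a plausible-sounding heel height.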

Yet ChatGPT is sometimes positioned as an alternative to search engines. Check out this Guiding Tech article that praises it for not making you wade through pages of search results, or this CNBC article that rates it better than Google at providing drug safety information. (Oh my god, do not use ChatGPT for medical advice.) But it doesn’t do the same job as a search engine and can’t be used as one.

For example, automotive journalist Chris Paukert tweeted that he received a fact-checking email about a quote attributed to him. The marketer who sent it said they had “found” the quote via ChatGPT and wanted to make sure it was real. It’s good that they checked, because it wasn’t anything Paukert had ever said or written. But why would they think a text generator is a good place to “find” quotes in the first place?

Yes, ChatGPT was trained on a huge amount of data (sometimes described as “the entire internet,” though that’s not quite true), but that just means it has seen those facts. There’s no guarantee it will use them in its answers.

Using myself as another example, I asked the bot to name the books I’ve written. It listed five: four were real books that aren’t mine, and one didn’t exist at all. I think I know what happened there: it knows that I’m an editor and that I write about fitness, so it credited me with books whose listed authors are the “editors of” fitness publications like Runner’s World or Men’s Health.

So if you want to use ChatGPT for ideas or brainstorming jumping-off points for further research, great. But don’t expect its answers to be grounded in reality. Even for something as innocuous as recommending books based on your favorites, it will likely invent books that don’t exist.
