
The Cambridge Dictionary named ‘hallucinate’ its word of the year — and it captures the AI industry’s biggest challenge

Chatbots can “hallucinate,” a phenomenon where they convincingly spit out inaccuracies as if they were true.

- The Cambridge Dictionary is updating the definition of the word “hallucinate” because of AI.
- Hallucination is the phenomenon where AI convincingly spits out factual errors as truth.
- It’s a word that also captures one of the AI industry’s key challenges: misinformation.

The Cambridge Dictionary’s newly crowned word of the year is a familiar one, but it’s taking on a new meaning because of AI.

On November 15, 2023, the organization announced that “hallucinate” would take on a new definition beyond just seeing or hearing something that does not exist. The word’s entry in the dictionary will be amended to include:

When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.

Hallucination is a widely used term within the AI industry, and it refers to incidents in which AI convincingly spits out inaccuracies as though they were true — sometimes with damaging consequences.

News outlets including Gizmodo and CNET, as well as Microsoft, have landed in hot water over errors found in their AI-written articles. A lawyer told Insider on Friday that he was fired for using ChatGPT to help him improve a motion after the chatbot fabricated citations to non-existent lawsuits.

In February, Morgan Stanley analysts wrote that one of ChatGPT’s key shortcomings is how it occasionally makes up facts. It’s an issue the analysts said they expect will persist for “the next couple of years.” Business leaders and misinformation experts have also voiced their concerns over how AI might worsen the state of online misinformation.

“The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools,” wrote Wendalyn Nichols, the publishing manager of the Cambridge Dictionary, in the group’s announcement of its changes to the dictionary.

Read the original article on Business Insider: https://www.businessinsider.com/hallucinate-cambridge-dictionary-word-of-the-year-artificial-intelligence-2023-11
