Spend enough time with ChatGPT and other artificial intelligence chatbots and it doesn’t take long for them to spout falsehoods.
By Matt O’Brien
Described as hallucination, confabulation or just plain making things up, it's now a problem for every business, organization and high school student trying to use a generative AI system to compose documents and get work done. Some are relying on it for tasks with the potential for high-stakes consequences, from psychotherapy to researching and writing legal briefs.
‘I don’t think that there’s any model today that doesn’t suffer from some hallucination,’ said Daniela Amodei, co-founder and president of Anthropic, maker of the chatbot Claude 2.
‘They’re really just sort of designed to predict the next word,’ Amodei said. ‘And so there will be some rate at which the model does that inaccurately.’
Anthropic, ChatGPT-m...