Hallucination

When an AI model generates false or nonsensical information that sounds plausible. Hallucination is a common challenge in large language models (LLMs) and is typically mitigated through techniques such as grounding responses in retrieved source documents, prompting the model to cite evidence, and fact-checking outputs after generation.
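
As a rough illustration of one such technique, the sketch below grounds the model's answer in retrieved passages (retrieval-augmented generation), leaving the model less room to invent facts. The `call_llm` wrapper and the keyword-overlap retriever are hypothetical placeholders, not any specific library's API.

```python
# Minimal sketch of grounding a model's answer in retrieved context
# (retrieval-augmented generation), one common hallucination mitigation.
# `call_llm` is a hypothetical stand-in for any chat-completion client.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval; real systems use vector search."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(query_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def grounded_prompt(query: str, context: list[str]) -> str:
    """Instruct the model to answer only from the supplied context."""
    passages = "\n".join(f"- {p}" for p in context)
    return (
        "Answer using ONLY the context below. If the answer is not "
        "in the context, say 'I don't know.'\n\n"
        f"Context:\n{passages}\n\nQuestion: {query}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    # Hypothetical: replace with your model provider's client call.
    raise NotImplementedError

# Usage: build a grounded prompt, then send it to the model.
docs = ["The Eiffel Tower is in Paris.", "Mount Fuji is in Japan."]
prompt = grounded_prompt("Where is the Eiffel Tower?",
                         retrieve("Where is the Eiffel Tower?", docs))
```

Constraining the model to the retrieved passages (and giving it an explicit "I don't know" escape) reduces fabricated answers, though it does not eliminate them.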
