大模型幻觉
Chinese
Slang
★★★★ 4/5
casual
Pinyin
dà mó xíng huàn jué
Hanzi breakdown
大模型 (dà móxíng, large model) + 幻觉 (huànjué, hallucination) -> a false, AI-generated claim.
Meaning
A hallucination from a large AI model: fluent output that is false or unsupported.
The phrase appears in AI-literacy and product discussions as a reminder that fluent, confident language does not guarantee factual accuracy.
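The need for human verification can be sketched in code. The toy check below (hypothetical names, exact-string matching only; a real pipeline would need retrieval and semantic comparison) flags any model claim not found in a trusted source list as a candidate 大模型幻觉 for human review.

```python
# Minimal sketch: flag model claims unsupported by a trusted source list.
# TRUSTED_FACTS and needs_human_review are illustrative names, not a real API.

TRUSTED_FACTS = {
    "水的沸点在标准大气压下是100摄氏度",  # supported claim
}

def needs_human_review(claim: str) -> bool:
    """Return True when a claim has no support and should be hand-checked."""
    return claim not in TRUSTED_FACTS

claims = [
    "水的沸点在标准大气压下是100摄氏度",
    "鲁迅在1950年获得诺贝尔文学奖",  # fabricated: Lu Xun died in 1936
]
flagged = [c for c in claims if needs_human_review(c)]
print(flagged)  # only the unsupported claim is flagged
```

Exact-string lookup is deliberately naive: it illustrates the point in L14 that confidence is not evidence, and that anything outside a verified source set still needs a person to check it.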
Examples
- 这段引用像大模型幻觉。 This quote sounds like a large-model hallucination.
- 大模型幻觉需要人工核查。 Large-model hallucinations need to be checked by a person.
- 别把大模型幻觉当权威答案。 Don't treat a large-model hallucination as an authoritative answer.
Usage Guide
Context: AI literacy, work, product testing
Tone: cautionary, technical
Do Say
- 这段引用像大模型幻觉。
- 大模型幻觉需要人工核查。
Don't Say
- 把所有错误回答都叫大模型幻觉。 Don't label every wrong answer a 大模型幻觉.
Common Mistakes
- Do not call every bad answer 大模型幻觉; the term specifically means fluent but fabricated or unsupported output, not ordinary errors.
Origin & History
Adapts the English AI term “hallucination” into Chinese as 幻觉 and applies it to large language models.
Cultural Context
Era: 2020s
Generation: Tech users, creators, office workers, and startup communities
Social background: Urban professionals, students, and digital-product users
Regional notes: Used across Mainland China in tech, creator, and workplace contexts.