Live Quiz Arena
Question
Why does a chatbot using a Markov chain exhibit incoherence more readily than one using a transformer network?
A) Markov chains use lexical semantics
B) Transformers manage long-range dependencies better
C) Markov chains lack long-range context ✓
D) Transformers ignore surface-level features
💡 Explanation
Markov chains predict the next word based only on the preceding n words rather than the overall context, so coherence suffers from the absence of long-range dependencies. Transformers use attention mechanisms to relate words across the entire input, capturing relationships beyond the local window and producing more coherent output.
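The limitation described above can be sketched with a toy order-1 (bigram) Markov chain. This is a minimal illustration, not a production model; the corpus and function names are invented for the example. Note how the next-word choice consults only the single preceding word:

```python
import random
from collections import defaultdict

# Minimal bigram Markov chain: the next word depends ONLY on the
# single preceding word, so any long-range context (the sentence's
# subject, earlier clauses) is invisible to the model.

corpus = (
    "the cat sat on the mat . "
    "the dog sat on the log . "
    "the cat chased the dog ."
).split()

# transitions[word] -> list of words observed to follow it
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start, n_words, seed=0):
    """Generate up to n_words by sampling bigram transitions."""
    random.seed(seed)
    out = [start]
    for _ in range(n_words):
        followers = transitions.get(out[-1])
        if not followers:
            break
        # The choice ignores everything before out[-1]:
        # this is exactly the missing long-range context.
        out.append(random.choice(followers))
    return " ".join(out)

print(generate("the", 8))
```

Each generated word is locally plausible (every bigram was seen in the corpus), yet the chain can drift mid-sentence from "cat" to "dog" because nothing beyond the previous word constrains it. A transformer's attention, by contrast, can condition each prediction on the entire preceding sequence.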
Related Questions
Browse Language & Communication →
- Why does phonetic adaptation by L1 speakers degrade intelligibility in L2 writing systems with shallow orthography?
- Why does pronunciation variation persist within a language community despite mass media exposure?
- Why does 'minimal pair' identification in signed languages, such as American Sign Language (ASL), prove more challenging than in spoken languages?
- A social media platform restricts character count; which mechanism promotes emoji adoption in informal communication?
- An engineer localizing an educational video game for younger children chooses to simplify cultural references and gameplay mechanics. Which outcome is most likely from this design choice?
- Why does emoji usage differ significantly across online platforms, leading to potential misinterpretations?
