Live Quiz Arena
Question
Why does a neural machine translation system degrade in accuracy when processing sentences substantially longer than those in the training corpus?
A) Reduced attention span in encoder
B) Vanishing gradient in recurrent units
C) Overfitting to short sentence structures
D) Distributional shift outside n-gram context ✓
💡 Explanation
The training corpus defines the distribution of n-gram contexts the system learns to expect. Sentences substantially longer than any seen in training fall outside that distribution, so the model must extrapolate beyond its learned n-gram patterns, and accuracy degrades. The failure is a distributional shift, not overfitting to short structures or a gradient problem in the recurrent units.
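The distributional-shift argument can be made concrete with a toy sketch (a hypothetical illustration, not how an NMT system actually works): train a bigram inventory on short sentences only, then measure how many bigrams in a much longer sentence were never observed. Longer sentences chain clauses in ways the short-sentence corpus never exhibited, so the unseen-bigram rate rises.

```python
# Toy illustration of distributional shift on sentence length.
# Training corpus: short sentences only (4 tokens each).
train = [
    "the cat sat down",
    "the dog ran fast",
    "a bird flew home",
]

def bigrams(sentence):
    """All adjacent word pairs in a sentence."""
    tokens = sentence.split()
    return list(zip(tokens, tokens[1:]))

seen_bigrams = {bg for s in train for bg in bigrams(s)}

def unseen_bigram_rate(sentence):
    """Fraction of bigrams in `sentence` never observed in training."""
    bgs = bigrams(sentence)
    return sum(bg not in seen_bigrams for bg in bgs) / len(bgs)

# A short in-distribution sentence reuses only seen bigrams.
short_sent = "the cat sat down"
# A long sentence joins familiar clauses with connectives the
# corpus never showed, producing novel bigrams at the seams.
long_sent = "the cat sat down and the dog ran fast while a bird flew home"

print(unseen_bigram_rate(short_sent))  # 0.0
print(unseen_bigram_rate(long_sent))   # roughly 0.31 (4 of 13 bigrams unseen)
```

Real NMT models use learned continuous representations rather than literal bigram tables, but the same logic applies: contexts (and positions) absent from training force extrapolation.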
Related Questions
- Why does informal internet language often omit function words (e.g., 'to,' 'of,' 'the') in text messages?
- A radio transmitter encodes speech using Huffman coding. If unpredictable channel noise corrupts encoded data, which consequence follows?
- A novelist aims to evoke catharsis in readers; which constraint most directly compromises this goal through unreliable narration?
- Why does cross-linguistic semantic ambiguity pose a challenge to machine translation?
- In asynchronous computer-mediated communication, which mechanism explains why emoji usage correlates with higher perceived message ambiguity despite intending to clarify sentiment?
- Why does cross-domain metaphor comprehension in autistic individuals differ from neurotypical individuals when presented with abstract concepts?
