Live Quiz Arena
🎁 1 Free Round Daily
⚡ Enter Arena
Question
Which risk increases when a neural network's activation functions saturate?
A) Vanishing gradient during backpropagation ✓
B) Exploding gradients during feedforward
C) Increased model generalization performance
D) Faster convergence during network training
💡 Explanation
When an activation function saturates, its derivative becomes near-zero, so the vanishing-gradient risk increases and gradient descent can no longer update earlier layers effectively. The backpropagated error signal diminishes rather than strengthens as it passes through each saturated layer, which stalls training even on powerful hardware.
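The effect is easy to see numerically. The short sketch below (illustrative only; the quiz question names no specific activation, so the sigmoid is assumed here) evaluates the sigmoid's derivative in and out of the saturated regime, then multiplies one such factor per layer to mimic how backpropagation shrinks the error signal with depth:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# At x = 0 the sigmoid's derivative peaks at 0.25; deep in the
# saturated regime (|x| large) it is nearly zero.
g_active = sigmoid_grad(0.0)     # 0.25
g_saturated = sigmoid_grad(6.0)  # roughly 0.0025

# Backpropagation multiplies one derivative factor per layer, so a
# saturated 10-layer network shrinks the error signal geometrically.
depth = 10
signal = 1.0
for _ in range(depth):
    signal *= g_saturated
# signal is now vanishingly small: effectively no learning signal
# reaches the early layers.
```

Each saturated layer scales the gradient by roughly 0.0025, so after ten layers the signal has collapsed by more than twenty orders of magnitude, which is exactly the stalled-training symptom the explanation describes.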
🏆 Up to £1,000 monthly prize pool
Ready for the live challenge? Join the next global round now.
*Terms apply. Skill-based competition.
Related Questions
Browse Science →
- Which change occurs to spacetime around rotating black holes?
- Which mechanism causes reduced power transmission efficiency in long underwater power cables?
- Within a Haber-Bosch reactor, which consequence results when temperature decreases, despite unchanging rate constants?
- Which outcome results when gravitational potential intensifies near a neutron star?
- Which outcome accelerates when a radioactive isotope exceeds criticality?
- Which outcome occurs when insufficient binding energy exists within nucleons?
