VibraXX
Live Quiz Arena
Question
Science

Which risk increases when a neural network's activation functions saturate?

A) Vanishing gradients during backpropagation
B) Exploding gradients during feedforward
C) Increased model generalization performance
D) Faster convergence during network training

💡 Explanation

When activation functions saturate, their derivatives approach zero, so each backpropagation step multiplies the error signal by a near-zero factor. The backpropagated gradient therefore shrinks layer by layer rather than growing stronger, and training stalls no matter how powerful the hardware.
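
To make the effect concrete, here is a minimal sketch in Python (an illustration, not part of the quiz) using the sigmoid activation, whose derivative σ(x)(1 - σ(x)) never exceeds 0.25. In the saturated region the local gradient is nearly zero, and chaining even the best-case gradients across many layers shrinks the error signal geometrically.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x: float) -> float:
    # Derivative of the sigmoid: s * (1 - s); its maximum is 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

# In the saturated region the local gradient is effectively zero.
print(f"gradient at x = 0:  {sigmoid_grad(0.0):.4f}")    # 0.2500
print(f"gradient at x = 10: {sigmoid_grad(10.0):.6f}")   # 0.000045

# Backpropagation multiplies these local gradients layer by layer, so even
# the best case (0.25 per layer) shrinks the error signal geometrically.
signal = 1.0
for _ in range(10):
    signal *= sigmoid_grad(0.0)
print(f"error signal after 10 layers: {signal:.2e}")     # ~9.54e-07
```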

