
Vanishing gradients emerged as a problem earlier than degradation. The vanishing gradient problem is the phenomenon in which, as the number of layers in a neural network increases, the gradients become too small for the network to keep learning. This problem was largely addressed by normalization techniques and the ReLU activation function. Once that problem was tackled and people tried to make networks even deeper, the degradation problem emerged: with more layers, the network's error rate actually got higher. ResNet was introduced to address that problem and achieved great success.
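Below is a minimal sketch of the kind of residual block ResNet uses, written in PyTorch for illustration; the layer sizes and names are assumptions, not the exact ResNet configuration. The identity shortcut is what lets very deep stacks train without degrading.

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A basic residual block: output = F(x) + x (identity shortcut)."""
    def __init__(self, channels):
        super().__init__()
        # Two conv layers form the residual function F(x); batch norm and
        # ReLU also help with the vanishing-gradient issue mentioned above.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # The identity shortcut lets the block fall back to an identity
        # mapping, which is what counters the degradation problem.
        return self.relu(out + x)
```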

Isn't this a remarkably clean and well-written explanation?