- Authors: Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
- Published in: CVPR 2016
- Abstract Essence: This paper introduced the Residual Network (ResNet), which uses residual learning to ease the training of networks substantially deeper than those used previously. ResNet architectures, most notably the 152-layer network, achieved state-of-the-art performance on the ImageNet dataset and won 1st place in the ILSVRC 2015 classification task. The introduction of skip (shortcut) connections marked a significant advance in neural network design.
- Access: Available on arXiv and through the CVPR conference website.
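The core idea can be sketched in a few lines: instead of learning a mapping H(x) directly, a block learns the residual F(x) = H(x) - x and outputs F(x) + x via a skip connection. Below is a minimal NumPy illustration of one residual block; the two-layer structure, sizes, and omission of biases and batch normalization are simplifications for clarity, not the paper's exact architecture.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    # F(x): two linear layers with a ReLU in between
    # (biases and batch norm omitted for brevity).
    fx = relu(x @ w1) @ w2
    # Skip connection: the block outputs F(x) + x, so the stacked
    # layers only need to learn the residual H(x) - x. If the
    # optimal mapping is close to identity, F(x) can simply go to 0.
    return relu(fx + x)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)               # input activation (hypothetical size)
w1 = rng.standard_normal((8, 8)) * 0.1   # hypothetical small init
w2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)
```

Note that when the weights are zero the block reduces to relu(x), i.e. it passes the input through essentially unchanged, which is the property that makes very deep stacks of such blocks trainable.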
Views: