# Residual neural network


A **residual neural network** (also referred to as a **residual network** or **ResNet**)^{[1]} is a deep learning model in which the weight layers learn residual functions with reference to the layer inputs. It behaves like a highway network whose gates are opened through strongly positive bias weights.^{[2]} This allows deep learning models with tens or hundreds of layers to train easily and to gain accuracy as depth increases. Identity skip connections, often referred to as "residual connections", are also used in the 1997 LSTM networks,^{[3]} Transformer models (e.g., BERT, and GPT models such as ChatGPT), the AlphaGo Zero system, the AlphaStar system, and the AlphaFold system.
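The core idea can be sketched in a few lines: instead of asking the weight layers to learn a target mapping directly, a residual block computes a residual function F(x) and adds the input back through an identity skip connection, so the output is x + F(x). The following is a minimal NumPy sketch, not the original ResNet architecture; the toy two-layer F and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative residual block: the weight layers learn a residual
# function F(x); the identity skip connection adds the input back.
d = 4  # toy feature dimension (assumption for this sketch)
W1 = rng.normal(scale=0.1, size=(d, d))
W2 = rng.normal(scale=0.1, size=(d, d))

def residual_block(x):
    f = np.maximum(0.0, x @ W1) @ W2  # F(x): linear -> ReLU -> linear
    return x + f                      # output = x + F(x)

x = rng.normal(size=(1, d))
y = residual_block(x)
```

Because the skip path is the identity, a block whose weights are near zero passes its input through almost unchanged; stacking many such blocks therefore does not degrade the signal, which is what lets very deep networks train.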


Residual networks were developed by Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun, who won the 2015 ImageNet competition.^{[4]}^{[5]}