Receptron
Neuromorphic data-processing model
The receptron (short for "reservoir perceptron") is a neuromorphic data-processing model that generalizes the traditional perceptron by incorporating non-linear interactions between the inputs.[1][2][3] Unlike the classical perceptron, which relies on a linear combination of constant, independent weights, the receptron exploits the complexity of physical substrates,[4] such as the electrical conduction properties of nanostructured materials or optical speckle fields, to perform classification tasks.[5][6] The receptron thus bridges unconventional computing and neural-network principles,[7] enabling solutions that do not require the training procedures typical of artificial neural networks based on the perceptron model.[8]
Algorithm
The receptron is an algorithm for the supervised learning of binary classifiers, i.e., a classification algorithm that makes its predictions based on a predictor function combining a set of weights with the feature vector.[9] The mathematical model is based on a sum of the inputs with non-linear interactions:
$$\Sigma(x_1, \ldots, x_N) = \sum_{i=1}^{N} w_i(x_1, \ldots, x_N)\, x_i \qquad (1)$$

where the $x_i$ are the inputs and the $w_i(x_1, \ldots, x_N)$ are non-linear weight functions that depend on the inputs. This non-linearity typically makes the system extremely complex and allows it to solve problems that are not solvable through the simpler rules of a linear system such as the perceptron or the McCulloch–Pitts neuron, which are based on a sum with linearly independent weights:[10]
$$\Sigma = \sum_{i=1}^{N} w_i\, x_i \qquad (2)$$

where the $w_i$ are constant real values. A consequence of this simplicity is the limitation to linearly separable functions, which necessitates multi-layer architectures and training algorithms such as backpropagation.[11]
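To make the contrast between Eqs. 1 and 2 concrete, the following Python sketch (not from the cited literature; the function names are illustrative) computes both sums. Making one weight depend on another input is enough to introduce a mixed term that no constant-weight perceptron sum can produce.

    def perceptron_sum(w, x):
        # Eq. 2: constant weights, the output is linear in the inputs
        return sum(wi * xi for wi, xi in zip(w, x))

    def receptron_sum(weight_fns, x):
        # Eq. 1: each weight is itself a function of the full input vector
        return sum(wf(x) * xi for wf, xi in zip(weight_fns, x))

    # Example: w2(x) = 1 - 2*x1 yields Sigma = x1 + x2 - 2*x1*x2,
    # which contains a mixed x1*x2 term absent from any Eq. 2 sum.
    fns = [lambda x: 1.0, lambda x: 1.0 - 2.0 * x[0]]
    print(receptron_sum(fns, (1, 1)))  # 0.0 rather than the linear value 2.0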
As in the perceptron case,[12] the summation in Eq. 1 gives rise to the activation of the receptron output through a thresholding process,

$$y = \begin{cases} 1 & \text{if } \Sigma \geq th \\ 0 & \text{otherwise} \end{cases} \qquad (3)$$

where $th$ is a constant threshold parameter. Equation 3 can equivalently be written using the Heaviside step function as $y = \theta(\Sigma - th)$.
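A minimal sketch of the thresholded output of Eq. 3, again in illustrative Python (the names and the convention $\theta(0) = 1$ are choices made here, not taken from the source):

    def heaviside(s):
        # theta(s) = 1 for s >= 0, 0 otherwise
        return 1 if s >= 0 else 0

    def receptron_output(weight_fns, x, th):
        # Eq. 1 sum followed by the Eq. 3 threshold: y = theta(Sigma - th)
        sigma = sum(wf(x) * xi for wf, xi in zip(weight_fns, x))
        return heaviside(sigma - th)

    fns = [lambda x: 1.0, lambda x: 1.0]
    print(receptron_output(fns, (1, 0), 0.5))  # 1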
The weight functions can be written with a finite number of parameters, simplifying the model representation. One can Taylor-expand the $w_i$ and use the idempotency of Boolean variables ($x_i^n = x_i$) so that $\Sigma$ can be written as

$$\Sigma = \sum_{i} W_{i}\, x_i + \sum_{i<j} W_{ij}\, x_i x_j + \ldots + W_{12 \ldots N}\, x_1 x_2 \cdots x_N \qquad (4)$$

where the coefficients $W_{i}, W_{ij}, \ldots$ are independent parameters that can be seen as the components of a tensor (the "weight tensor").
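One possible way to evaluate the expansion of Eq. 4 is to store only the non-zero components of the weight tensor, here as a Python dictionary keyed by index tuples (a representation chosen for this sketch, not prescribed by the model):

    from itertools import combinations

    def tensor_sum(W, x):
        # Eq. 4: add W[idx] times the product of x[i] over every
        # non-empty index subset; missing components count as zero.
        n = len(x)
        sigma = 0.0
        for k in range(1, n + 1):
            for idx in combinations(range(n), k):
                coeff = W.get(idx, 0.0)
                if coeff:
                    prod = 1
                    for i in idx:
                        prod *= x[i]
                    sigma += coeff * prod
        return sigma

    # N = 3 example: Sigma = 2*x1 + x2*x3
    W = {(0,): 2.0, (1, 2): 1.0}
    print(tensor_sum(W, (1, 1, 1)))  # 3.0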
The sum in Eq. 4 reduces to the perceptron case when the off-diagonal terms of the weight tensor vanish. If one considers $N = 2$, one gets:

$$\Sigma = W_1 x_1 + W_2 x_2 + W_{12}\, x_1 x_2 \qquad (5)$$

In the perceptron case the vanishing of $W_{12}$ implies linearity: the response to both inputs together equals the sum of the responses to each input alone. In the receptron case $W_{12} \neq 0$, meaning that the superposition principle is no longer valid, with the mixed term responsible for the more complex non-linear interaction between the inputs.
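The breakdown of superposition can be checked directly from Eq. 5; the sketch below (illustrative Python with hypothetical parameter values) compares the response to both inputs together against the sum of the single-input responses:

    def sigma2(W1, W2, W12, x1, x2):
        # Eq. 5 for N = 2
        return W1 * x1 + W2 * x2 + W12 * x1 * x2

    # W12 = 0 (perceptron case): superposition holds
    print(sigma2(1, 1, 0, 1, 1),
          sigma2(1, 1, 0, 1, 0) + sigma2(1, 1, 0, 0, 1))   # 2 2

    # W12 = -2 (receptron case): superposition broken
    print(sigma2(1, 1, -2, 1, 1),
          sigma2(1, 1, -2, 1, 0) + sigma2(1, 1, -2, 0, 1))  # 0 2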
Design and implementations
1. Electrical Receptron
Substrate: Nanostructured and nanocomposite films (Au, Pt, Zr, Au/Zr). These films form disordered networks of nanoparticles exhibiting resistive switching and non-linear electrical conduction.
2. Optical Receptron
Substrate: Optical speckle fields generated by the random interference of light emerging from a disordered medium illuminated by a laser or another source of coherent radiation.[13]
Key features
Physical Substrate Computing: The receptron does not require digital training; instead, it exploits the natural complexity of materials (e.g., nanowire networks, diffractive media) to perform computations.
Non-Linear Separability: Unlike traditional perceptrons, which fail on problems such as the XOR function, the receptron can solve such tasks due to its inherent non-linearity (see the sketch after this list).
Training-Free Operation: Classification is achieved through the physical system's response rather than iterative weight adjustments, reducing computational overhead.
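As an illustration of the XOR claim above, the weight values below (chosen by hand for this sketch; the physical implementations obtain their weight functions from the substrate's response rather than by design) let a single unit of the Eq. 5 form reproduce the XOR truth table:

    def receptron_xor(x1, x2, th=0.5):
        # Eq. 5 with the hypothetical choice W1 = W2 = 1, W12 = -2,
        # thresholded as in Eq. 3
        sigma = x1 + x2 - 2 * x1 * x2
        return 1 if sigma >= th else 0

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, receptron_xor(x1, x2))
    # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0: the XOR truth table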
References