Perceptron

Kinder Chen
2 min read · Oct 8, 2021

An ANN (artificial neural network) is a Machine Learning model inspired by the networks of biological neurons found in the brain. An artificial neuron has one or more binary on/off inputs and one binary output. The artificial neuron activates its output when more than a certain number of its inputs are active.

The Perceptron is one of the simplest ANN architectures. It is based on a slightly different artificial neuron called a threshold logic unit (TLU). The inputs and output are numbers instead of binary on/off values, and each input connection is associated with a weight. The TLU computes a weighted sum of its inputs, then applies a step function to that sum and outputs the result. A Perceptron is simply composed of a single layer of TLUs, with each TLU connected to all the inputs. When all the neurons in a layer are connected to every neuron in the previous layer (i.e., its input neurons), the layer is called a fully connected layer, or a dense layer. The inputs of the Perceptron are fed to special passthrough neurons called input neurons, which together form the input layer. When the artificial neurons are TLUs, the activation function is a step function.
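The computation described above — a weighted sum followed by a step function — can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation; the shapes and names (`W`, `b`, `tlu_layer`) are assumptions chosen for clarity, with one column of `W` per TLU in the layer.

```python
import numpy as np

def step(z):
    # Heaviside step function: 1 if z >= 0, else 0
    return (z >= 0).astype(int)

def tlu_layer(X, W, b):
    # A fully connected layer of TLUs.
    # X: (n_samples, n_inputs), W: (n_inputs, n_units), b: (n_units,)
    # Each unit computes a weighted sum of the inputs plus a bias,
    # then applies the step function.
    return step(X @ W + b)

# Example: one TLU with weights [0.5, 0.5] and bias -0.6
X = np.array([[0.0, 0.0], [1.0, 1.0]])
W = np.array([[0.5], [0.5]])
b = np.array([-0.6])
out = tlu_layer(X, W, b)  # fires only when the weighted sum reaches 0
```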

When a biological neuron often triggers another neuron, the connection between the two grows stronger. The connection weight between two neurons tends to increase when they fire simultaneously, which is known as Hebb’s rule, or Hebbian learning. Perceptrons are trained using a variant of this rule that takes into account the error the network makes when it produces a prediction; the Perceptron learning rule reinforces connections that help reduce the error. More specifically, the Perceptron is fed one training instance at a time, and for each instance it makes its predictions. For every output neuron that produced a wrong prediction, it reinforces the connection weights from the inputs that would have contributed to the correct prediction.
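The update described above is usually written as w ← w + η(y − ŷ)x, where η is the learning rate. Below is a hedged sketch of that rule for a single-output Perceptron, trained on the logical AND function as a toy linearly separable problem; the function name, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def train_perceptron(X, y, eta=0.1, n_epochs=20):
    # Perceptron learning rule: for each instance,
    # w <- w + eta * (target - prediction) * x
    # The weights only change when the prediction is wrong.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        for xi, yi in zip(X, y):
            y_hat = 1 if xi @ w + b >= 0 else 0
            err = yi - y_hat
            w += eta * err * xi
            b += eta * err
    return w, b

# Logical AND: output 1 only when both inputs are 1
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [1 if xi @ w + b >= 0 else 0 for xi in X]  # → [0, 0, 0, 1]
```

Because the data is linearly separable, the Perceptron convergence theorem guarantees this loop eventually stops making errors; on data that is not linearly separable (e.g., XOR), it never converges.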

