In this example, we want to approximate the following scatter plot with a single layer neural network. Blue points are the training set, given by an input \( x_i \) and an expected output \( y'_i \). The red line is the output of the network \( y=f(x) \) after training.

The following perceptron will be used for the single layer network:

\( x \) is the input, and a constant input of 1 feeds the bias weight \( w_2 \). The activation function is the identity, \( f(x)=x \), so the network output is \( y = w_1 x + w_2 \).

As explained on the previous page, the weights will be updated according to this formula:

$$ w_i'= w_i + \eta(y'-y)x_i $$

Let's write this out for each weight \( w_1 \) and \( w_2 \) (the second input is the constant 1, so the \( x_i \) factor disappears from the \( w_2 \) update):

$$ w_1'= w_1 + \eta(y'-y)x $$

$$ w_2'= w_2 + \eta(y'-y) $$
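The update rules above can be sketched in a short training loop. The article's scatter-plot data is not given, so the points below are a hypothetical stand-in (noisy samples of a line); the learning rate and epoch count are also illustrative choices, not values from the article.

```python
import random

# Hypothetical training set: noisy samples of y' = 2x + 1 standing in for
# the article's scatter plot, which is not reproduced here.
random.seed(0)
data = [(x / 10.0, 2.0 * (x / 10.0) + 1.0 + random.uniform(-0.2, 0.2))
        for x in range(-20, 21)]

eta = 0.01          # learning rate (illustrative value)
w1, w2 = 0.0, 0.0   # w2 multiplies the constant bias input 1

for epoch in range(200):
    for x, y_target in data:
        y = w1 * x + w2          # identity activation: f(x) = x
        error = y_target - y     # (y' - y)
        w1 += eta * error * x    # w1' = w1 + eta * (y' - y) * x
        w2 += eta * error        # w2' = w2 + eta * (y' - y), since x2 = 1

print(w1, w2)
```

After training, `w1` and `w2` should approach the slope and intercept of the underlying line, which is exactly the red line \( y = f(x) \) drawn through the blue points.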

Last update : 01/30/2021