Neural network curve fitting

This page presents a neural network curve fitting example, showing step by step how to perform nonlinear regression with TensorFlow.

This example was tested with the following versions:

Try the example online on Google Colaboratory.

Problem definition

The goal of this example is to approximate a nonlinear function given by the following equation:

$$ y = 0.1 \, x \cos(x) $$

The blue dots are the training set, and the red line is the output of the network:

Source code

Each line is explained in the next section. The source code and example can be run online on Google Colaboratory.

Explanation

First, we import the libraries:

import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras
from google.colab import files
import tensorflow as tf
import math
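
Since the exact library versions are not listed above, you can check which TensorFlow version your environment provides (this check is not part of the original example):

# Display the TensorFlow version (optional check, not in the original listing)
print(tf.__version__)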

Then, we create the training data: x_data is composed of 1000 points, and Gaussian noise is added to the y-coordinate of each point:

# Create noisy data
x_data = np.linspace(-10, 10, num=1000)
y_data = 0.1*x_data*np.cos(x_data) + 0.1*np.random.normal(size=1000)
print('Data created successfully')
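
For reproducible results, the random seed can be fixed before the noise is generated (an optional addition, not part of the original code):

# Optional: fix the NumPy random seed so the noisy data is reproducible
np.random.seed(0)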

Here is the training set:

Training dataset for the neural network nonlinear regression
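
If you want to reproduce this figure yourself, a minimal plot of the arrays defined above is enough (this snippet is not part of the original listing):

# Plot the noisy training samples together with the noiseless target curve
plt.scatter(x_data, y_data, s=2, label='noisy samples')
plt.plot(x_data, 0.1*x_data*np.cos(x_data), 'k', label='0.1 x cos(x)')
plt.legend()
plt.grid()
plt.show()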

Once our training dataset is built, we can create our network:

ReLU is probably not the best choice for this application, but it works fine here. ELU should provide smoother results; a variant using ELU is sketched after the model definition below.

# Create the model 
model = keras.Sequential()
model.add(keras.layers.Dense(units = 1, activation = 'linear', input_shape=[1]))
model.add(keras.layers.Dense(units = 64, activation = 'relu'))
model.add(keras.layers.Dense(units = 64, activation = 'relu'))
model.add(keras.layers.Dense(units = 1, activation = 'linear'))
model.compile(loss='mse', optimizer="adam")

# Display the model
model.summary()

The model is compiled with the mean squared error (MSE) loss and the Adam optimizer.
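
As mentioned above, an ELU-based variant is a reasonable alternative; a sketch with the same layer sizes could look like this (model_elu is an illustrative name, not part of the original code):

# Alternative model using ELU activations (sketch, same architecture as above)
model_elu = keras.Sequential()
model_elu.add(keras.layers.Dense(units = 1, activation = 'linear', input_shape=[1]))
model_elu.add(keras.layers.Dense(units = 64, activation = 'elu'))
model_elu.add(keras.layers.Dense(units = 64, activation = 'elu'))
model_elu.add(keras.layers.Dense(units = 1, activation = 'linear'))
model_elu.compile(loss='mse', optimizer='adam')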

Once the model is defined, let's train our network:

# Training
model.fit( x_data, y_data, epochs=100, verbose=1)

It should display something like the following (the loss should decrease from epoch to epoch):

Train on 1000 samples
Epoch 1/100
1000/1000 [==============================] - 0s 321us/sample - loss: 0.2125
Epoch 2/100
1000/1000 [==============================] - 0s 49us/sample - loss: 0.1914
Epoch 3/100
1000/1000 [==============================] - 0s 50us/sample - loss: 0.1932
Epoch 4/100
1000/1000 [==============================] - 0s 60us/sample - loss: 0.1922

...

Epoch 97/100
1000/1000 [==============================] - 0s 59us/sample - loss: 0.0180
Epoch 98/100
1000/1000 [==============================] - 0s 53us/sample - loss: 0.0188
Epoch 99/100
1000/1000 [==============================] - 0s 54us/sample - loss: 0.0161
Epoch 100/100
1000/1000 [==============================] - 0s 55us/sample - loss: 0.0147
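
If you prefer to plot the loss curve instead of reading the log, model.fit returns a History object whose per-epoch loss can be displayed; a minimal sketch (the history variable is not in the original code, and this call replaces the fit above):

# Keep the training history and plot the per-epoch loss
history = model.fit(x_data, y_data, epochs=100, verbose=0)
plt.plot(history.history['loss'])
plt.xlabel('Epoch')
plt.ylabel('MSE loss')
plt.grid()
plt.show()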

Once training is over, we can predict and display the output for each input:

# Compute the output 
y_predicted = model.predict(x_data)

# Display the result
plt.scatter(x_data, y_data)
plt.plot(x_data, y_predicted, 'r', linewidth=4)
plt.grid()
plt.show()

Here is the result:

nonlinear regression results with TensorFlow
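
The trained network can also be evaluated at inputs that were not in the training set, for example on a coarser grid (x_test is an illustrative name, not part of the original code):

# Predict on new points and compare with the noiseless target function (sketch)
x_test = np.linspace(-10, 10, num=200)
y_test = model.predict(x_test)
plt.plot(x_test, 0.1*x_test*np.cos(x_test), 'b', label='target')
plt.plot(x_test, y_test, 'r--', label='network prediction')
plt.legend()
plt.grid()
plt.show()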

Source code

You can try this example online on Google Colaboratory.

Last update: 09/02/2022