This is a basic machine learning program that fits a linear model f(x) = ax + b (i.e. y = ax + b) to a dataset of (x, y) pairs.
The dataset is generated with the make_regression() function from the scikit-learn library.
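A minimal sketch of the dataset setup, assuming 100 samples and a single feature (those numbers, and the noise level, are illustrative choices, not fixed by the project):

```python
import numpy as np
from sklearn.datasets import make_regression

# Generate a 1-D regression dataset: y is roughly a linear function
# of x plus Gaussian noise.
x, y = make_regression(n_samples=100, n_features=1, noise=10)
y = y.reshape(y.shape[0], 1)  # reshape y into a column vector

# Build the design matrix X = [x, 1] so the model a*x + b can be
# written in matrix form as X.dot(theta), with theta = [a, b]^T.
X = np.hstack((x, np.ones((x.shape[0], 1))))
```

The column of ones lets the intercept b be handled by the same matrix product as the slope a.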
Then we create a model with random parameters. This is not a good model: its predictions are far from the data. Our job is to find the parameters that minimize those errors.
We define a cost function that measures the model's total error over the dataset as a sum of squared differences between predictions and targets.
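The model and cost function might look like this, assuming the design-matrix convention X = [x, 1] and a mean-squared-error cost (the names `model`, `cost_function`, and the 1/(2m) scaling are one common convention, not necessarily the author's exact code):

```python
import numpy as np

def model(X, theta):
    # f(x) = a*x + b in matrix form: X is [x, 1], theta is [a, b]^T
    return X.dot(theta)

def cost_function(X, y, theta):
    # Mean squared error, halved so the gradient comes out cleaner
    m = len(y)
    return 1 / (2 * m) * np.sum((model(X, theta) - y) ** 2)

# A random starting model, as described above; it will fit poorly.
theta = np.random.randn(2, 1)
```

With random theta the cost is large; the whole point of training is to drive it down.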
We use the gradient descent method to minimize the cost function.
Gradient descent then yields the fitted parameters a and b, i.e. the right model.
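A sketch of the gradient-descent step under the same mean-squared-error cost (the learning rate and iteration count are hyperparameters you would tune; the function names here are illustrative):

```python
import numpy as np

def grad(X, y, theta):
    # Gradient of the halved mean-squared-error cost w.r.t. theta
    m = len(y)
    return 1 / m * X.T.dot(X.dot(theta) - y)

def gradient_descent(X, y, theta, learning_rate, n_iterations):
    # Repeatedly step opposite the gradient to shrink the cost
    for _ in range(n_iterations):
        theta = theta - learning_rate * grad(X, y, theta)
    return theta
```

Each iteration nudges theta = [a, b]^T downhill on the cost surface, so after enough iterations the random starting model converges to a good fit.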
And this is it !
I did it for fun !