This project was forked from antonioerdeljac/python-tensorflow-google.
Notes taken from the Google Machine Learning Course, provided to the public for practice & correction.
Google Machine Learning Course Notes
Wiki
Framing
In this section we learn the basics of machine learning terminology.
Descending into Machine Learning
In this section we work with linear regression, learn about MSE and loss calculation, and cover the basics of how training a model works.
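As a quick sketch of the loss described above, here is mean squared error in plain Python (illustrative values rather than course data):

```python
# Mean squared error: the average of squared differences between
# predictions and labels. The sample values below are made up.
def mse(predictions, labels):
    n = len(predictions)
    return sum((p - y) ** 2 for p, y in zip(predictions, labels)) / n

print(mse([2.0, 3.0], [1.0, 5.0]))  # (1 + 4) / 2 = 2.5
```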
Reducing Loss
In this section we explore loss reduction methods, explaining gradient descent, batches, iterative learning, and other effective learning methods.
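A minimal sketch of gradient descent on a one-dimensional convex loss; the loss function, learning rate, and step count are all illustrative choices, not the course's exact exercise:

```python
# Iteratively step opposite the gradient of L(w) = (w - 3)^2,
# whose minimum is at w = 3.
def descend(lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # dL/dw
        w -= lr * grad      # move downhill by lr * gradient
    return w

print(descend())  # converges very close to the minimum, w = 3
```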
First Steps With Tensorflow
In this section we learn the basics of TensorFlow and Pandas. Through the linked exercises we develop our own linear regression code.
Generalization
In this section we discuss the problem of overfitting, learn the difference between a good and a bad model, and learn about the subsets used in model training & generalization.
Training and test sets
In this section we learn about data splitting, the dangers of training on test data, and the characteristics of a good test set.
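The split above can be sketched as a shuffle-then-cut helper (the 80/20 fraction and fixed seed are illustrative, not prescribed by the course):

```python
import random

# Hold out a fraction of examples for testing so the model is
# never evaluated on data it was trained on.
def train_test_split(examples, test_fraction=0.2, seed=42):
    rng = random.Random(seed)       # fixed seed for reproducibility
    shuffled = examples[:]
    rng.shuffle(shuffled)           # shuffle before splitting
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(10)))
print(len(train), len(test))  # 8 2
```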
Validation
In this section we cover the importance of validation, a third partition of the dataset.
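A sketch of the three-way partition: hyperparameters are tuned against the validation set, and the test set is touched only once at the end. The 60/20/20 fractions are an illustrative choice:

```python
# Split a dataset into train / validation / test partitions.
def three_way_split(examples, val_fraction=0.2, test_fraction=0.2):
    n = len(examples)
    n_test = int(n * test_fraction)
    n_val = int(n * val_fraction)
    test = examples[:n_test]
    val = examples[n_test:n_test + n_val]
    train = examples[n_test + n_val:]
    return train, val, test

train, val, test = three_way_split(list(range(10)))
print(len(train), len(val), len(test))  # 6 2 2
```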
Representation
In this section we discuss the qualities of good features, learn about feature engineering, and map raw values to useful features.
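One common mapping from the section above is one-hot encoding a categorical value into a sparse binary vector; the vocabulary here is illustrative:

```python
# Map a categorical value to a binary vector with a 1 in the
# position of the matching vocabulary entry.
def one_hot(value, vocabulary):
    return [1 if v == value else 0 for v in vocabulary]

vocab = ["red", "green", "blue"]
print(one_hot("green", vocab))  # [0, 1, 0]
```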
Feature Crosses
In this section we look into feature crosses, synthetic features used to improve a model's learning & encode non-linear data into useful features.
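A minimal sketch of a feature cross: combine two categorical features into one synthetic feature so a linear model can learn a separate weight per combination. The binned latitude/longitude names are hypothetical:

```python
# A feature cross concatenates two categorical feature values into
# one synthetic feature name (one bucket per combination).
def feature_cross(a_value, b_value):
    return f"{a_value} x {b_value}"

# Crossing binned latitude with binned longitude yields one feature per
# grid cell, letting a linear model fit location-dependent patterns.
print(feature_cross("lat_bin_3", "lng_bin_7"))  # lat_bin_3 x lng_bin_7
```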
Regularization: Simplicity
In this section we look into ways of penalizing a model for being too complex using L2 regularization.
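The penalty above can be sketched directly: L2 regularization adds the sum of squared weights, scaled by a regularization strength, to the data loss. The strength value here is illustrative:

```python
# L2 penalty: reg_strength * sum of squared weights.
def l2_penalty(weights, reg_strength=0.01):
    return reg_strength * sum(w * w for w in weights)

# Total loss = data loss + complexity penalty, so large weights cost more.
def regularized_loss(data_loss, weights, reg_strength=0.01):
    return data_loss + l2_penalty(weights, reg_strength)

print(regularized_loss(1.0, [3.0, -4.0]))  # 1.0 + 0.01 * 25 = 1.25
```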
Logistic Regression
In this section we look into Logistic Regression to calculate probability, and dive deeper into its loss function.
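A sketch of the two pieces named above: the sigmoid squashes a linear score into a probability, and log loss (rather than squared error) scores that probability against the 0/1 label:

```python
import math

# Sigmoid: maps any real-valued score z to a probability in (0, 1).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Log loss for a single example; y is 0 or 1, p is the predicted
# probability of the positive class.
def log_loss(p, y):
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

p = sigmoid(0.0)
print(p)               # 0.5 -- a score of 0 is maximally uncertain
print(log_loss(p, 1))  # -ln(0.5), about 0.693
```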
Classification
In this section we dive into evaluating the precision and recall of logistic regression, as well as ROC curves & AUC.
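The two metrics above, computed from binary predictions at a chosen classification threshold (the prediction/label vectors below are illustrative):

```python
# Precision = TP / (TP + FP): of everything predicted positive, how much was right.
# Recall    = TP / (TP + FN): of everything actually positive, how much was found.
def precision_recall(predictions, labels):
    tp = sum(1 for p, y in zip(predictions, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(predictions, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(predictions, labels) if p == 0 and y == 1)
    return tp / (tp + fp), tp / (tp + fn)

preds  = [1, 1, 0, 0]
labels = [1, 0, 1, 0]
print(precision_recall(preds, labels))  # (0.5, 0.5)
```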
Regularization: Sparsity
In this section we learn the differences between L1 & L2 regularization, and how L1 brings uninformative weights exactly to 0 while L2 only brings them close to 0.
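To see the difference, here is one gradient step on the penalty term alone (learning rate and regularization strength are illustrative): the L2 gradient shrinks a weight proportionally and never reaches zero, while L1's constant-magnitude subgradient can push a small weight exactly to zero:

```python
# One step on the L2 penalty lam * w^2 (derivative 2 * lam * w):
# the weight shrinks proportionally, so it never hits exactly zero.
def l2_step(w, lr=0.1, lam=1.0):
    return w - lr * lam * 2 * w

# One step on the L1 penalty lam * |w| (subgradient has magnitude lam):
# clip to zero rather than letting the step cross it.
def l1_step(w, lr=0.1, lam=1.0):
    step = lr * lam
    if abs(w) <= step:
        return 0.0
    return w - step if w > 0 else w + step

print(l2_step(0.05))  # about 0.04 -- smaller, but still nonzero
print(l1_step(0.05))  # 0.0 -- driven exactly to zero (sparsity)
```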
Neural Networks
In this section we learn how to solve non-linear problems with Neural Networks. We dive into the basics of Neural Network structure & how it all works.
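A sketch of the forward pass of a tiny one-hidden-layer network; the ReLU nonlinearity is what lets it represent non-linear functions. The weights below are hand-picked illustrative values (this particular pair of hidden units computes |x1 - x2|, which no linear model can):

```python
# ReLU activation: zero out negative values.
def relu(x):
    return max(0.0, x)

# Forward pass: hidden layer with ReLU, then a linear output layer.
def forward(x, hidden_weights, output_weights):
    hidden = [relu(sum(w * xi for w, xi in zip(ws, x)))
              for ws in hidden_weights]
    return sum(w * h for w, h in zip(output_weights, hidden))

hidden_weights = [[1.0, -1.0], [-1.0, 1.0]]  # two hidden units
output_weights = [1.0, 1.0]
print(forward([2.0, 0.5], hidden_weights, output_weights))  # |2 - 0.5| = 1.5
```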
Training Neural Networks
In this section we dive into backpropagation, the algorithm used to train Neural Networks.
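A minimal sketch of the idea on a single sigmoid neuron with squared-error loss: the chain rule carries the loss gradient backwards to the weight, which then takes a gradient step. Input, target, learning rate, and step count are all illustrative:

```python
import math

# Train one sigmoid neuron p = sigmoid(w * x) to match target y,
# using the chain rule to backpropagate dL/dw.
def train_neuron(x, y, w=0.0, lr=1.0, steps=200):
    for _ in range(steps):
        z = w * x
        p = 1.0 / (1.0 + math.exp(-z))   # forward pass
        dL_dp = 2 * (p - y)              # loss = (p - y)^2
        dp_dz = p * (1 - p)              # sigmoid derivative
        dz_dw = x
        w -= lr * dL_dp * dp_dz * dz_dw  # chain rule + gradient step
    return w

w = train_neuron(x=1.0, y=1.0)
print(w)  # grows until sigmoid(w) approaches the target 1.0
```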
Multi Class Neural Networks
In this section we look into multi-class neural networks, which come closest to real-world machine learning applications such as recognizing cars, faces, poses, etc.
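The standard output layer for such networks is softmax, which turns a vector of per-class scores into probabilities that sum to 1; a sketch with illustrative logits:

```python
import math

# Softmax: exponentiate each logit and normalize. Subtracting the max
# first is a standard trick for numerical stability.
def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.0])
print(probs)       # highest logit gets the highest probability
print(sum(probs))  # probabilities sum to 1
```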
Embeddings
In this section we learn about embeddings & how they are used to translate large sparse vectors into a lower-dimensional space.
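The translation above amounts to a table lookup: each item id maps to a small dense vector instead of a huge sparse one-hot vector. The vocabulary size, embedding dimension, and random initial values below are illustrative (real embeddings are learned during training):

```python
import random

random.seed(0)
vocab_size, dim = 1000, 4

# Embedding table: one small dense vector per item id.
embedding_table = [[random.uniform(-1, 1) for _ in range(dim)]
                   for _ in range(vocab_size)]

# A lookup replaces a 1000-dimensional one-hot vector with a
# 4-dimensional dense one.
def embed(item_id):
    return embedding_table[item_id]

print(len(embed(42)))  # 4
```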