Hands-on ML

Example code and solutions to the exercises in my O'Reilly book Hands-on Machine Learning with Scikit-Learn and TensorFlow.

Book Content Index

Part I. The Fundamentals of Machine Learning

  1. The Machine Learning Landscape

    • What Is Machine Learning?
    • Why Use Machine Learning?
    • Types of Machine Learning Systems
    • Supervised/Unsupervised Learning
    • Batch and Online Learning
    • Instance-Based Versus Model-Based Learning
    • Main Challenges of Machine Learning
    • Insufficient Quantity of Training Data
    • Nonrepresentative Training Data
    • Poor-Quality Data
    • Irrelevant Features
    • Overfitting the Training Data
    • Underfitting the Training Data
    • Stepping Back
    • Testing and Validating
    • Hyperparameter Tuning and Model Selection
    • Data Mismatch

    Notebook: - The Machine Learning Landscape




2. End-to-end Machine Learning Project

    • Look at the big picture
    • Get the data
    • Discover and visualize the data to gain insights
    • Prepare the data for Machine Learning algorithms
    • Select a model and train it
    • Fine-tune your model
    • Present your solution
    • Launch, monitor, and maintain your system

    Notebook: - [End-to-end Machine Learning Project](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/02-End_to_End_Machine_Learning_Project-.ipynb)
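A minimal sketch of the chapter's prepare → train → fine-tune loop, using scikit-learn's built-in diabetes data and an illustrative hyperparameter grid as stand-ins for the notebook's own dataset and settings:

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Get the data (built-in stand-in dataset) and set aside a test set
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Prepare the data and select a model inside a single pipeline
pipe = Pipeline([("scaler", StandardScaler()),
                 ("forest", RandomForestRegressor(random_state=42))])

# Fine-tune the model with a small, illustrative grid search
param_grid = {"forest__n_estimators": [50, 100], "forest__max_depth": [None, 8]}
grid = GridSearchCV(pipe, param_grid, cv=3)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```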



3. Classification

    • MNIST
    • Training a Binary Classifier
    • Performance Measures
    • Multiclass Classification
    • Error Analysis
    • Multilabel Classification
    • Multioutput Classification

    Notebook: - [Classification](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/03-Classification.ipynb)
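A minimal sketch of training and evaluating a binary classifier, with scikit-learn's small built-in digits dataset assumed here in place of the full MNIST download:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import cross_val_predict

# Small built-in digits dataset assumed here instead of the full MNIST
X, y = load_digits(return_X_y=True)
y_is_5 = (y == 5)                    # binary target: "is this digit a 5?"

clf = SGDClassifier(random_state=42)
y_pred = cross_val_predict(clf, X, y_is_5, cv=3)   # out-of-fold predictions

print("precision:", precision_score(y_is_5, y_pred))
print("recall:   ", recall_score(y_is_5, y_pred))
```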



4. Training Models

    • Linear Regression
    • Gradient Descent
    • Polynomial Regression
    • Learning Curves
    • Regularized Linear Models
    • Logistic Regression

    Notebook: - [Training Models](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/04-Training_Models.ipynb)
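A minimal polynomial-regression sketch; the noisy quadratic toy data is an assumption, not the notebook's dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Noisy quadratic toy data (an assumption, not the notebook's dataset)
rng = np.random.default_rng(42)
X = 6 * rng.random((100, 1)) - 3
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(size=100)

# Polynomial regression: add polynomial features, then fit a plain linear model
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
lin_reg = LinearRegression().fit(X_poly, y)
print(lin_reg.intercept_, lin_reg.coef_)   # roughly 2 and [1, 0.5]
```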



5. Support Vector Machines

    • Linear SVM Classification
    • Nonlinear SVM Classification
    • SVM Regression
    • Under the Hood

    Notebook: - [Support Vector Machines](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/05-Support_Vector_Machines.ipynb)
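A minimal sketch of linear and nonlinear (RBF-kernel) SVM classification, with the built-in iris data assumed only to keep the example self-contained:

```python
from sklearn.datasets import load_iris
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, LinearSVC

# Built-in iris data, assumed here only to make the example self-contained
X, y = load_iris(return_X_y=True)

# Linear SVM classification (scaling the features matters for SVMs)
linear_clf = make_pipeline(StandardScaler(), LinearSVC(C=1, max_iter=10000, random_state=42))
linear_clf.fit(X, y)

# Nonlinear SVM classification with an RBF kernel
rbf_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale", C=1))
rbf_clf.fit(X, y)
print(rbf_clf.predict(X[:3]))
```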



6. Decision Trees

    • Training and Visualizing a Decision Tree
    • Making Predictions
    • Estimating Class Probabilities
    • The CART Training Algorithm
    • Computational Complexity
    • Gini Impurity or Entropy?
    • Regularization
    • Regression
    • Instability

    Notebook: - [Decision Trees](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/06-Decision_Trees.ipynb)
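A minimal sketch of training a regularized decision tree and estimating class probabilities, again with the built-in iris data as an assumed stand-in:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

# Built-in iris data, assumed here as a stand-in
X, y = load_iris(return_X_y=True)

# Train a shallow tree (max_depth acts as regularization)
tree_clf = DecisionTreeClassifier(max_depth=2, random_state=42)
tree_clf.fit(X, y)

print(tree_clf.predict_proba([[5.0, 1.5, 4.0, 1.2]]))   # estimated class probabilities
print(export_text(tree_clf))                             # text view of the learned splits
```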



7. Ensemble Learning and Random Forests

    • Voting Classifiers
    • Bagging and Pasting
    • Random Patches and Random Subspaces
    • Random Forests
    • Boosting
    • Stacking

    Notebook: - [Ensemble Learning and Random Forests](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/07-Ensemble_Models.ipynb)
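A minimal voting-ensemble sketch on the moons toy dataset (an illustrative choice, not necessarily the notebook's setup):

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Moons toy dataset (illustrative choice)
X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hard-voting ensemble of three diverse classifiers
voting_clf = VotingClassifier(estimators=[
    ("lr", LogisticRegression(random_state=42)),
    ("rf", RandomForestClassifier(random_state=42)),
    ("svc", SVC(random_state=42)),
])
voting_clf.fit(X_train, y_train)
print(voting_clf.score(X_test, y_test))
```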



8. Dimensionality Reduction

    • The Curse of Dimensionality

    Notebook: - [Dimensionality Reduction](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/08-Dimensionality_Reduction.ipynb)
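A minimal PCA sketch that reduces dimensionality while preserving 95% of the variance, using the built-in digits data as an assumed example:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# 64-dimensional digit images, assumed here as a convenient built-in example
X, _ = load_digits(return_X_y=True)

# Keep just enough components to preserve 95% of the variance
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape, pca.explained_variance_ratio_.sum())
```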



9. Unsupervised Learning Techniques

    • Clustering
    • Gaussian Mixtures

    Notebook: - [Unsupervised Learning Techniques](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/09-Unsupervised_Learning_Techniques.ipynb)
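A minimal clustering sketch with K-Means and a Gaussian mixture on synthetic blobs (illustrative data):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture

# Synthetic blobs (illustrative data)
X, _ = make_blobs(n_samples=500, centers=4, random_state=42)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X)
gm = GaussianMixture(n_components=4, random_state=42).fit(X)

print(kmeans.cluster_centers_)   # K-Means centroids
print(gm.predict(X[:5]))         # hard assignments from the (soft) Gaussian mixture
```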



10. Introduction to Artificial Neural Networks with Keras

    • From Biological to Artificial Neurons
    • Implementing MLPs with Keras
    • Fine-Tuning Neural Network Hyperparameters

    Notebook: - [Introduction to Artificial Neural Networks with Keras](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/10-Introduction_to_Artificial_Neural_Networks.ipynb)
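A minimal Keras MLP sketch; the random arrays stand in for an image-classification dataset, an assumption made purely to keep the example download-free:

```python
import numpy as np
from tensorflow import keras

# Random arrays stand in for an image dataset (assumption to avoid a download)
X = np.random.rand(1000, 28, 28).astype("float32")
y = np.random.randint(0, 10, size=1000)

# An MLP implemented with the Keras Sequential API
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(300, activation="relu"),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="sgd",
              metrics=["accuracy"])
model.fit(X, y, epochs=2, validation_split=0.1)
```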



11. Training Deep Neural Networks

    • The Vanishing/Exploding Gradients Problems
    • Reusing Pretrained Layers
    • Faster Optimizers
    • Avoiding Overfitting Through Regularization

    Notebook: - [Training Deep Neural Networks](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/11-Training_Deep_Neural_Networks.ipynb)
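A minimal sketch of the chapter's training ideas: He initialization, dropout regularization, a faster optimizer, and early stopping. The architecture and hyperparameters are illustrative assumptions, not the notebook's exact values:

```python
from tensorflow import keras

# A deeper net with He initialization, dropout regularization and a faster optimizer
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(300, activation="elu", kernel_initializer="he_normal"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(100, activation="elu", kernel_initializer="he_normal"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer=keras.optimizers.Nadam(learning_rate=1e-3),
              metrics=["accuracy"])

# Early stopping; the patience value is illustrative
early_stop = keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)
# model.fit(X_train, y_train, validation_data=(X_valid, y_valid), callbacks=[early_stop])
```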



12. Custom Models and Training with TensorFlow

    • A Quick Tour of TensorFlow
    • Using TensorFlow like NumPy
    • Customizing Models and Training Algorithms
    • TensorFlow Functions and Graphs

    Notebook: - [Custom Models and Training with TensorFlow](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/12-Custom_Models_and_TF.ipynb)
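A minimal sketch of customizing training components: a Huber-style custom loss written with TensorFlow ops, plus a small Python function compiled into a graph with tf.function. Both are illustrative, not the notebook's exact code:

```python
import tensorflow as tf

# A Huber-style custom loss written with TensorFlow ops (illustrative)
def huber_loss(y_true, y_pred, delta=1.0):
    error = y_true - y_pred
    is_small = tf.abs(error) <= delta
    squared = 0.5 * tf.square(error)
    linear = delta * (tf.abs(error) - 0.5 * delta)
    return tf.where(is_small, squared, linear)

# A Python function compiled into a TensorFlow graph
@tf.function
def sum_of_squares(n):
    total = tf.constant(0.0)
    for i in tf.range(n):
        total += tf.cast(i, tf.float32) ** 2
    return total

print(huber_loss(tf.constant([1.0, 3.0]), tf.constant([1.5, 0.0])))
print(sum_of_squares(tf.constant(10)))
```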



13. Loading and Preprocessing Data with TensorFlow

    • The Data API
    • The TFRecord Format
    • Preprocessing the Input Features
    • TF Transform
    • The TensorFlow Datasets (TFDS) Project

    Notebook: - [Loading and Preprocessing Data with TensorFlow](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/13-Loading_and_Prepocessing_Data_with_TF.ipynb)
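A minimal Data API sketch that shuffles, batches, and prefetches an in-memory tensor, plus writing a toy TFRecord file (the file name is an arbitrary assumption):

```python
import tensorflow as tf

# The Data API: build a pipeline that shuffles, batches and prefetches
X = tf.range(10, dtype=tf.float32)
dataset = tf.data.Dataset.from_tensor_slices(X)
dataset = dataset.shuffle(buffer_size=10, seed=42).batch(3).prefetch(1)
for batch in dataset:
    print(batch.numpy())

# Writing a toy TFRecord file (the file name is an arbitrary assumption)
with tf.io.TFRecordWriter("toy.tfrecord") as writer:
    writer.write(b"first record")
    writer.write(b"second record")
```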



14. Deep Computer Vision Using Convolutional Neural Networks

    • The Architecture of the Visual Cortex
    • Convolutional Layers
    • Pooling Layers
    • CNN Architectures
    • Implementing a ResNet-34 CNN Using Keras
    • Using Pretrained Models from Keras
    • Pretrained Models for Transfer Learning
    • Classification and Localization
    • Object Detection
    • Semantic Segmentation

    Notebook: - [Deep Computer Vision Using Convolutional Neural Networks](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/14_Deep_Computer_Vision_Using_Convolutional_Neural_Networks.ipynb)
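A minimal convolutional network sketch with alternating convolutional and pooling layers; the 28×28 grayscale input shape is an assumption chosen for brevity:

```python
from tensorflow import keras

# A small CNN: convolution + pooling blocks, then a dense head
# (the 28x28x1 input shape is an assumption chosen for brevity)
model = keras.Sequential([
    keras.layers.Conv2D(32, kernel_size=3, activation="relu", padding="same",
                        input_shape=(28, 28, 1)),
    keras.layers.MaxPooling2D(pool_size=2),
    keras.layers.Conv2D(64, kernel_size=3, activation="relu", padding="same"),
    keras.layers.MaxPooling2D(pool_size=2),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="nadam",
              metrics=["accuracy"])
model.summary()
```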



15. Processing Sequences Using RNNs and CNNs

    • Recurrent Neurons and Layers
    • Training RNNs
    • Forecasting a Time Series
    • Handling Long Sequences

    Notebook: - [Processing Sequences Using RNNs and CNNs](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/15_Processing_Sequences_Using_RNNs_and_CNNs.ipynb)
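A minimal time-series forecasting sketch with stacked SimpleRNN layers on a toy sine wave (the data and window length are illustrative assumptions):

```python
import numpy as np
from tensorflow import keras

# Toy sine-wave series (illustrative): predict the next value from the previous 50 steps
t = np.linspace(0, 100, 10050)
series = np.sin(t).astype("float32")
X = np.array([series[i:i + 50] for i in range(10000)])[..., np.newaxis]
y = series[50:10050]

# Two stacked recurrent layers followed by a dense output
model = keras.Sequential([
    keras.layers.SimpleRNN(20, return_sequences=True, input_shape=[None, 1]),
    keras.layers.SimpleRNN(20),
    keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="adam")
model.fit(X, y, epochs=1)
```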



16. Natural Language Processing with RNNs and Attention

    • Generating Shakespearean Text Using a Character RNN
    • Sentiment Analysis
    • An Encoder–Decoder Network for Neural Machine Translation
    • Attention Mechanisms
    • Recent Innovations in Language Models

    Notebook: - [Natural Language Processing with RNNs and Attention](https://nbviewer.jupyter.org/github/d-t-n/d-t-n/blob/master/16_Natural_Language_Processing_with_RNNs_and_Attention.ipynb)
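A minimal sentiment-analysis-style model sketch: token IDs pass through an embedding and stacked GRU layers to a single probability. The vocabulary size is an assumed placeholder:

```python
from tensorflow import keras

vocab_size = 10000   # assumed placeholder vocabulary size

# Token IDs -> embeddings -> stacked GRUs -> single sentiment probability
model = keras.Sequential([
    keras.layers.Embedding(vocab_size, 128, input_shape=[None]),
    keras.layers.GRU(128, return_sequences=True),
    keras.layers.GRU(128),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.summary()
```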
