This project forked from ssq/coursera-uw-machine-learning-regression

License: MIT License

Coursera UW Machine Learning: Regression

The course can be found here

A notebook for quick search can be found here

  • Week 1: Simple Linear Regression:

    • Describe the input (features) and output (real-valued predictions) of a regression model
    • Calculate a goodness-of-fit metric (e.g., RSS)
    • Estimate model parameters to minimize RSS using gradient descent
    • Interpret estimated model parameters
    • Exploit the estimated model to form predictions
    • Discuss the possible influence of high leverage points
    • Describe intuitively how the fitted line might change when assuming different goodness-of-fit metrics
    • Fitting a simple linear regression model on housing data
  • Week 2: Multiple Regression: Linear regression with multiple features

    • Describe polynomial regression
    • Detrend a time series using trend and seasonal components
    • Write a regression model using multiple inputs or features thereof
    • Cast both polynomial regression and regression with multiple inputs as regression with multiple features
    • Calculate a goodness-of-fit metric (e.g., RSS)
    • Estimate model parameters of a general multiple regression model to minimize RSS:
      • In closed form
      • Using an iterative gradient descent algorithm
    • Interpret the coefficients of a non-featurized multiple regression fit
    • Exploit the estimated model to form predictions
    • Explain applications of multiple regression beyond house price modeling
    • Exploring different multiple regression models for house price prediction
    • Implementing gradient descent for multiple regression
  • Week 3: Assessing Performance

    • Describe what a loss function is and give examples
    • Contrast training, generalization, and test error
    • Compute training and test error given a loss function
    • Discuss issue of assessing performance on training set
    • Describe tradeoffs in forming training/test splits
    • List and interpret the 3 sources of avg. prediction error
      • Irreducible error, bias, and variance
    • Discuss issue of selecting model complexity on test data and then using test error to assess generalization error
    • Motivate use of a validation set for selecting tuning parameters (e.g., model complexity)
    • Describe overall regression workflow
    • Exploring the bias-variance tradeoff
  • Week 4: Ridge Regression

    • Describe what happens to magnitude of estimated coefficients when model is overfit
    • Motivate form of ridge regression cost function
    • Describe what happens to estimated coefficients of ridge regression as tuning parameter λ is varied
    • Interpret coefficient path plot
    • Estimate ridge regression parameters:
      • In closed form
      • Using an iterative gradient descent algorithm
    • Implement K-fold cross validation to select the ridge regression tuning parameter λ
    • Observing effects of L2 penalty in polynomial regression
    • Implementing ridge regression via gradient descent
  • Week 5: Lasso Regression: Regularization for feature selection

    • Perform feature selection using “all subsets” and “forward stepwise” algorithms
    • Analyze computational costs of these algorithms
    • Contrast greedy and optimal algorithms
    • Formulate lasso objective
    • Describe what happens to estimated lasso coefficients as tuning parameter λ is varied
    • Interpret lasso coefficient path plot
    • Contrast ridge and lasso regression
    • Describe geometrically why L1 penalty leads to sparsity
    • Estimate lasso regression parameters using an iterative coordinate descent algorithm
    • Implement K-fold cross validation to select lasso tuning parameter λ
    • Using LASSO to select features
    • Implementing LASSO using coordinate descent
  • Week 6: Going nonparametric: Nearest neighbor and kernel regression

    • Motivate the use of nearest neighbor (NN) regression
    • Define distance metrics in 1D and multiple dimensions
    • Perform NN and k-NN regression
    • Analyze computational costs of these algorithms
    • Discuss sensitivity of NN to lack of data, dimensionality, and noise
    • Perform weighted k-NN and define weights using a kernel
    • Define and implement kernel regression
    • Describe the effect of varying the kernel bandwidth λ or # of nearest neighbors k
    • Select λ or k using cross validation
    • Compare and contrast kernel regression with a global average fit
    • Define what makes an approach nonparametric and why NN and kernel regression are considered nonparametric methods
    • Analyze the limiting behavior of NN regression
    • Use NN for classification
    • Predicting house prices using k-nearest neighbors regression
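The Week 1 recipe — estimate the intercept and slope by minimizing RSS with gradient descent — can be sketched as below. The data, step size, and iteration count are made up for illustration, not taken from the course assignments:

```python
import numpy as np

def fit_simple_linear(x, y, step=0.01, n_iters=5000):
    """Fit y ~ w0 + w1*x by gradient descent on RSS = sum((w0 + w1*x - y)^2)."""
    w0, w1 = 0.0, 0.0
    n = len(x)
    for _ in range(n_iters):
        err = (w0 + w1 * x) - y
        # Partial derivatives of RSS with respect to w0 and w1
        g0 = 2.0 * err.sum()
        g1 = 2.0 * (err * x).sum()
        w0 -= step * g0 / n  # scale by n so the step size is less data-size dependent
        w1 -= step * g1 / n
    return w0, w1

# Hypothetical noiseless data on the line y = 2 + 3x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 + 3.0 * x
w0, w1 = fit_simple_linear(x, y)  # converges near (2.0, 3.0)
```

With noiseless data the RSS minimum is the true line, so the estimates should land close to (2, 3); on real housing data the fit would only approximate the targets.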
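The Week 2 closed-form estimate is the least-squares normal equation, w = (H^T H)^{-1} H^T y, where H stacks the features (plus a constant column for the intercept). A minimal sketch on hypothetical data:

```python
import numpy as np

def fit_multiple_regression(H, y):
    """Closed-form least squares: solve (H^T H) w = H^T y."""
    return np.linalg.solve(H.T @ H, H.T @ y)

# Hypothetical data: two features plus a constant (intercept) feature
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.0], [4.0, 3.0]])
H = np.column_stack([np.ones(len(X)), X])
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1]  # noiseless, so the fit is exact
w = fit_multiple_regression(H, y)  # recovers [1.0, 2.0, -0.5]
```

Polynomial regression fits the same way: the columns of H just become powers of a single input, which is how both cases reduce to regression with multiple features.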
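Computing training versus test error under a squared-error loss, as in Week 3, might look like the following sketch; the synthetic data and the degree-5 polynomial (standing in for an overly complex model) are illustrative assumptions:

```python
import numpy as np

def avg_squared_error(y_true, y_pred):
    """Average squared loss over a dataset."""
    return float(np.mean((y_true - y_pred) ** 2))

# Synthetic data: a noisy line
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 40)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, 40)

# Hold out the last 10 points as a test set
x_tr, y_tr = x[:30], y[:30]
x_te, y_te = x[30:], y[30:]

errors = {}
for deg in (1, 5):  # a simple fit vs. a more complex polynomial fit
    coefs = np.polyfit(x_tr, y_tr, deg)
    errors[deg] = (avg_squared_error(y_tr, np.polyval(coefs, x_tr)),
                   avg_squared_error(y_te, np.polyval(coefs, x_te)))
# Training error can only fall as the degree grows; test error need not.
```

This is exactly the issue the objectives flag: the training error of the nested higher-degree model is never worse, so it cannot be used to assess generalization, which motivates the held-out test (and validation) sets.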
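Ridge regression's Week 4 closed form adds λI inside the inverse: w = (H^T H + λI)^{-1} H^T y. A sketch on synthetic data showing coefficient shrinkage as λ grows (for brevity this version penalizes every coefficient, including any intercept):

```python
import numpy as np

def fit_ridge(H, y, l2_penalty):
    """Closed-form ridge: solve (H^T H + lambda * I) w = H^T y."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + l2_penalty * np.eye(d), H.T @ y)

# Synthetic data with known coefficients plus noise
rng = np.random.default_rng(1)
H = rng.normal(size=(50, 3))
y = H @ np.array([1.0, -2.0, 0.5]) + rng.normal(0.0, 0.1, 50)

w_small = fit_ridge(H, y, l2_penalty=1e-6)  # near the least-squares fit
w_large = fit_ridge(H, y, l2_penalty=1e6)   # coefficients shrink toward zero
```

Sweeping λ between these extremes and plotting each coefficient traces out the coefficient path the objectives mention; K-fold cross validation then picks the λ with the lowest average validation error.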
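The Week 5 coordinate descent update soft-thresholds ρ_j, the correlation between feature j and the residual computed without feature j. A sketch in the spirit of the course's normalized-features formulation (each feature column assumed to have unit norm; the orthonormal test matrix is an illustrative assumption):

```python
import numpy as np

def soft_threshold(rho, l1_penalty):
    """Soft-thresholding operator behind the lasso coordinate update."""
    if rho < -l1_penalty / 2.0:
        return rho + l1_penalty / 2.0
    if rho > l1_penalty / 2.0:
        return rho - l1_penalty / 2.0
    return 0.0

def lasso_coordinate_descent(H, y, l1_penalty, n_iters=100):
    """Cyclic coordinate descent on RSS + l1_penalty * ||w||_1,
    assuming each column of H has unit norm."""
    w = np.zeros(H.shape[1])
    for _ in range(n_iters):
        for j in range(H.shape[1]):
            # rho_j: correlation of feature j with the residual excluding j
            residual_wo_j = y - H @ w + H[:, j] * w[j]
            rho = H[:, j] @ residual_wo_j
            w[j] = soft_threshold(rho, l1_penalty)
    return w

# Orthonormal feature columns and a sparse true model
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.normal(size=(100, 5)))
y = 3.0 * Q[:, 0] - 2.0 * Q[:, 1]
w = lasso_coordinate_descent(Q, y, l1_penalty=0.5)
# Active coefficients are shrunk by l1_penalty/2, the rest are exactly zero:
# w ~= [2.75, -1.75, 0, 0, 0]
```

The exact zeros in `w` are the L1 sparsity the objectives describe; ridge's L2 penalty would instead shrink every coefficient without zeroing any, which is why the lasso doubles as feature selection.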
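Week 6's k-NN regression predicts with the average target of the k closest training points; a minimal sketch using a Euclidean distance metric on made-up 1D data:

```python
import numpy as np

def knn_regression_predict(X_train, y_train, x_query, k):
    """Predict with the average target of the k nearest training
    points under Euclidean distance."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(y_train[nearest]))

# Hypothetical 1D training data
X_train = np.array([[0.0], [1.0], [2.0], [10.0]])
y_train = np.array([0.0, 1.0, 2.0, 10.0])
pred = knn_regression_predict(X_train, y_train, np.array([1.2]), k=2)
# Nearest two points are x=1 and x=2, so pred = (1 + 2) / 2 = 1.5
```

Weighted k-NN and kernel regression replace the uniform average with weights that decay with distance (e.g. a kernel of distance over bandwidth λ), which is what makes the bandwidth-versus-k tuning question in the objectives the same kind of cross-validation problem.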
