License: MIT License


Introduction to Uncertainty Quantification

This version of the course is being taught at Purdue University during Spring 2020. The code for the course is ME 59700 and MA 59800. The instructor is Prof. Ilias Bilionis. The class meets every Tuesday and Thursday 12:00pm-1:15pm at WALC 2127.

The goal of this course is to introduce the fundamentals of uncertainty quantification to advanced undergraduates or graduate engineering and science students with research interests in the field of predictive modeling. Upon completion of this course the students should be able to:

  • Represent mathematically the uncertainty in the parameters of physical models.
  • Propagate parametric uncertainty through physical models to quantify the induced uncertainty on quantities of interest.
  • Calibrate the uncertain parameters of physical models using experimental data.
  • Combine multiple sources of information to enhance the predictive capabilities of models.
  • Pose and solve design optimization problems under uncertainty involving expensive computer simulations.

Student Evaluation

  • 10% Participation
  • 60% Homework
  • 30% Final Project

Lectures

  • Lecture 1 - Introduction, 01/14/2020.

  • Lecture 2 - Introduction to Predictive Modeling, 01/16/2020.

    • Topics: Predictive modeling, structural causal models and their graphical representation, aleatory vs epistemic uncertainties, the uncertainty propagation problem, the model calibration problem.
    • Lecture notebook
  • Lecture 3 - Introduction to Probability Theory (Part I), 01/21/2020.

    • Topics: Interpretation of probability as a representation of our state of knowledge, basic rules of probability, practice examples.
    • Lecture notebook
  • Lecture 4 - Introduction to Probability Theory (Part II), 01/23/2020.

    • Topics: Discrete random variables, probability mass function, cumulative distribution function, expectation, variance, covariance, joint probability mass function, marginals, independence, conditional probability, the Bernoulli distribution, the Binomial distribution, the categorical distribution, the Poisson distribution.
    • Lecture notebook
  • Lecture 5 - Introduction to Probability Theory (Part III), 01/28/2020.

    • Topics: Continuous random variables, the uniform distribution, the Gaussian distribution, analytical Bayesian inference examples.
    • Lecture notebook
  • Lecture 6 - Introduction to Probability Theory (Part IV), 01/30/2020.

    • Topics: Bayesian parameter estimation, credible intervals, Bayesian decision making, analytical Bayesian inference examples.
    • Lecture notebook
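The analytical inference examples of this lecture can be reproduced with the conjugate Beta-Bernoulli model. A minimal sketch using SciPy (not the lecture notebook's code; the coin-flip data are made up):

```python
import numpy as np
from scipy import stats

# Conjugate Beta-Bernoulli model: Beta(a, b) prior on the success probability.
a, b = 1.0, 1.0                             # uniform prior
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])   # hypothetical coin-flip data
k, N = data.sum(), data.size

# By conjugacy, the posterior is Beta(a + k, b + N - k).
posterior = stats.beta(a + k, b + N - k)

# Posterior mean and a central 95% credible interval.
print(posterior.mean())               # 0.7 for this data set
print(posterior.ppf([0.025, 0.975]))
```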
  • Lecture 7 - Introduction to Probability Theory (Part V), 02/04/2020.

    • Topics: Pseudo-random number generators, sampling the uniform distribution, the empirical cumulative distribution function, the Kolmogorov-Smirnov test, sampling the Bernoulli distribution, sampling any discrete distribution, limiting behavior of the binomial distribution, the central limit theorem and the ubiquitousness of the Gaussian distribution, sampling continuous distributions using inverse sampling and rejection sampling.
    • Lecture notebook
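Inverse sampling and the Kolmogorov-Smirnov test fit in a few lines; this is an illustration with an exponential target, not the lecture notebook's code:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Inverse-transform sampling: if U ~ Uniform(0, 1) then F^{-1}(U) ~ F.
# For an Exponential(rate) target, F^{-1}(u) = -log(1 - u) / rate.
rate = 2.0
u = rng.uniform(size=100_000)
samples = -np.log(1.0 - u) / rate

# The Kolmogorov-Smirnov test compares the empirical CDF of the samples
# to the target CDF; a small statistic means the two agree.
ks_stat, p_value = stats.kstest(samples, stats.expon(scale=1.0 / rate).cdf)
print(samples.mean(), ks_stat)
```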
  • Lecture 8 - Uncertainty Propagation: Introduction to Monte Carlo Sampling, 02/06/2020.

    • Topics: Curse of dimensionality, estimate multi-dimensional integrals using Monte Carlo, quantification of epistemic uncertainty in Monte Carlo estimates, example of uncertainty propagation through partial differential equations.
    • Lecture notebook
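The Monte Carlo estimate of a multi-dimensional integral, together with the standard error that quantifies the epistemic uncertainty of the estimate, can be sketched as follows (an illustration, not the lecture notebook's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo estimate of I = integral of exp(-||x||^2) over [0, 1]^d:
# draw uniform samples, average the integrand, and report the standard
# error of the mean as the epistemic uncertainty of the estimate.
d, n = 5, 100_000
x = rng.uniform(size=(n, d))
g = np.exp(-np.sum(x**2, axis=1))
estimate = g.mean()
std_error = g.std(ddof=1) / np.sqrt(n)
print(estimate, std_error)
```

The O(n^{-1/2}) convergence of the standard error does not depend on the dimension d, which is exactly what makes Monte Carlo attractive against the curse of dimensionality.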
  • Lecture 9 - Uncertainty Propagation: Advanced Monte Carlo Sampling, 02/11/2020.

    • Topics: Importance sampling, Latin hypercube designs, example of uncertainty propagation through partial differential equations.
    • Lecture notebook
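Importance sampling is easiest to see on a rare-event probability; a minimal sketch (an illustration, not the lecture notebook's code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Importance sampling for the rare event P(X > 4) with X ~ N(0, 1).
# Plain Monte Carlo almost never sees the event; sampling from a proposal
# N(4, 1) shifted into the tail and reweighting by p(y)/q(y) fixes that.
n = 100_000
y = rng.normal(loc=4.0, scale=1.0, size=n)             # proposal samples
w = stats.norm(0, 1).pdf(y) / stats.norm(4, 1).pdf(y)  # importance weights
estimate = np.mean((y > 4.0) * w)
print(estimate)   # the exact value is 1 - Phi(4), about 3.17e-5
```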
  • Lecture 10 - Uncertainty Propagation: Perturbation Methods, 02/13/2020.

    • Topics: Taylor series expansions; The Laplace Approximation; Low-order perturbation methods for dynamical systems; Method of adjoints.
    • Lecture notebook
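First-order (Taylor) propagation can be checked against Monte Carlo in a few lines; a sketch for a scalar function, not the lecture notebook's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# First-order Taylor propagation through f(x) = sin(x): for X ~ N(mu, sigma^2)
# with small sigma, E[f(X)] is approximately f(mu) and
# Var[f(X)] is approximately (f'(mu) * sigma)^2.
mu, sigma = 1.0, 0.1
mean_taylor = np.sin(mu)
var_taylor = (np.cos(mu) * sigma) ** 2   # f'(x) = cos(x)

# Monte Carlo check of the small-perturbation approximation.
x = rng.normal(mu, sigma, size=200_000)
print(mean_taylor, np.sin(x).mean())
print(var_taylor, np.sin(x).var())
```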
  • Lecture 11 - Basics of Curve Fitting: The Generalized Linear Model, 02/18/2020.

    • Topics: Supervised learning, regression, generalized linear model, least squares, maximum likelihood.
    • Lecture notebook
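Least-squares fitting of a generalized linear model takes only a few lines with NumPy; a sketch on synthetic data (an illustration, not the lecture notebook's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Generalized linear model y = Phi(x) @ w + noise with polynomial basis
# functions, fitted by least squares on noisy synthetic data.
w_true = np.array([1.0, -2.0, 0.5])
x = np.linspace(-1, 1, 50)
Phi = np.vander(x, N=3, increasing=True)   # basis functions [1, x, x^2]
y = Phi @ w_true + 0.05 * rng.normal(size=x.size)

# The least-squares solution is also the maximum-likelihood estimate
# under independent Gaussian noise.
w_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(w_ls)   # close to w_true
```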
  • Lecture 12 - Basics of Curve Fitting: Bayesian Linear Regression, 02/20/2020.

    • Topics: Maximum a posteriori estimates, Bayesian linear regression, evidence approximation, automatic relevance determination.
    • Lecture notebook
  • Lecture 13 - Advanced Curve Fitting: Gaussian Processes to Encode Prior Knowledge about Functions, 02/25/2020.

    • Topics: Stochastic processes, random fields, Gaussian process, mean functions, covariance functions, sampling from a Gaussian process, encoding prior knowledge about functions.
    • Lecture notebook
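Sampling from a Gaussian process prior reduces to sampling a multivariate normal once the covariance function is evaluated on a grid; a sketch with a squared-exponential covariance (an illustration, not the lecture notebook's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean Gaussian process with squared-exponential covariance
# k(x, x') = s^2 * exp(-(x - x')^2 / (2 * ell^2)).
s, ell = 1.0, 0.2
x = np.linspace(0, 1, 100)
K = s**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)

# Add a small jitter to the diagonal for numerical stability, then draw
# three sample paths from N(0, K).
K += 1e-8 * np.eye(x.size)
samples = rng.multivariate_normal(np.zeros(x.size), K, size=3)
print(samples.shape)   # (3, 100): three sample paths on the grid
```

The lengthscale ell controls how quickly the sample paths wiggle; shrinking it encodes a prior belief in rougher functions.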
  • Lecture 14 - Advanced Curve Fitting: Gaussian Process Regression, 02/27/2020.

    • Topics: Conditioning Gaussian random fields on exact and noisy observations, diagnostics for curve fitting, estimating the hyperparameters of covariance functions.
    • Lecture notebook
  • Lecture 15 - Advanced Curve Fitting: Multivariate Gaussian Process Regression and Automatic Relevance Determination, 03/03/2020.

    • Topics: Multivariate Gaussian process regression, automatic relevance determination, the curse of dimensionality, active subspaces, high-dimensional model representation.
    • Lecture notebook
  • Lecture 16 - Application of Gaussian Process Regression: Optimizing expensive black-box functions, 03/05/2020.

    • Topics: Bayesian global optimization without noise, maximum upper interval, probability of improvement, expected improvement, quantifying epistemic uncertainty in the location of the maximum, Bayesian global optimization with noise.
    • Lecture notebook
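The expected-improvement acquisition function mentioned above has a closed form under a Gaussian predictive distribution; a minimal sketch (not the lecture notebook's code):

```python
import numpy as np
from scipy import stats

# Expected improvement for maximization: with GP posterior mean mu and
# standard deviation sigma at a candidate point, and incumbent f_best,
#   EI = (mu - f_best) * Phi(z) + sigma * phi(z),  z = (mu - f_best) / sigma.
def expected_improvement(mu, sigma, f_best):
    z = (mu - f_best) / sigma
    return (mu - f_best) * stats.norm.cdf(z) + sigma * stats.norm.pdf(z)

# EI is non-negative, rewards both high mean and high uncertainty, and
# vanishes as sigma -> 0 at points that cannot beat the incumbent.
print(expected_improvement(0.5, 1.0, 1.0))    # about 0.198
print(expected_improvement(0.5, 1e-9, 1.0))   # essentially 0
```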
  • Lecture 17 - Calibration of physical models, 03/10/2020.

  • Lecture 18 - State-space models: Kalman filters, 03/12/2020.

  • No lecture on Tuesday 03/17/2020 (spring break).

  • No lecture on Thursday 03/19/2020 (spring break).

  • Lecture 19 - Sampling from Posteriors: The Metropolis Algorithm, 03/24/2020.

  • Lecture 20 - Sampling from Posteriors: The Metropolis-Hastings algorithm, 03/26/2020.

  • Lecture 21 - Sampling from Posteriors: Gibbs Sampling, 03/31/2020.

  • Lecture 22 - Sampling from Posteriors: Sequential Monte Carlo, 04/02/2020.

  • Lecture 23 - Bayesian Model Selection, 04/07/2020.

  • Lecture 24 - Estimating Posteriors: Variational Inference, 04/09/2020.

  • Lecture 25 - Estimating Posteriors: Automatic Differentiation Variational Inference, 04/14/2020.

  • Lecture 26 - Bayesian Model Selection with Variational Inference, 04/16/2020.

  • Lecture 27 - Deep Neural Networks (Part I), 04/21/2020.

  • Lecture 28 - Deep Neural Networks (Part II), 04/23/2020.

  • Lecture 29 - Deep Neural Networks (Part III), 04/28/2020

  • Lecture 30 - Deep Neural Networks (Part IV), 04/30/2020

Homework Notebooks

Project submission timeline

  • Title and abstract, due 02/15/2020.

  • Final report, due TBD.

Running the notebooks on Google Colab

Make sure you have a Google account before you start. There are several ways to run the course notebooks on Colab; this is the simplest one:

Google Colab using directly this GitHub site

  • Go to the Google Colab website and login with your Google account (if you are not already logged in).

  • Then hit File->Open Notebook.

  • In the pop up window that opens, click on GitHub.

  • Write: https://github.com/PredictiveScienceLab/uq-course.git and hit enter.

  • Now you can select the notebook you would like to open. For example, select "lecture_01.ipynb".

  • That's it.

Google Colab using notebooks on your computer

  • First, download this repository to your computer as a ZIP file from https://github.com/PredictiveScienceLab/uq-course. Unzip the file and make sure you know where it is.

  • Go to the Google Colab website and login with your Google account (if you are not already logged in).

  • Google Colab can see your Google Drive, so you can open any notebook stored there. One way is to drop the course directory into your Google Drive; you can then find the notebooks via File->Open Notebook under the Google Drive tab.

  • The other way is to individually upload notebooks. On the Google Colab page hit File->Upload Notebook and drop the notebook you would like to open.

Installing software on Google Colab

When running on Google Colab, you will have to install some software manually every time you run a notebook. For example, to install the Python module GPy, you need to add a code block:

!pip install GPy

Running the notebooks on your personal computer

Find and download the right version of Anaconda for Python 3.7 from Continuum Analytics. This package contains most of the software we are going to need. Note: you do need Python 3, not Python 2. The notebooks will not work with Python 2.

OS Specific Instructions

Microsoft Windows

  • We need C, C++, Fortran compilers, as well as the Python sources. Start the command line by opening "Anaconda Prompt" from the start menu. In the command line type:
conda config --append channels https://repo.continuum.io/pkgs/free
conda install mingw libpython
  • Finally, you need git. As you install it, make sure to indicate that you want to use "Git from the command line and also from 3rd party software".

Apple OS X

  • Download and install the latest version of Xcode.

Linux

If you are using Linux, I am sure that you can figure it out on your own.

Installation of Required Python Packages

Independently of the operating system, use the command line to install the following Python packages:

  • seaborn for statistical data visualization:
conda install seaborn
  • PyMC3 for MCMC sampling:
conda install pymc3
  • GPy for Gaussian process regression:
pip install GPy
  • pyDOE for generating experimental designs:
pip install pydoe
  • fipy for solving partial differential equations using the finite volume method:
pip install fipy

*** Windows Users ***

You may receive the error

ModuleNotFoundError: No module named 'future'

If so, please install future and then install fipy:

pip install future
pip install fipy
  • scikit-learn for some standard machine learning algorithms implemented in Python:
conda install scikit-learn
  • graphviz for visualizing probabilistic graphical models:
pip install graphviz

Running the notebooks

  • Open the command line.
  • cd to your favorite folder.
  • Then, type:
git clone https://github.com/PredictiveScienceLab/uq-course.git
  • This will download the contents of this repository in a folder called uq-course.
  • Enter the uq-course folder:
cd uq-course
  • Start the jupyter notebook by typing the command:
jupyter notebook
  • Use the browser to navigate the course, experiment with code etc.
  • If the course content has been updated, type the following command (while being inside uq-course) to get the latest version:
git pull origin master

Keep in mind that if you have made local changes to the repository, you may have to commit them before moving on.

Contributors

ebilionis, lundalana, murakrishn, rohitkt10, sharmila-k
