
MATH50003NumericalAnalysis (2021–2022)

Notes and course material for MATH50003 Numerical Analysis (2021–2022)

Lecturer: Dr Sheehan Olver

Problem Classes: 2–4pm Thursdays, Huxley 340–342

Overview lecture: 10–11am Fridays, Clore

Q&A: 9:50–10:30 Mondays, 10:40–11:20 Tuesdays on Teams

Course notes

Background material

  1. Introduction to Julia: we introduce the basic features of the Julia language.
  2. Asymptotics and Computational Cost: we review Big-O, little-o, and "asymptotic to" (∼) notation, and their use in describing computational cost.

Part I: Computing with numbers

  1. Numbers: we discuss how computers represent integers and real numbers, as well as their arithmetic operations.
  2. Differentiation: we discuss ways of approximating derivatives, including automatic differentiation, which is essential to machine learning (a brief sketch follows this list).
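For a flavour of the differentiation material, here is a minimal sketch, not taken from the course notes, comparing a forward finite difference with a hand-rolled dual number for f(x) = exp(x) at x = 1; the Dual type and its operator overloads below are illustrative assumptions, not the course's implementation:

    # Approximate f'(1) for f(x) = exp(x); the exact answer is exp(1).
    f = x -> exp(x)
    x = 1.0

    # 1. Forward finite difference: (f(x + h) - f(x)) / h, accurate only to roughly sqrt(eps)
    h = 1e-8
    fd = (f(x + h) - f(x)) / h

    # 2. A hand-rolled dual number a + b*epsilon with epsilon^2 = 0: evaluating at
    #    x + epsilon carries the derivative along in the second component (the idea
    #    behind forward-mode automatic differentiation).
    struct Dual
        a::Float64   # value
        b::Float64   # derivative
    end
    Base.:+(u::Dual, v::Dual) = Dual(u.a + v.a, u.b + v.b)
    Base.:*(u::Dual, v::Dual) = Dual(u.a * v.a, u.a * v.b + u.b * v.a)
    Base.exp(u::Dual) = Dual(exp(u.a), exp(u.a) * u.b)
    ad = exp(Dual(x, 1.0)).b

    println((fd, ad, exp(1.0)))  # the dual-number result matches exp(1) to machine precision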

Part II: Computing with matrices

  1. Structured Matrices: we discuss types of structured matrices (permutation, orthogonal, triangular, and banded matrices).
  2. Decompositions: we discuss algorithms for computing matrix decompositions (QR and PLU decompositions) and their use in solving linear systems (a brief sketch follows this list).
  3. Singular values and condition numbers: we discuss vector and matrix norms, condition numbers of matrices, and the singular value decomposition.
  4. Differential Equations: we discuss the numerical solution of linear differential equations, including both time-dependent ordinary differential equations and boundary value problems, by reduction to linear systems.
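As a taste of the decompositions item above, here is a minimal sketch of solving a linear system with the QR and PLU decompositions from Julia's standard library; the small tridiagonal test matrix is an illustrative assumption, not course code:

    using LinearAlgebra

    A = [4.0 1.0 0.0;
         1.0 4.0 1.0;
         0.0 1.0 4.0]      # a small banded (tridiagonal) example
    b = [1.0, 2.0, 3.0]

    Q, R = qr(A)           # A = QR with Q orthogonal and R upper triangular
    x_qr = R \ (Q' * b)    # solve R x = Q'b by a triangular (back-substitution) solve

    F = lu(A)              # PLU decomposition with partial pivoting
    x_lu = F \ b

    @show norm(A * x_qr - b)   # both residuals are at machine-precision level
    @show norm(A * x_lu - b)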

Part III: Computing with functions

  1. Fourier series: we discuss Fourier series and their use in numerical computations via the fast Fourier transform (a brief sketch follows this list).
  2. Orthogonal Polynomials: we discuss orthogonal polynomials—polynomials orthogonal with respect to a prescribed weight.
  3. Interpolation and Gaussian quadrature: we discuss polynomial interpolation, Gaussian quadrature, and expansions in orthogonal polynomials.
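To illustrate the Fourier series item above, here is a minimal sketch of approximating Fourier coefficients from equally spaced samples; it assumes the FFTW.jl package is installed, and the test function is an arbitrary choice rather than one from the notes:

    using FFTW

    f = θ -> exp(cos(θ))
    n = 16
    θ = 2π * (0:n-1) / n       # equally spaced points on [0, 2π)
    c = fft(f.(θ)) / n         # approximate Fourier coefficients

    # c[1] approximates the 0th coefficient, 1/(2π) times the integral of f over
    # [0, 2π]; for smooth periodic f the error decays extremely fast in n.
    @show c[1]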

Assessment

  1. Practice computer-based exam (Solutions)
  2. Computer-based exam (Solutions)
  3. Practice final exam (Solutions)
  4. Final exam (pen-and-paper): 9 May 2022, 9:00–12:00

Problem sheets

  1. Week 1 (Solutions): Binary representation, integers, floating point numbers, and interval arithmetic
  2. Week 2 (Solutions): Finite-differences, dual numbers, and Newton iteration
  3. Week 3 (Solutions): Dense, triangular, banded, permutation, rotation and reflection matrices
  4. Week 4 (Solutions): least squares, QR and PLU decompositions
  5. Week 5 (Solutions): positive definite matrices, Cholesky, and SVD
  6. Week 6 (Solutions): Condition numbers, indefinite integration, and Euler methods
  7. Week 7 (Solutions): Two point boundary value problems, convergence, and Fourier series
  8. Week 8 (Solutions): Discrete Fourier Transform (DFT), Fast Fourier Transform (FFT), and orthogonal polynomials
  9. Week 9 (Solutions): Jacobi matrices, classical orthogonal polynomials, and interpolation
  10. Week 10 (Solutions): Orthogonal polynomial roots, interpolatory quadrature, and Gaussian quadrature

Additional problem sheets:

  1. Julia Problem Sheet

Reading List

  1. Nicholas J. Higham, Accuracy and Stability of Numerical Algorithms, Chapters 1–3
  2. Michael L. Overton, Numerical Computing with IEEE Floating Point Arithmetic, Chapters 2–6
  3. Lloyd N. Trefethen & David Bau III, Numerical Linear Algebra, Chapters 1–4
  4. Lloyd N. Trefethen, Approximation Theory and Approximation Practice, Chapters 1–4, 17–19
  5. The Julia Documentation
  6. The Julia–Matlab–Python Cheatsheet

Notes from lectures

  1. Integers
  2. Floating Point Numbers
  3. Floating Point Arithmetic
  4. Bounding Errors in Rounding
  5. Finite Differences
  6. Dual Numbers
  7. Structured Matrices
  8. Permutations
  9. Orthogonal Matrices
  10. Least Squares
  11. Gram Schmidt and Reduced QR
  12. Householder and QR
  13. PLU Decomposition
  14. Matrix Norms
  15. Singular Value Decomposition
  16. Cholesky
  17. Condition Numbers
  18. Indefinite Integration
  19. Euler Method
  20. Poisson Equation
  21. Convergence of Euler Methods
  22. Fourier Series
  23. Trapezium Rule and Fourier Coefficients
  24. Discrete Fourier Transform
  25. Fast Fourier Transform
  26. Jacobi Matrices
  27. Classical Orthogonal Polynomials
  28. Polynomial Interpolation
  29. Orthogonal Polynomial Roots
  30. Interpolatory Quadrature
  31. Gaussian Quadrature

What is numerical analysis?

Broadly speaking, numerical analysis is the study of approximating solutions to continuous problems in mathematics, for example, integration, differentiation, and solving differential equations. There are three key topics in numerical analysis:

  1. Design of algorithms: the construction of algorithms and their implementation in software.
  2. Convergence of algorithms: proving under which conditions algorithms converge to the true solution, and determining the rate of convergence.
  3. Stability of algorithms: the study of the robustness of algorithms to small perturbations in data, for example, those that arise from measurement error, errors arising when the data are themselves computed using algorithms, or simply errors arising from the way computers represent real numbers (a small illustration follows this list).
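As a tiny illustration of the representation errors mentioned in point 3 (not taken from the notes):

    x = 0.1 + 0.2
    println(x == 0.3)       # false: neither 0.1 nor 0.2 is stored exactly as a Float64
    println(x - 0.3)        # ≈ 5.6e-17, a rounding error that the analysis must account for
    println(eps(Float64))   # ≈ 2.2e-16, the spacing between 1.0 and the next Float64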

The modern world is built on numerical algorithms:

  1. Fast Fourier Transform (FFT): Gives a highly efficient way of computing Fourier coefficients from function samples, and is used in many places, e.g., the mp3 format for compressing audio and the JPEG image format. (Though, in a bizarre twist, the decades-old GIF format has made a remarkable comeback.)
  2. Singular Value Decomposition (SVD): Allows matrices to be approximated by ones of low rank. This is related to the PageRank algorithm underlying Google (a brief sketch follows this list).
  3. Stochastic Gradient Descent (SGD): Minima of high-dimensional functions can be computed effectively using gradients in a randomised algorithm. This is used in training machine learning models.
  4. Finite element method (FEM): Used to solve many partial differential equations, including in aerodynamics and weather prediction. Firedrake is a major project based at Imperial that utilises the finite element method.
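Here is a minimal sketch of the low-rank approximation mentioned in point 2; the random test matrix and the chosen rank are illustrative assumptions, not course code:

    using LinearAlgebra

    A = randn(100, 100)
    U, σ, V = svd(A)            # A = U * Diagonal(σ) * V'

    r = 10
    A_r = U[:, 1:r] * Diagonal(σ[1:r]) * V[:, 1:r]'   # best rank-r approximation in the 2-norm

    @show opnorm(A - A_r)       # matches σ[r + 1], the first discarded singular value
    @show σ[r + 1]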

This is not to say that numerical analysis is only important in applied mathematics. It is playing an increasingly important role in pure mathematics, with important proofs based on numerical computations:

  1. The Kepler conjecture. This 400-year-old conjecture on optimal sphere packing was finally proved in 2005 by Thomas Hales using numerical linear programming.
  2. Numerical verification of the Riemann Hypothesis. It has been proved that there are no zeros of the Riemann zeta function off the critical line provided the imaginary part is less than 30,610,046,000.
  3. Steve Smale's 14th problem on the stability of the Lorenz system was solved using interval arithmetic.

Note these proofs are rigorous: as we shall see, it is possible to obtain precise error bounds in numerical calculations, and the computations themselves can be computer-verified (à la The Lean Theorem Prover). As computer-verified proofs become increasingly important, the role of numerical analysis in pure mathematics will also increase, as it provides the theory for rigorously controlling errors in computations. Note that there are many computer-assisted proofs that do not fall under numerical analysis because they involve no errors in computation or approximation, or concern purely discrete problems; the proof of the Four Colour Theorem is one example.

The Julia Programming Language

In this course we will use the programming language Julia. This is a modern, compiled, high-level, open-source language developed at MIT. It is becoming increasingly important in high-performance computing and AI, including its use by AstraZeneca, Moderna and Pfizer in drug development and clinical trial acceleration, IBM for medical diagnosis, MIT for robot locomotion, and elsewhere.

It is ideal for a course on numerical analysis because it allows fast implementation of algorithms and also has support for fast automatic differentiation, a feature of increasing importance in machine learning. It is also low-level enough that we can really understand what the computer is doing. As a bonus, it is easy to read and fun to write.
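For example, here is a minimal sketch of the automatic differentiation mentioned above, assuming the widely used ForwardDiff.jl package (an illustration, not the course's own implementation):

    using ForwardDiff

    f(x) = exp(x) * sin(x)
    ForwardDiff.derivative(f, 1.0)   # ≈ exp(1) * (sin(1) + cos(1)), computed to machine precision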

To run Julia in a Jupyter notebook on your own machine:

  1. Download Julia v1.7.1
  2. Open the Julia app, which will launch a new window
  3. Install the needed packages by typing (] changes the prompt to the package manager):
     ] add IJulia Plots ColorBitstring SetRounding
  4. Build Jupyter via
     ] build IJulia
  5. Launch Jupyter by typing
     using IJulia
     notebook()
