pnkraemer / probfindiff
10 stars · 1 watcher · 2 forks · 334 KB

Probabilistic numerical finite differences. Compute finite difference weights and differentiation matrices on scattered data sites and with out-of-the-box uncertainty quantification.

Home Page: https://probfindiff.readthedocs.io

License: MIT License

Language: Python 100.00%
Topics: scientific-computing, finite-difference-methods, partial-differential-equations, probabilistic-numerics

probfindiff's People

Contributors

girijakar, pnkraemer, schmidtjonathan


probfindiff's Issues

Partial derivative tutorial

The "Partial derivatives" tutorial shows how to compute partial derivatives, not just at a single point but at many points at once. This is not really what it should be.

`grid` or `xs`?

The method is called from_grid, but the grid variable is usually referred to as xs. While this may be okay for tutorial notebooks, it is a bit inconsistent as part of the API.

Easier access to custom kernels

Currently, it is possible to pass custom kernels to the functions, but if a user has efficient, hand-written derivatives of their kernel, those are hard to make good use of.
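
For illustration, a hypothetical sketch of what such an interface could look like; the tuple-valued kernel argument is a placeholder, not an existing signature. A user could then supply hand-written derivatives of, say, the exponentiated quadratic kernel and bypass autodiff:

import jax.numpy as jnp

def k(x, y):
    # exponentiated quadratic kernel
    return jnp.exp(-0.5 * (x - y) ** 2)

def lk(x, y):
    # d/dx k(x, y), written out by hand
    return -(x - y) * k(x, y)

def llk(x, y):
    # d^2/(dx dy) k(x, y), written out by hand
    return (1.0 - (x - y) ** 2) * k(x, y)

# hypothetical: accept the derivative triple in place of a plain kernel
# (w, sc), idcs, info = pnfindiff.coefficients.derivative(xs=xs, k=(k, lk, llk))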

1d coefficient API

The methods in coefficients_1d receive "x", but no neighbours. Instead, they assemble the neighbours themselves via dx and the specified order.
In light of this, they should not require "x" to be passed; it is meaningless.

Public API

Let's work on the clarity and usability of the API. For example:

# ---------------------------------------------------
# Top-level interface, which mirrors JAX's autodiff and, e.g., np.gradient() and np.diff().
# Essentially, functions in here only chain coefficients.apply() with coefficients.jac().
# ---------------------------------------------------
pnfindiff.gradient(fx, axis, k=exp_quad(), acc=2)
pnfindiff.jac(fx, axis, acc)
pnfindiff.laplace(fx, axis, acc)
# (...)

# ---------------------------------------------------
# Implementation of common derivatives.
# Every grad-and-vmap-ready kernel 'k' can be used, for example those in the utils module.
# Points to `collocation`
# ---------------------------------------------------
dfx, base_unc = pnfindiff.coefficients.apply(f, weights=(w, sc), indices=idcs)
dfx, base_unc = pnfindiff.coefficients.apply_along_axis(f, weights=(w, sc), axis=1)
(w, sc), idcs, info = pnfindiff.coefficients.derivative(*, xs, k, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.derivative_higher(*, order, xs, k=None, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.gradient(xs, k, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.laplace(xs, k, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.divergence(xs, k, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.jac(xs, k, acc=2)
(w, sc), idcs, info = pnfindiff.coefficients.approx(L, xs, k, *, order=2, acc=2)  # the magic stuff. maybe move to collocation?


# ---------------------------------------------------
# Calibration API
# ---------------------------------------------------
output_scale = pnfindiff.stats.mle_output_scale(f, weights=(w, sc), info=info)
logpdf = pnfindiff.stats.logpdf(f, weights=(w, sc), info=info)

# ---------------------------------------------------
# API for 1d coefficients.
# This is like in FinDiff, but for PNFinDiff
# such a module probably has only didactic purposes.
# Stack copies of coefficients for apply()-conformity.
# Points to `collocation`
# ---------------------------------------------------
(w, sc), idx, info = pnfindiff.coefficients_1d.forward(dx, deriv, acc)
(w, sc), idx, info = pnfindiff.coefficients_1d.center(dx, deriv, acc)
(w, sc), idx, info = pnfindiff.coefficients_1d.backward(dx, deriv, acc)
(w, sc), idx, info = pnfindiff.coefficients_1d.from_offset()

# ---------------------------------------------------
# Collocation implementations. This is where the magic happens, but only a few will look at it.
# ---------------------------------------------------
pnfindiff.collocation.prepare_gram(x, xs, ks)
pnfindiff.collocation.unsymmetric(K, LK, LLK)

# ---------------------------------------------------
# Simplify our life via AD utility functions
# ---------------------------------------------------
pnfindiff.utils.autodiff.deriv_scalar(fun, **kwargs)
pnfindiff.utils.autodiff.div(fun, **kwargs)
pnfindiff.utils.autodiff.laplace(fun, **kwargs)
pnfindiff.utils.autodiff.compose(L1, L2)

# ---------------------------------------------------
# ... and kernel utility functions
# ---------------------------------------------------
pnfindiff.utils.kernel.batch_gram(k)
k, lk, llk = pnfindiff.utils.kernel.differentiate(k)
pnfindiff.utils.kernel.zoo.polynomial()
k, lk, llk = pnfindiff.utils.kernel.zoo.exp_quad()
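
For reference, a minimal sketch (plain JAX, not the proposed API) of what the collocation.unsymmetric step above boils down to: the weights solve K w = Lk, and the base uncertainty is LLk - (Lk)^T w.

import jax
import jax.numpy as jnp

def k(x, y):
    # exponentiated quadratic kernel
    return jnp.exp(-0.5 * (x - y) ** 2)

lk = jax.grad(k, argnums=0)    # L applied to the first argument
llk = jax.grad(lk, argnums=1)  # L applied to both arguments

def fd_weights(x0, xs):
    K = jax.vmap(lambda x: jax.vmap(lambda y: k(x, y))(xs))(xs)  # Gram matrix
    LK = jax.vmap(lambda y: lk(x0, y))(xs)                       # (L k)(x0, xs)
    LLK = llk(x0, x0)                                            # (L L k)(x0, x0)
    weights = jnp.linalg.solve(K, LK)
    uncertainty = LLK - LK @ weights
    return weights, uncertainty

xs = jnp.array([-0.1, 0.0, 0.1])
w, unc = fd_weights(0.0, xs)
dfx = w @ jnp.sin(xs)  # approximately cos(0.0) = 1.0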

Vector-valued differential operators

It would be quite incredible if one could make the finite difference coefficients handle vector-valued functions/operators well. For example, the Jacobian.

One difficulty will be that keeping track of the number of dimensions could become quite awkward...

Polynomial kernels

It would be useful to have polynomial kernels, for example to sanity-check that the familiar coefficients are recovered, but also for other purposes.
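
A sketch of that sanity check, reusing the structure of the fd_weights collocation sketch above: with a degree-2 polynomial kernel on a symmetric three-point stencil, the weights should collapse to the classical central coefficients, and the base uncertainty should vanish.

import jax
import jax.numpy as jnp

def k_poly(x, y):
    # polynomial kernel; its RKHS is the polynomials of degree <= 2
    return (1.0 + x * y) ** 2

lk = jax.grad(k_poly, argnums=0)

xs = jnp.array([-0.1, 0.0, 0.1])
K = jax.vmap(lambda x: jax.vmap(lambda y: k_poly(x, y))(xs))(xs)
LK = jax.vmap(lambda y: lk(0.0, y))(xs)
w = jnp.linalg.solve(K, LK)
# w is approximately [-5., 0., 5.]: the classical central coefficients
# [-1/(2h), 0, 1/(2h)] for h = 0.1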

Calibration

It should be made really easy to calibrate the uncertainties. At the moment, this is a bit cumbersome.
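
One common closed-form route, sketched here under the assumption of a zero-mean Gaussian-process model f ~ GP(0, sigma^2 k); the function name is illustrative, not the current API. The maximum-likelihood estimate of the output scale is a quadratic form in the observed function values:

import jax.numpy as jnp

def mle_output_scale(fx, K):
    # MLE of sigma^2 under f ~ GP(0, sigma^2 * k):
    #   sigma^2 = fx @ K^{-1} @ fx / n
    return fx @ jnp.linalg.solve(K, fx) / fx.shape[0]

# the calibrated uncertainty is then mle_output_scale(fx, K) * base_unc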

Numerical noise

It would be good to have an interface for computing finite-difference estimates based on noisy observations of a function.
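
One standard way to model this, sketched against the fd_weights collocation sketch above: treat the data as f(x_i) + eps with eps ~ N(0, sigma^2), which amounts to adding sigma^2 to the diagonal of the Gram matrix before solving.

noise_variance = 1e-4  # illustrative observation-noise variance
K_noisy = K + noise_variance * jnp.eye(K.shape[0])
weights = jnp.linalg.solve(K_noisy, LK)
uncertainty = LLK - LK @ weights  # retains a noise floor; no longer vanishes as dx -> 0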

Desired features

All of the following should be possible with pnfindiff at some point in the future, ideally documented in example notebooks.

Basic scalar derivatives:

Evaluate the numerical derivative f'(x_0) of a function f: R -> R at a point x_0 (a sketch of the classical baseline follows the list below)

  • Central, forward, backward schemes
  • Variable derivative order
  • Variable method order
  • Numerical noise
  • Custom grid
  • Batching over x_0
  • Batching over x_0 with a non-uniform grid
  • Batching over f
  • Polynomial vs exponentiated quadratic kernel
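
Most items in this list reduce to solving a small Taylor/Vandermonde system for the stencil weights. A minimal sketch of the classical (non-probabilistic) baseline, covering variable derivative order, method order, and custom grids:

import math
import jax.numpy as jnp

def classical_fd_weights(offsets, deriv):
    # Weights w on the stencil x_0 + offsets for the deriv-th derivative,
    # from the Taylor conditions sum_j w_j * offsets_j**m / m! = delta(m, deriv).
    n = offsets.shape[0]
    powers = jnp.arange(n)
    factorials = jnp.array([math.factorial(m) for m in range(n)])
    A = offsets[None, :] ** powers[:, None] / factorials[:, None]
    b = jnp.zeros(n).at[deriv].set(1.0)
    return jnp.linalg.solve(A, b)

dx = 0.1
w = classical_fd_weights(jnp.array([-dx, 0.0, dx]), deriv=1)
# w is approximately [-5., 0., 5.]: a central, second-order scheme.
# Forward/backward schemes and custom grids follow by changing the offsets.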

Basic multivariate derivatives

Evaluate the numerical derivative (D f)(x_0) of a function f: R^n -> R^m at a point x_0 (a tensor-product sketch follows the list below)

  • Partial derivatives via schemes
  • Gradients via a tensor-product approach and using all of the above
  • Jacobians and Hessians via a tensor-product approach and using all of the above
  • Gradients, via an n-dimensional prior
  • (Mixed) partial derivatives for n-dimensional priors
  • Jacobians (forward and reverse mode?), Hessians, (JVPs?) via an n-dimensional prior
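
A sketch of the tensor-product approach on a regular grid: apply the 1d stencil weights along each axis in turn. Interior points only; boundary points would need one-sided schemes.

import jax.numpy as jnp

def partial_derivative(fx, w, axis):
    # fx: function values on a regular grid; w: 1d stencil weights.
    # jnp.convolve flips its kernel, hence w[::-1] to get a correlation.
    conv = lambda row: jnp.convolve(row, w[::-1], mode="valid")
    return jnp.apply_along_axis(conv, axis, fx)

# gradient components of a 2d grid function (shapes shrink along the
# differentiated axis because of mode="valid"):
# dfdx = partial_derivative(fx, w, axis=0)
# dfdy = partial_derivative(fx, w, axis=1)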

Advanced features

  • Uncertainty calibration
  • Control over stencils -> which points lie on the boundary, etc. (https://github.com/maroba/findiff)
  • Computation of (L f)(Xn) with FD. This is not the same as batched evaluation because, for instance, boundary nodes may use forward schemes while interior nodes use central schemes. (https://numpy.org/doc/stable/reference/generated/numpy.gradient.html)
  • Support for symbolic computation
  • Support for sparse matrices
  • A bunch of kernels: Matérn, (inverse) multiquadrics, thin-plate splines
  • Boundary conditions?
  • Advanced differential operators: Divergence, Laplacian, curl
  • Polar coordinates
  • complex differentials? integer-valued derivatives? (e.g. grad(f, allow_int=True, holomorphic=True))
  • Unsymmetric and symmetric collocation (with examples and docs)

TOX + sphinx-autodoc-typehints

To make tox build the docs appropriately, I needed to repeat the sphinx-autodoc-typehints requirement in tox.ini. This is not optimal and should be resolved in the future.
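
For concreteness, a sketch of the current workaround (the environment layout is illustrative, not copied from the repo): the docs environment lists the dependency again in its own deps, even though it already appears in the docs requirements.

# tox.ini (sketch)
[testenv:docs]
deps =
    sphinx
    sphinx-autodoc-typehints
commands =
    sphinx-build docs docs/_build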

Readme typos

The README sometimes talks about pnfindiff where it should reference probfindiff. This needs to be fixed.

Workflows

Currently, the CI workflow runs everything through tox.
It would be useful to run the different actions separately, so that feedback about which part is broken becomes available earlier.
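
One way to split this up, sketched as a GitHub Actions matrix over tox environments (the environment names are illustrative), so that each part reports its failures independently:

# .github/workflows/ci.yml (sketch)
name: ci
on: [push, pull_request]
jobs:
  ci:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        env: [lint, docs, test]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
      - run: pip install tox
      - run: tox -e ${{ matrix.env }}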

Boundary conditions

It would be great to have a neat way of handling boundary conditions. I am not 100% sure what the API should be.

Coefficient variable

At the moment, both `coeffs` and `coefficients` are used throughout the repo. This seems a bit inconsistent.

Rename package?

The findiff part is nice and clear, but the pn bit will be a little hard to remember for the uninitiated.

Alternatives could be

  • probnum_findiff (Second place; might have been the winner if underscores in package names were not discouraged)
  • probfindiff (Winner, because it is quite easy to remember. The "num" is implied in "findiff" :))
  • fdx
  • findiffpy
  • numdiff
  • pfindiff
  • probnumfindiff (Third place; lost because it is too long. The "num" makes it significantly longer than established packages like setuptools or sqlalchemy)
