theogf / bayesianquadrature.jl

Is there anything we can't make Bayesian?

License: MIT License
Language: Julia 100.00%
Topics: bayesian, bayesian-inference, quadrature, bayesian-quadrature, julia, integral, gaussian

bayesianquadrature.jl's People

Contributors: github-actions[bot], johannesgiersdorf, theogf

bayesianquadrature.jl's Issues

Error small Variance

The estimated variance is sometimes too small (or even slightly negative), so the Normal distribution can't be constructed.

Error message

ERROR: LoadError: ArgumentError: Normal: the condition σ >= zero(σ) is not satisfied.
Stacktrace:
 [1] macro expansion
   @ ~/.julia/packages/Distributions/cNe2C/src/utils.jl:6 [inlined]
 [2] #Normal#112
   @ ~/.julia/packages/Distributions/cNe2C/src/univariate/continuous/normal.jl:37 [inlined]
 [3] Normal(μ::Float64, σ::Float64)
   @ Distributions ~/.julia/packages/Distributions/cNe2C/src/univariate/continuous/normal.jl:37
 [4] quadrature(bquad::BayesQuad{SqExponentialKernel, Float64, Float64}, model::BayesModel{ZeroMeanDiagNormal{Tuple{Base.OneTo{Int64}}}, typeof(log_f)}, samples::Vector{Vector{Float64}})
   @ BayesianQuadrature ~/.julia/packages/BayesianQuadrature/eNnqB/src/bayesquads/bayesquad.jl:58
 [5] (::BayesQuad{SqExponentialKernel, Float64, Float64})(rng::MersenneTwister, model::BayesModel{ZeroMeanDiagNormal{Tuple{Base.OneTo{Int64}}}, typeof(log_f)}, sampler::PriorSampling; x_init::Vector{Any}, nsamples::Int64, callback::Nothing)
   @ BayesianQuadrature ~/.julia/packages/BayesianQuadrature/eNnqB/src/interface.jl:10
 [6] top-level scope
   @ ~/Documents/uni/master/master-thesis-code-base/src/min_example_normal.jl:13
 [7] include(fname::String)
   @ Base.MainInclude ./client.jl:444
 [8] top-level scope
   @ REPL[8]:1

Example

using BayesianQuadrature
using Distributions
using KernelFunctions
using Random

rng = Random.MersenneTwister(42)

p_0 = MvNormal(ones(2)) # As for now the prior must be a MvNormal
log_f(x) = logpdf(MvNormal(0.5 * ones(2)), x) 
m = BayesModel(p_0, log_f) 
bquad = BayesQuad(SEKernel(); l=10.0, σ=1.0) 
sampler = PriorSampling() 
p_I, _ = bquad(rng, m, sampler; nsamples=200) # Returns a Normal distribution
@show p_I 
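A hypothetical workaround sketch (illustrative names, not the package's actual fix): since numerical cancellation in the variance computation can push the estimate slightly below zero, one option is to floor it at machine epsilon before constructing the Normal.

```julia
# Hypothetical workaround: cancellation in the variance computation can
# yield a value slightly below zero, which Normal(μ, σ) then rejects.
σ²_raw = -1.2e-16                  # e.g. the result of catastrophic cancellation
σ² = max(σ²_raw, eps(Float64))     # floor the variance at machine epsilon
σ = sqrt(σ²)                       # now a valid (tiny) standard deviation
```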

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Rename BMC

I would suggest renaming BMC.
Bayesian Monte Carlo, the simplest approach for Bayesian Quadrature. Assumes that the prior is Gaussian and that the integrand is as well.
https://github.com/theogf/BayesianQuadrature.jl/blob/d621e84566cbcafa7cbe13a6036873479b19471d/src/integrators/bmc.jl#LL1-L7

It is true that it is motivated by the example from the paper, and that should be referenced (I suggest in the readme). It is misleading to use the name BMC in the quadrature part, since BMC refers more to how to sample the states and then use BQ.

I would refer instead to the distribution (or even the kernel-distribution pair), where the distribution is the one we are importance re-weighting "against" (e.g. the Gaussian in the simple (SE kernel - Gaussian) case).

See also Section 4.2, "Tractable and Intractable Kernel Means", in:
Briol, F.-X., Oates, C. J., Girolami, M., Osborne, M. A., & Sejdinovic, D. (2019). Probabilistic integration: A role in statistical computation? Statistical Science, 34(1), 1-22.

Remark: The idea is having kernel-distribution pairs with a closed-form kernel mean.
The case (SE kernel - Gaussian) is called Bayes–Hermite quadrature:
O’Hagan, A. (1991). Bayes–Hermite quadrature. J. Statist. Plann. Inference, 29:245–260.
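To make the closed-form kernel mean concrete, here is a hedged 1D sketch for the (SE kernel - Gaussian) pair; the function names are illustrative and not part of BayesianQuadrature.jl's API.

```julia
# 1D kernel mean z(x) = ∫ k(x, y) N(y; m, s²) dy for the SE kernel
# k(x, y) = exp(-(x - y)² / (2ℓ²)); all names here are illustrative.
se_kernel(x, y, ℓ) = exp(-(x - y)^2 / (2ℓ^2))
normal_pdf(y, m, s) = exp(-(y - m)^2 / (2s^2)) / (s * sqrt(2π))

# Closed form: ℓ / √(ℓ² + s²) · exp(-(x - m)² / (2(ℓ² + s²)))
kernel_mean(x, m, s, ℓ) = ℓ / sqrt(ℓ^2 + s^2) * exp(-(x - m)^2 / (2 * (ℓ^2 + s^2)))

# Check the closed form against a simple Riemann sum
x, m, s, ℓ = 0.3, 0.0, 1.0, 0.5
h = 1e-4
numeric = sum(se_kernel(x, y, ℓ) * normal_pdf(y, m, s) for y in -10:h:10) * h
closed = kernel_mean(x, m, s, ℓ)
```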

API structure

Bayesian quadrature is quite unlike other quadrature packages, so we need to think about how to approach it.

On top of the likelihood/prior input, there are two important parameters in play:

  • The method to select new samples
  • The method to evaluate the different integrals (vanilla or with transformation)

I think for simplicity it is better to pass these two separately, as there is no real benefit to embedding the first one in the second.
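The separation above can be sketched in a few lines of plain Julia. All names here are hypothetical, not the package's actual API; the point is only that the sample-selection strategy and the integral-evaluation method stay independent arguments.

```julia
# Hypothetical sketch of the proposed API split (illustrative names only).
abstract type AbstractSampler end
abstract type AbstractIntegrator end

struct UniformSampler <: AbstractSampler end      # how new samples are chosen
struct MeanIntegrator <: AbstractIntegrator end   # how the integral is evaluated

select_samples(::UniformSampler, n) = rand(n)                # x ~ U(0, 1)
evaluate(::MeanIntegrator, f, xs) = sum(f, xs) / length(xs)  # vanilla Monte Carlo

# Neither component knows about the other; either can be swapped out.
function quadrature(f, sampler::AbstractSampler, integrator::AbstractIntegrator;
                    nsamples=1000)
    xs = select_samples(sampler, nsamples)
    return evaluate(integrator, f, xs)
end

quadrature(x -> x^2, UniformSampler(), MeanIntegrator())  # ≈ 1/3
```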

Implement the right importance sampling for non-Gaussian prior

We should have an automatic fallback for when the prior is not Gaussian:
f(x) * p(x) -> (f(x) * p(x) / q(x)) * q(x), where q(x) is a MvNormal.
We can of course default to q(x) = N(0, I), but there are easy heuristics to get a better proposal given p(x).
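A hedged pure-Julia sketch of the reweighting (the densities and names are illustrative, not the package's implementation): the integral of f against a non-Gaussian prior p is rewritten as the integral of f·p/q against a Gaussian proposal q, which the Gaussian-prior machinery can handle.

```julia
using Random

# Illustrative 1D example: p is a non-Gaussian prior, q a Gaussian proposal.
p(x) = abs(x) <= 1 ? 0.5 : 0.0        # Uniform(-1, 1) prior density
q(x) = exp(-x^2 / 2) / sqrt(2π)       # standard Normal proposal density
f(x) = x^2                            # integrand

# Reweighted integrand: integrating g against q equals integrating f against p.
g(x) = f(x) * p(x) / q(x)

rng = Random.MersenneTwister(42)
xs = randn(rng, 200_000)              # samples from q
estimate = sum(g, xs) / length(xs)    # ≈ ∫ f(x) p(x) dx = 1/3
```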

Hot start

We should allow passing a series of already-evaluated samples (x, y) before any sampling takes place.
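A minimal sketch of the idea (hypothetical helper, not the package's API): accept pre-evaluated (x, y) pairs and only call the integrand on the newly drawn points.

```julia
# Hypothetical hot-start helper: f is only evaluated on the new points,
# while previously evaluated (x_init, y_init) pairs are reused as-is.
function collect_samples(f, x_new; x_init=Float64[], y_init=Float64[])
    xs = vcat(x_init, x_new)
    ys = vcat(y_init, f.(x_new))   # f is not re-run on the initial points
    return xs, ys
end

xs, ys = collect_samples(x -> x^2, [3.0]; x_init=[1.0, 2.0], y_init=[1.0, 4.0])
```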

Callback function argument

I feel a callback function would be nice to have. I think we can simply do it by passing it to the AbstractMCMC.sample method. I will make a PR for it.
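The forwarding pattern could look roughly like this (a pure-Julia sketch with illustrative names, standing in for passing the callback through to AbstractMCMC.sample):

```julia
# Hypothetical sketch: forward a user-supplied callback into the sampling loop.
function run_sampling(nsamples; callback=nothing)
    states = Float64[]
    for i in 1:nsamples
        state = i / nsamples               # placeholder for a real sampler step
        push!(states, state)
        callback === nothing || callback(i, state)
    end
    return states
end

seen = Int[]
run_sampling(3; callback=(i, _) -> push!(seen, i))  # seen == [1, 2, 3]
```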

Storing samples

How should samples be stored? Since we need to iterate over them and do not necessarily know their type in advance, it is not possible to guarantee type-stable storage in either the sampler or the integrator.
One package to look at for inspiration is https://github.com/TuringLang/AbstractMCMC.jl
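One common Julia pattern for this (a hedged sketch with illustrative names, in the spirit of what AbstractMCMC does): draw the first sample, then let its concrete type fix the storage container for the rest.

```julia
# The first draw determines the element type; subsequent draws fill a
# concretely typed vector, keeping the loop body type stable.
function collect_iterative(draw, n)
    s1 = draw(1)
    samples = Vector{typeof(s1)}(undef, n)   # concretely typed once s1 is known
    samples[1] = s1
    for i in 2:n
        samples[i] = draw(i)
    end
    return samples
end

samples = collect_iterative(i -> i / 2, 4)   # Vector{Float64}
```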

log integrand in BayesModel

Using the logarithm of the integrand in the BayesModel provides a numerically more stable way of computing quantities that involve the integrand (e.g. logjoint).
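A small sketch of why the log form is more stable (standard-normal density, illustrative names): for moderately extreme inputs the pdf underflows to zero, while the logpdf stays finite.

```julia
# logpdf of a standard Normal, computed directly in log space
log_normal_pdf(x) = -x^2 / 2 - log(sqrt(2π))

x = 40.0
naive = log(exp(log_normal_pdf(x)))   # exp underflows to 0.0, so this is -Inf
stable = log_normal_pdf(x)            # finite: ≈ -800.9
```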
