This project forked from ssmit1986/bayesianinference

License: MIT License

BayesianInference

Wolfram Language application for Bayesian inference and Gaussian process regression.

This package implements the nested sampling algorithm essentially as described by John Skilling in his 2006 paper "Nested Sampling for General Bayesian Computation" (Bayesian Analysis, 1(4), pp. 833-860; available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.117.5542&rep=rep1&type=pdf).
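To illustrate the idea behind the algorithm (this is a toy sketch, not the package's own implementation, and `nestedSamplingSketch` is a hypothetical name): nested sampling estimates the evidence by repeatedly discarding the lowest-likelihood "live point" and shrinking the prior volume by an expected factor per iteration.

```wolfram
(* Toy sketch of Skilling's nested sampling; NOT the package's implementation.
   Estimates the evidence Z by tracking the prior volume X enclosing each
   likelihood contour. All names here are hypothetical. *)
nestedSamplingSketch[logLike_, priorSample_, nLive_ : 50, nIter_ : 300] :=
 Module[{live, logL, iMin, x = 1., dx, z = 0., new, newLogL},
  live = Table[priorSample[], nLive];      (* live points drawn from the prior *)
  logL = logLike /@ live;
  Do[
   iMin = First@Ordering[logL, 1];         (* lowest-likelihood live point *)
   dx = x (1. - Exp[-1./nLive]);           (* expected shrinkage of prior volume *)
   z += Exp[logL[[iMin]]] dx;              (* accumulate evidence *)
   x *= Exp[-1./nLive];
   (* Replace the discarded point with a prior draw of higher likelihood.
      Plain rejection sampling like this is only viable for toy problems. *)
   While[new = priorSample[]; newLogL = logLike[new]; newLogL <= logL[[iMin]]];
   live[[iMin]] = new; logL[[iMin]] = newLogL,
   nIter];
  z + x Mean[Exp[logL]]                    (* contribution of remaining live points *)
 ]

(* Standard-normal likelihood with a Uniform(-5, 5) prior: the estimate
   should come out near 1/10, the mean of the likelihood under the prior. *)
nestedSamplingSketch[-#^2/2. - Log[Sqrt[2. Pi]] &, RandomReal[{-5, 5}] &]
```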

It also provides some functionality for Markov Chain Monte Carlo sampling (MCMC) based on built-in (but undocumented) functions in the Statistics`MCMC` context.

The recently added function BayesianLinearRegression provides a Bayesian alternative to Mathematica's LinearModelFit.
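For instance, fitting a noisy straight line might look roughly as follows (a hypothetical usage sketch; the exact call signature and output keys are documented in example_code.nb):

```wolfram
(* Hypothetical usage sketch; consult example_code.nb for the actual API. *)
data = Table[
  {x, 2 x + 1 + RandomVariate[NormalDistribution[0, 0.5]]},
  {x, 0, 10, 0.5}];
fit = BayesianLinearRegression[data, {1, x}, x];  (* basis {1, x} in variable x *)
fit["LogEvidence"]  (* model evidence, usable for model comparison *)
```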

Finally, there is also some code that helps to construct neural networks for quasi-Bayesian regression as explained on the following pages:

Installation:

  • Open your user base directory (e.g., evaluate SystemOpen[$UserBaseDirectory] in a notebook)
  • Go into the Applications directory
  • Drop the whole BayesianInference directory in the Applications directory (i.e., the folder structure should be $UserBaseDirectory/Applications/BayesianInference)
  • Restart Mathematica
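The copy step can also be done programmatically (the source path below is a placeholder for wherever you unpacked the package):

```wolfram
(* "path/to/BayesianInference" is a placeholder for your local copy. *)
CopyDirectory[
 "path/to/BayesianInference",
 FileNameJoin[{$UserBaseDirectory, "Applications", "BayesianInference"}]
];
(* Restart Mathematica afterwards, then load with << BayesianInference` *)
```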

You can now load the package by evaluating:

<< BayesianInference`

Alternatively, you can just set the working directory of your notebook to the same directory as example_code.nb and load the package using the line above (see the initialisation cell in the notebook for an example).
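An initialisation cell of this kind would look something like the following sketch:

```wolfram
(* Point the kernel at the notebook's own directory, then load the package. *)
SetDirectory[NotebookDirectory[]];
<< BayesianInference`
```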

Using the package:

See the example_code.nb notebook for a general explanation of the code and several examples. Note that the package has undergone significant changes since the previous release, and most functionality is now invoked differently than before.

Update history

Version 2.0

  • 23 November 2018

    • The package has been overhauled significantly and now relies largely on the MCMC sampler hidden in the Statistics`MCMC` context. This means that this version may not be compatible with older versions of Mathematica; please check out the release1 tag of this package if you prefer (or need) the old code. I will probably continue to find small bugs to fix and improvements to make, so more updates are likely to come.
  • 24 November 2018

    • Add an example section that shows how to use nested sampling for time series regression with a GeometricBrownianMotionProcess.
    • Improve the numerical stability of the evidence computation.
  • 25 November 2018

    • Expand the section on time series process regression. It now explains how to compile the loglikelihood function of a geometric Brownian motion process.
  • 26 November 2018

    • Make sure that parallel runs in parallelNestedSampling always generate their own starting points.
  • 28 November 2018

    • Add a new section to the example notebook that explains the goals of the package in broader terms. Also features an animated visualisation of the nested sampling algorithm. Check it out!
    • Add example of classification using logistic regression.
  • 30 November 2018

    • Add an example of Bayesian logistic classification for the Fisher Iris data set.
  • 18 December 2018

    • Add a new use case for predictiveDistribution where you can specify custom keys in the 3rd argument to populate the output association. This is useful when the inputs contain duplicates (as can happen in the time series regression example in the last section) or when you need to undo a transformation applied to the independent coordinates.
    • Some small updates to the example notebook.
  • 12 July 2019

    • Add code for Bayesian linear regression. See the appropriate section in the example notebook.
    • Add code for (pseudo) Bayesian regression methods using neural networks. See the appropriate section in the example notebook.

Contributors

kubapod, sjoerdsmitwolfram, ssmit1986
