patricktourniaire / negative-gaussian-mixture-model

Undergraduate honours project exploring learning Gaussian Mixture Models with negative components.

Topics: mixture-models, probabilistic-circuits, gmm, negative-gmm, squared-circuit


Negative Gaussian Mixture Components

This repository contains my undergraduate thesis project, which explores a generalisable framework that allows negative components in Gaussian mixtures. It served as the preliminary work for the paper "Negative Mixture Models via Squaring: Representation and Learning" by Lorenzo Loconte et al., carried out as part of the research group of my supervisor, Antonio Vergari.

Brief Motivation

The framework and the final model are described in detail in my thesis (found under docs/thesis.pdf). To appreciate what this project achieves, this section highlights the motivation behind it.

The basic idea is simple. Imagine you have a complex distribution of points to model with a GMM (complex in the sense that many components are needed to model it well), say a donut shape. With a traditional GMM you have to place many components along the ridge of the donut to obtain a reasonably uniform density.
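To make this concrete, here is a minimal pure-NumPy sketch (illustrative only, not code from this repository): with only three components spaced around the ring, the mixture density collapses in the gaps between their means, so the ridge of the donut is far from uniform.

```python
import numpy as np

def gauss2d(x, mu, sigma):
    # Isotropic 2-D Gaussian density.
    d = x - mu
    return np.exp(-0.5 * np.sum(d * d, axis=-1) / sigma**2) / (2 * np.pi * sigma**2)

# Place 3 equal-weight components evenly on a ring of radius 3.
r, sigma = 3.0, 0.6
means = np.array([[r * np.cos(t), r * np.sin(t)] for t in 2 * np.pi * np.arange(3) / 3])

def gmm_density(x):
    return sum(gauss2d(x, mu, sigma) for mu in means) / 3.0

# Density on the ring at a component mean vs. midway between two means.
at_mean = gmm_density(means[0])
midway = np.array([r * np.cos(np.pi / 3), r * np.sin(np.pi / 3)])
between = gmm_density(midway)
print(between / at_mean)  # far below 1: the coverage of the ridge is very uneven
```

Adding more components shrinks these gaps, which is exactly why a traditional GMM needs many components for this shape.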

However, to a human it makes more sense to place one large positive component encapsulating all the points and then subtract away the probability mass in the middle (much the same way you actually make a donut in real life). The difficulty is that the final mixture must remain a valid PDF: it has to integrate to 1 and can never take negative values. That makes subtracting probability mass in the middle tricky, and solving this problem is what this project achieves, as outlined in my thesis under docs/thesis.pdf.
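As a sanity check on the idea, the following is an illustrative 1-D construction (not the squaring framework developed in the thesis): a mixture with one negative weight is still a valid PDF as long as the positive component dominates the negative one everywhere and the weights are normalised to sum to 1.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Wide positive component minus a narrower negative component.
# Comparing the two peaks at x = 0, nonnegativity here requires a >= 4 * b,
# since the narrow Gaussian is at most 4 times taller than the wide one.
a, b = 1.0, 0.2
x = np.linspace(-10, 10, 4001)
f = (a * normal_pdf(x, 0.0, 2.0) - b * normal_pdf(x, 0.0, 0.5)) / (a - b)

print(f.min() >= 0.0)                              # True: never negative
print(abs(f.sum() * (x[1] - x[0]) - 1.0) < 1e-3)   # True: integrates to ~1
```

The result is a "cratered" density: mass has been carved out of the middle with a single negative component, instead of tiling the ridge with many positive ones. Guaranteeing nonnegativity in general (beyond hand-picked weights like these) is the hard part the thesis addresses.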

The table below demonstrates how a traditional GMM is unable to model the donut with only 3 components (left), while the Negative-GMM (NGMM) models this complex distribution effectively (right).

3-Comp. GMM | 3-Comp. GMM w/ Negative Components
(figures: a 3-component GMM fit vs. a 3-component NGMM fit)

This framework therefore introduces a great deal of flexibility for modelling complex datasets with exponentially fewer components, and hence fewer parameters to optimize. The framework was built with a high level of generalisability in mind, supporting any exponential-family parametric distribution as well as deeply nested mixtures; Lorenzo Loconte et al. completed this generalisation, with my experiments serving as the preliminary step.

Getting Started

Dependencies

The project depends on Weights and Biases (WandB) for logging. This package is included in the requirements, but you must log in with your own account.

Setup - PIP

  1. Install the required packages:

```sh
pip install -r requirements.txt
```

  2. Log in to your WandB account:

```sh
wandb login
```

  3. In experiment_builder.py, update lines 87-91 with your WandB project and entity/username:

```python
wandb.init(
    project=<project_name>,
    entity=<username/entity>,
    config={**model_config}
)
```

Setup - Conda Environment

  1. Set up the conda environment with the required packages:

```sh
conda env create -f environment.yml
```

  2. Log in to your WandB account:

```sh
wandb login
```

  3. In experiment_builder.py, update lines 82-86 with your WandB project and entity/username:

```python
wandb.init(
    project=<project_name>,
    entity=<username/entity>,
    config={**model_config}
)
```

Executing program

  • Run a local experiment using one of the dataset shapes provided under data/. See local_scripts/ for examples of running an experiment with any of these shapes:

```sh
chmod +x local_scripts/<experiment_name>.sh && ./local_scripts/<experiment_name>.sh
```

Help

If you encounter any issues, please feel free to reach out by email: [email protected]

Authors

Related Paper

This project served as a preliminary set of experiments and exploration for the paper "Negative Mixture Models via Squaring: Representation and Learning" by Lorenzo Loconte et al.

Acknowledgments

Inspiration, code snippets, etc.
