saddlecity

Nonlinear dynamical systems for game theory, neural networks, and state space models


This package uses ideas from nonlinear dynamical system theory to model and interrogate agent state trajectories within games, biological associative memory, and switching autoregressive processes.

Authors: Nick Richardson and Yoni Maltsman

Installing from git source

git clone https://github.com/njkrichardson/saddlecity.git
pip install -e saddlecity

Background

Linear dynamical systems are simple but powerful abstractions for modelling time-evolving processes of interest. Despite their broad applicability, the linear constraint on the evolution of the state vector often lacks the capacity to model processes with strongly nonlinear structure. Nonlinear dynamical systems theory provides a framework for reasoning about dynamical systems in which this linear constraint has been lifted.
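For concreteness, a discrete-time linear dynamical system evolves a state x according to x_{t+1} = A x_t. The following sketch (a hypothetical 2-D system, not part of saddlecity) simulates such a system with NumPy; the matrix A is an assumption chosen for illustration:

```python
import numpy as np

# Hypothetical 2-D linear dynamical system: x_{t+1} = A @ x_t.
# A is a rotation scaled by 0.95, so trajectories spiral toward the origin.
theta = 0.1
A = 0.95 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def simulate(A, x0, steps):
    """Roll out the linear dynamics for `steps` time steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(A @ xs[-1])
    return np.stack(xs)

trajectory = simulate(A, np.array([1.0, 0.0]), steps=100)
print(trajectory.shape)  # (101, 2)
```

Replacing the matrix-vector product with a nonlinear map f(x_t) gives the nonlinear setting this package is concerned with.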

This package showcases three applications of nonlinear dynamical system theory: simulations of agent state trajectories from game theory, computational models of biological memory systems, and switching autoregressive processes from the signal processing/stochastic processes literature.

Example: Computational models of biological (associative) memory

The Hopfield network is a fully connected, unsupervised neural network designed to act as a model of associative memory. Here we summarize the features that distinguish biological associative memory from conventional digital memory, with a passage from David MacKay's Information Theory, Inference, and Learning Algorithms textbook.

  1. Biological memory is associative. Memory recall is content-addressable. Given a person’s name, we can often recall their face; and vice versa. Memories are apparently recalled spontaneously, not just at the request of some CPU.
  2. Biological memory recall is error-tolerant and robust.
    • Errors in the cues for memory recall can be corrected. An example asks you to recall ‘An American politician who was very intelligent and whose politician father did not like broccoli’. Many people think of president Bush – even though one of the cues contains an error.
    • Hardware faults can also be tolerated. Our brains are noisy lumps of meat that are in a continual state of change, with cells being damaged by natural processes, alcohol, and boxing. While the cells in our brains and the proteins in our cells are continually changing, many of our memories persist unaffected.
  3. Biological memory is parallel and distributed – not completely distributed throughout the whole brain: there does appear to be some functional specialization – but in the parts of the brain where memories are stored, it seems that many neurons participate in the storage of multiple memories.
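The core mechanism behind these properties can be illustrated in a few lines. The sketch below is a standalone toy Hopfield network (Hebbian outer-product storage and sign-threshold updates), not the saddlecity API; all names are hypothetical:

```python
import numpy as np

def store(memories):
    """Hebbian (outer-product) rule: W = (1/n) sum_i x_i x_i^T, zero diagonal."""
    n = memories.shape[1]
    W = memories.T @ memories / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, stimulus, steps=10):
    """Synchronous sign-threshold updates until a fixed point (or step limit)."""
    x = stimulus.copy()
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1  # break ties consistently
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

# Store one +/-1 pattern, corrupt a few entries, and recover it.
rng = np.random.default_rng(0)
memory = rng.choice([-1.0, 1.0], size=32)
W = store(memory[None, :])
corrupted = memory.copy()
corrupted[:3] *= -1  # flip three bits: an error-laden cue
print(np.array_equal(recall(W, corrupted), memory))  # True
```

The corrupted stimulus falls in the basin of attraction of the stored pattern, so the dynamics converge back to it – the error-tolerant, content-addressable recall described above.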

One can use saddlecity to instantiate general Hopfield networks.

from hopfield import Hopfield

net = Hopfield() 

Memories for the network to store are supplied with the add_memories method. Here we add three memories corresponding to binary images of handwritten digits.

[Image: three binary images of handwritten digits used as memories]

Recall that this is an unsupervised neural network model, and thus requires no parameter estimation. We can now use the network to process and decode stimuli corresponding to corrupted versions of the learned memories.

[Image: corrupted stimulus presented to the network]

decoded = net.decode(stimulus) 

We can then visualize the processed stimulus, which has converged to the memory from which the stimulus was corrupted.

[Image: decoded stimulus, matching the original memory]

This example and more can be found in the examples directory.

References

Text references and resources can be found under the resources directory. We follow David MacKay's text as a guiding reference on Hopfield networks, and David Barber's as a reference on switching autoregressive processes (both texts are freely available online).

We utilize Scott Linderman's Python package to implement state space models and demonstrate procedures for Bayesian learning of parameters in these models.
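To make the switching autoregressive idea concrete, here is a hedged generative sketch of a toy switching AR(1) process in plain NumPy (this is an illustration of the model class only, not Linderman's package API; the transition matrix, coefficients, and noise scales are assumptions chosen for the example):

```python
import numpy as np

# A toy switching AR(1) process: a discrete state z_t follows a Markov
# chain, and each state selects its own AR coefficient and noise scale.
rng = np.random.default_rng(0)
P = np.array([[0.95, 0.05],          # state transition matrix
              [0.10, 0.90]])
ar_coef = np.array([0.9, -0.5])      # per-state AR(1) coefficients
noise_sd = np.array([0.1, 0.5])      # per-state noise scales

def sample_sar(T):
    """Sample T steps of the switching AR(1) process."""
    z = np.zeros(T, dtype=int)
    x = np.zeros(T)
    for t in range(1, T):
        z[t] = rng.choice(2, p=P[z[t - 1]])
        x[t] = ar_coef[z[t]] * x[t - 1] + noise_sd[z[t]] * rng.standard_normal()
    return z, x

states, observations = sample_sar(500)
```

Given only the observations, the learning problem is to infer the discrete state sequence and the per-state dynamics, which is where Bayesian parameter estimation enters.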
