thgngu / nn-bayesian-optimization

This project forked from ruishu/nn-bayesian-optimization

We use a modified neural network instead of a Gaussian process for Bayesian optimization.

License: MIT License

Languages: MATLAB 4.60%, R 1.39%, Python 94.02%

Adaptive Neural Network Representations for Parallel and Scalable Bayesian Optimization

Neural Network Bayesian Optimization is a function optimization technique inspired by the work of:

Jasper Snoek et al.
Scalable Bayesian Optimization Using Deep Neural Networks
http://arxiv.org/abs/1502.05700

This repository contains Python code, written by James Brofos and Rui Shu, for a modified approach that continually retrains the neural network underlying the optimization technique and implements the technique in a parallelized setting for improved speed.

Motivation

The success of most machine learning algorithms depends on the proper tuning of their hyperparameters. A popular technique for hyperparameter tuning is Bayesian optimization, which canonically uses a Gaussian process (GP) to interpolate the hyperparameter space. The computation time for GP-based Bayesian optimization, however, grows rapidly with sample size (the number of tested hyperparameter settings) and quickly becomes very time consuming, if not altogether intractable. Fortunately, a neural network can mimic the behavior of a Gaussian process while providing a significant reduction in computation time.
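As a rough illustration of this idea, the toy sketch below (not the repository's implementation) fits a Bayesian linear regression on top of a small neural network's hidden-layer features, in the spirit of Snoek et al.'s adaptive basis regression. All names and values here are illustrative; for brevity the hidden weights are random rather than trained, whereas the repository retrains its network as new points arrive.

# Toy sketch: Bayesian linear regression on neural-network features.
import numpy as np

rng = np.random.default_rng(0)

def hidden_features(X, W, b):
    """One tanh hidden layer used as a learned basis Phi(X)."""
    return np.tanh(X @ W + b)

# Toy 1-D objective and a few observed points
f = lambda x: np.sin(3 * x) + 0.1 * x ** 2
X = rng.uniform(-3, 3, size=(20, 1))
y = f(X).ravel()

# Illustrative hidden weights (random here, retrained in the real code)
W = rng.normal(size=(1, 50))
b = rng.normal(size=50)
Phi = hidden_features(X, W, b)

# Bayesian linear regression on the basis: posterior over output weights
alpha, beta = 1.0, 25.0                       # prior precision, noise precision
A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
m = beta * np.linalg.solve(A, Phi.T @ y)      # posterior mean weights

# Predictive mean and variance at new candidate points
X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
Phi_new = hidden_features(X_new, W, b)
mean = Phi_new @ m
var = 1.0 / beta + np.sum(Phi_new * np.linalg.solve(A, Phi_new.T).T, axis=1)
print(mean, var)

Because the linear algebra here scales with the number of basis functions rather than the number of observations, the per-iteration cost stays modest as more hyperparameter settings are evaluated.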

Dependencies

This code requires:

Code Execution

To run the code from the home directory in parallel with 4 cores, simply call mpiexec:

mpiexec -np 4 python -m mpi.mpi_optimizer
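For context, the general mpi4py pattern behind this kind of parallel evaluation looks roughly like the sketch below. This is illustrative only and is not the repository's mpi.mpi_optimizer; the objective and candidate points are toy stand-ins.

# Minimal mpi4py sketch: evaluate candidate points across ranks.
# Run with: mpiexec -np 4 python this_script.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def objective(x):
    return float(np.sum(np.asarray(x) ** 2))   # toy stand-in objective

# Root proposes candidates; each rank evaluates its share.
candidates = np.linspace(-3, 3, 8).reshape(-1, 1) if rank == 0 else None
candidates = comm.bcast(candidates, root=0)
my_points = candidates[rank::size]
my_values = [objective(x) for x in my_points]

# Gather results back to the root for the next optimization step.
all_values = comm.gather(my_values, root=0)
if rank == 0:
    print("evaluated", sum(len(v) for v in all_values), "points")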

To run a sequential version of the code:

python -m sequential.seq_optimizer

To run the Gaussian process version of Bayesian optimization:

python -m sequential.seq_gaussian_process

Sample output:

Randomly query a set of initial points...  Complete initial dataset acquired
Performing optimization... 
0.100 completion...
0.200 completion...
0.300 completion...
0.400 completion...
0.500 completion...
0.600 completion...
0.700 completion...
0.800 completion...
0.900 completion...
1.000 completion...
Sequential gp optimization task complete.
Best evaluated point is:
[-0.31226245  3.80792522]
Predicted best point is:
[-0.31226245  3.7755048 ]

Note: The code, as written, can be applied to any black-box function. A few common test functions are available in learning_objective, and the chosen function is set in hidden_function.py. To appreciate the time savings gained by the parallelized code, keep in mind that evaluating a real-world black-box function (e.g., computing the test performance of an ML algorithm for a given set of hyperparameters) takes a non-trivial amount of time.

This can be simulated by uncommenting the line: # time.sleep(2) in hidden_function.py.
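For illustration, such an expensive objective might look like the following toy stand-in. The function name, signature, and return value are hypothetical and not part of the repository; the sleep mimics the real evaluation cost discussed above.

# Hypothetical stand-in for an expensive black-box objective, e.g. training
# an ML model with given hyperparameters and returning its test error.
import time
import numpy as np

def expensive_black_box(x):
    time.sleep(2)                                       # simulate a slow evaluation
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2) + np.sin(5 * x[0]))     # toy objective value

print(expensive_black_box([-0.3, 3.8]))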

Contributors

ruishu
