sheffieldml / deepGPy
Deep GPs with GPy
License: BSD 3-Clause "New" or "Revised" License
I would like to reproduce the results in figure 4 of the paper "Nested Variational Compression in Deep Gaussian Processes" using the "Nested Deep GPs.ipynb" notebook, but model.plot() does not show any figure. What am I doing wrong?
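Not knowing the exact setup, a common cause is a missing matplotlib display step rather than a deepGPy bug. A minimal sketch (the plotted data below is just a stand-in, not the notebook's model):

```python
# Hedged guess: GPy/deepGPy plotting draws onto a matplotlib figure but may
# not force it to appear. In a notebook, run `%matplotlib inline` in the
# first cell; in a script, call plt.show() after model.plot().
import matplotlib
matplotlib.use("Agg")  # headless backend so this sketch runs anywhere
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])  # stand-in for model.plot()
plt.show()                     # needed outside inline/interactive backends
```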
On the line model1 = build_supervised(X, Y, Qs=(1,), Ms=(15,15)) in the notebook example, I get an error that comes down to:
in compute_KL_term in layers.py, line 167, the sizes of self.q_of_U_precision and self.Kmmi[:,:,None] do not align
self.q_of_U_precision is (15,15,15) and
self.Kmmi[:,:,None] is (15,15,1)
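For what it's worth, those two shapes do broadcast under plain elementwise NumPy operations, so the misalignment presumably comes from an operation that requires exact sizes (an einsum or a compiled loop) rather than from broadcasting itself. A quick check:

```python
import numpy as np

# Shapes from the traceback: q_of_U_precision is (15, 15, 15) and
# Kmmi[:, :, None] is (15, 15, 1). Elementwise ops broadcast the trailing 1,
# so a plain subtraction or product would succeed.
A = np.zeros((15, 15, 15))
B = np.zeros((15, 15, 1))
C = A - B
print(C.shape)  # (15, 15, 15)
```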
For one hidden layer, the example provides the following:
model1 = build_supervised(X, Y, Qs=(1,), Ms=(15,15))
model1.optimize('bfgs', max_iters=1000, messages=1)
What would be the code required to perform stochastic optimization for this example?
model1.optimize_SGD()
doesn't work; it raises NameError: global name 'SGD' is not defined
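deepGPy does not appear to ship an optimize_SGD method (hence the NameError). Until one exists, a generic SGD loop can be written against any gradient function; everything below is a self-contained sketch on a toy objective, not deepGPy's API — you would substitute the model's objective gradient (evaluated on a minibatch) and write the updated parameters back into the model yourself:

```python
import numpy as np

def sgd(grad_fn, theta0, lr=0.01, n_iters=1000):
    """Plain SGD: theta <- theta - lr * grad_fn(theta).

    grad_fn returns the gradient of the objective at the current
    parameters, possibly estimated from a minibatch of data.
    """
    theta = np.array(theta0, dtype=float)
    for _ in range(n_iters):
        theta -= lr * grad_fn(theta)
    return theta

# Toy check on the quadratic bowl f(x) = ||x||^2, whose gradient is 2x:
theta = sgd(lambda t: 2.0 * t, np.array([5.0, -3.0]), lr=0.1, n_iters=100)
```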
Thanks
Hi guys,
I have tested deepGPy on my regression problem. The data has 15 feature columns. I tried both a one-hidden-layer and a two-hidden-layer deep GP; both produce almost identical mean predictions. The same problem was also tested with full GPR and sparse GPR, and those results looked quite reasonable.
The network configuration for one-hidden layer is shown as follows:
--> Z is a m x 15 inducing set, m=20
D = X.shape[1] # which is 15
layer_X = InputLayerFixed(X, input_dim=D, output_dim=D, kern=GPy.kern.RBF(D, ARD=False), Z=Z, beta=0.01, name='layerX')
layer_Y = ObservedLayer(Y, input_dim=D, output_dim=1, kern=GPy.kern.RBF(D, ARD=False), Z=Z, beta=0.01, name='layerY')
layer_X.add_layer(layer_Y)
m = ColDeep([layer_X, layer_Y])
layer_X.Z.fix()
--> Predict
mu = m.predict_means(X_test)[0]
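Hard to say without seeing the data, but one thing worth checking is feature scaling: with 15 raw features and a non-ARD RBF kernel sharing a single lengthscale, unstandardized inputs can dominate the fit and leave every architecture predicting roughly the same mean. A small preprocessing sketch (a suggestion, not a deepGPy requirement):

```python
import numpy as np

def standardize(X, Y):
    """Zero-mean, unit-variance scaling per column. RBF kernels with one
    shared lengthscale are sensitive to differing feature scales."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    Ys = (Y - Y.mean(axis=0)) / Y.std(axis=0)
    return Xs, Ys

# Demonstration on synthetic data with the same shapes as the problem above:
rng = np.random.RandomState(0)
Xs, Ys = standardize(rng.rand(100, 15) * 50.0, rng.rand(100, 1) * 3.0)
```

Setting ARD=True in GPy.kern.RBF is another option worth trying, so that each of the 15 features gets its own lengthscale.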
Is there something I did wrong in setting up my deep GP network?
Many thanks!
If I run step_fn_demo.py as is, there is an issue in special_einsum:
Traceback (most recent call last):
File "step_fn_demo.py", line 52, in
m = ColDeep([layer_X, layer_Y])
File "/Users/danielmarthaler/GPy/GPy/core/parameterization/parameterized.py", line 27, in call
self.parameters_changed()
File "/Users/danielmarthaler/deepGPy/coldeep.py", line 19, in parameters_changed
self.layers[0].feed_forward()
File "/Users/danielmarthaler/deepGPy/layers.py", line 354, in feed_forward
[l.feed_forward(self.q_of_X_out) for l in self.lower_layers]
File "/Users/danielmarthaler/deepGPy/layers.py", line 461, in feed_forward
self.previous_layer.feed_backwards(X_mean_grad, X_var_grad)
File "/Users/danielmarthaler/deepGPy/layers.py", line 364, in feed_backwards
self.backpropagated_gradients(dL_dmean, dL_dvar)
File "/Users/danielmarthaler/deepGPy/layers.py", line 223, in backpropagated_gradients
dL_dS = special_einsum(self.psi1Kmmi, dL_dvar)
File "/Users/danielmarthaler/deepGPy/special_einsum.py", line 54, in special_einsum
weave.inline(code, ['N','M','D','res','A','B'], type_converters=weave.converters.blitz, support_code='#include <omp.h>', **opts)
File "/usr/local/lib/python2.7/site-packages/scipy/weave/inline_tools.py", line 366, in inline
**kw)
File "/usr/local/lib/python2.7/site-packages/scipy/weave/inline_tools.py", line 502, in compile_function
exec('import ' + module_name)
File "", line 1, in
ImportError: dlopen(/Users/danielmarthaler/.cache/scipy/python27_compiled/sc_9354f24db217fdec00d0dd1964056e7a5.so, 2): Symbol not found: __ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9_M_createERmm
Referenced from: /Users/danielmarthaler/.cache/scipy/python27_compiled/sc_9354f24db217fdec00d0dd1964056e7a5.so
Expected in: flat namespace
in /Users/danielmarthaler/.cache/scipy/python27_compiled/sc_9354f24db217fdec00d0dd1964056e7a5.so
This basically comes down to my compiler not finding omp.h (I think), which is a known issue elsewhere in the GPy stack.
In coldeep.py there is a line:
from GPy.util.choleskies import indexes_to_fix_for_low_rank
But within the package there is a choleskies.py file containing the same function. Do you mean to use the GPy version?
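One low-risk workaround while the intended source is unclear is an import fallback that tries one candidate module and then the other. The generic pattern (demonstrated below with stdlib modules standing in for the local choleskies.py and GPy.util.choleskies, which are not imported here):

```python
import importlib

def import_with_fallback(module_names, attr):
    """Return `attr` from the first module in `module_names` that both
    imports and exposes it; mirrors 'try the bundled choleskies, else
    fall back to GPy.util.choleskies'."""
    for name in module_names:
        try:
            mod = importlib.import_module(name)
            return getattr(mod, attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError("%s not found in any of %s" % (attr, module_names))

# Demonstration: the first candidate is missing, so the fallback is used.
sqrt = import_with_fallback(["no_such_module", "math"], "sqrt")
```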
Hi, when running the iPython example, deepGPy hangs on not finding GPy.util.choleskies. What version of GPy is required here?
Hi guys,
I was trying to use deepGPy on OSX and noticed that the code attempts to use OpenMP, which Apple's stock toolchain (LLVM) does not support, so it throws an error that <omp.h> is not found. So I used the Homebrew clang-omp compiler wrapper and manually copied the header file and some libraries into the include path of my Python environment.
The code (I used 'step_fn_demo.py') finally works without 'extra_link_args' and 'libraries'. I was wondering, is this the right way to set up deepGPy on OSX?
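For anyone else hitting this on OSX, the steps above can be sketched roughly as follows (the clang-omp formula and paths reflect an older Homebrew setup and are assumptions; newer toolchains install libomp and use stock clang instead):

```shell
# Install an OpenMP-capable compiler; Apple's stock clang lacks <omp.h>.
brew install clang-omp            # on newer setups: brew install libomp

# Point the build (scipy.weave in this case) at the OpenMP compiler
# before running the demo, so the inline C kernels compile.
export CC=clang-omp
export CXX=clang-omp++

python step_fn_demo.py
```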
Many thanks.
Is it possible to use this package for classification or is it restricted to regression at the moment?