This is not an issue with ADVI + `condition` syntax, but an issue with your model definition. Internally, every `~` statement is converted into `=` + extras, and similarly `.~` is converted into `.=` + extras. This means that when you write `x .~ Normal(...)`, this will result in some expression involving `x .= ...`. But, as this is just standard Julia code, this won't work if `x` is not yet defined!
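The broadcasting-assignment failure is easy to reproduce in plain Julia, outside of Turing entirely (a minimal sketch; the variable names are just for illustration):

```julia
# `.=` assigns in-place, so it requires an existing array on the left-hand side:
y = zeros(3)
y .= 1.0      # works: fills the preallocated array -> [1.0, 1.0, 1.0]

# z .= 1.0    # UndefVarError: `z` not defined -- the same failure mode
#             # that `x .~ Normal(...)` triggers when `x` was never allocated
```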
In your "old" model, i.e.

```julia
@model function model_old(x)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0.0, sqrt(s))
    x .~ Normal(m, sqrt(s))
end
```

`x` is provided as an argument, and is thus defined. In your new model, `x` is not defined before we hit the `.~` (and thus the `.=`).
Unfortunately this is not clear from the exception thrown.
The first thing to do when something fails with a model is to check whether you can run it without any inference, e.g.

```julia
julia> using Turing

julia> @model function model()
           s ~ InverseGamma(2, 3)
           m ~ Normal(0.0, sqrt(s))
           x .~ Normal(m, sqrt(s))
       end
model (generic function with 2 methods)

julia> model_instance = model();

julia> model_instance()
ERROR: UndefVarError: `x` not defined
...
```

Here we see that we get a more informative error message :)
So, the way to write a `.~` + use the new `condition` syntax is to explicitly allocate `x` before we hit the `.~`:

```julia
julia> @model function model_v2(n) # need to specify the length as input
           s ~ InverseGamma(2, 3)
           m ~ Normal(0.0, sqrt(s))
           x = Vector(undef, n)
           x .~ Normal(m, sqrt(s))
       end
model_v2 (generic function with 2 methods)

julia> model_instance = model_v2(10);

julia> model_instance()
10-element Vector{Float64}:
  1.4266801924189414
  0.8200920046396959
  0.7113019610151704
  1.231743385599
 -0.5561762370549463
  1.4947248221581675
 -0.9162359604360499
 -1.2980392578414817
 -1.3348312021509032
 -0.44033058315337587
```
Now, using a `Vector(undef, n)` is not a great idea in general, as it has element type `Any` and therefore leads to type instabilities. Following the advice in the docs (https://turinglang.org/v0.30/docs/using-turing/performancetips#ensure-that-types-in-your-model-can-be-inferred), we should instead write

```julia
@model function model_v3(n, ::Type{TV}=Vector{Float64}) where {TV}
    s ~ InverseGamma(2, 3)
    m ~ Normal(0.0, sqrt(s))
    x = TV(undef, n)
    x .~ Normal(m, sqrt(s))
end
```
Now this will be performant and usable with AD.
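To tie this back to the original question, the type-stable model can then be conditioned on data and passed to ADVI. A sketch (the data `x_obs` is hypothetical, and `ADVI(10, 1000)` means 10 samples per step for 1000 iterations):

```julia
using Turing

@model function model_v3(n, ::Type{TV}=Vector{Float64}) where {TV}
    s ~ InverseGamma(2, 3)
    m ~ Normal(0.0, sqrt(s))
    x = TV(undef, n)
    x .~ Normal(m, sqrt(s))
end

x_obs = randn(10)                            # hypothetical observations
conditioned = model_v3(10) | (; x = x_obs)   # condition the model on x
q = vi(conditioned, ADVI(10, 1000))          # run ADVI on the conditioned model
```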
As a final note: if you know that a particular variable is always going to be conditioned on, and you don't need the flexibility of easily changing its values, etc., using the "old" syntax is probably still the way to go, as it will be slightly more performant in these cases. If you want performance, `.~` is always going to be slower than, say, `x ~ filldist(Normal(m, sqrt(s)), length(x))`.
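For reference, a `filldist`-based variant of the model above might look like the following sketch. `filldist` builds a single multivariate distribution from `n` iid copies, so `x` is drawn in one `~` statement and no preallocation is needed:

```julia
using Turing

@model function model_filldist(n)
    s ~ InverseGamma(2, 3)
    m ~ Normal(0.0, sqrt(s))
    # One multivariate `~` instead of broadcasting n univariate ones:
    x ~ filldist(Normal(m, sqrt(s)), n)
end
```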
Hope this helps!
from turing.jl.