quantecon / expectations.jl

Expectation operators for Distributions.jl objects

License: MIT License

Topics: julia, statistics, distributions, quadrature

expectations.jl's Introduction


Expectations

Installation (for Julia v1.0 and up):

pkg> add Expectations

See the Pkg docs for more details.

This is a package designed to simplify the process of taking expectations of functions of random variables.

Expectation Operator

The key object is the expectation function, which returns an operator:

using Expectations, Distributions

dist = Normal()
E = expectation(dist)
E(x -> x) # ≈ 0.0

For convenience, the operator can also be applied directly to a function instead of being cached:

expectation(x->x^2, dist)

It can also act as a linear operator on vectors of function values at the distribution's nodes:

using LinearAlgebra

dist = Normal()
E = expectation(dist)
x = nodes(E)
f(x) = x^2
E * f.(x) == dot(f.(x), weights(E)) # true

Random Variables

The underlying distributions are objects from Distributions.jl (currently <:UnivariateDistribution).

Starting with v1.3.0, we also support mixture models.

Quadrature Algorithms

We support different types of Gaussian quadrature (Gauss-Hermite, Gauss-Legendre, Gauss-Laguerre, etc.) based on the distribution, as well as some methods with user-defined nodes (e.g., trapezoidal integration).

We have rules for the following distributions:

  • Normal
  • Chisq
  • LogNormal
  • Exponential
  • Beta
  • Gamma/Erlang
  • Uniform
  • Continuous Univariate (compact; generic fallback)
  • Continuous Univariate (no restriction; approximates with quantile grid)
  • Discrete

See docs for more info.

Mixture Models

We also support mixture models, e.g.

using Expectations, Distributions

d = MixtureModel([Uniform(), Normal(), Gamma()]);
E = expectation(d);
E(x -> x) # 0.5000000000000016

The MixtureExpectation objects support most of the same behavior as the individual IterableExpectations.

2E(x -> x) # 1.000000000000003
weights(E) # [1/3, 1/3, 1/3]
expectations(E) # component expectations

expectations.jl's People

Contributors

arnavs, eirikbrandsaas, fuzhiyu, github-actions[bot], jlperla, juliatagbot, nosferican, oxinabox, sebastiangomez87, tbeason, yolsever


expectations.jl's Issues

Chi Distribution

Similar quadrature rule to #34 (Gauss-Laguerre), just do a change of variables.
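A minimal sketch of that change of variables (chi_expectation is hypothetical, not a package function; it assumes the existing Chisq rule and that the n keyword is forwarded as in the other examples on this page):

using Expectations, Distributions

# X ~ Chi(k) iff X^2 ~ Chisq(k), so substitute X = sqrt(Y) with Y ~ Chisq(k)
function chi_expectation(f, k::Integer; n = 32)
    E = expectation(Chisq(k); n = n) # reuse the existing Chisq rule
    return E(y -> f(sqrt(y)))
end

chi_expectation(identity, 3) # ≈ mean(Chi(3)) ≈ 1.596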

[bug] Weird warning during precompilation + Cannot precompile on nightly.

Hey,

During precompilation, it warns that a method is defined twice in the same file:

┌ WilliamsonTransform [48feb556-9bdd-43a2-8e10-96100ec25e22]
│  WARNING: Method definition *(Real, Expectations.MixtureExpectation{ET, WT} where WT where ET) in module Expectations at [...]/mixturemodels.jl:34 overwritten at [...]/mixturemodels.jl:15.
│    ** incremental compilation may be fatally broken for this module **
└  

And indeed, the same code appears on lines 15 and 34 of mixturemodels.jl; the second copy just has an extra comment. Should I care?


After putting my work on GitHub at https://github.com/lrnv/WilliamsonTransform.jl and trying to use CI, I get caught by this on Julia nightly, where apparently it is straight up an error:

WARNING: Method definition *(Real, Expectations.MixtureExpectation{ET, WT} where WT where ET) in module Expectations at /home/runner/.julia/packages/Expectations/7lvqQ/src/mixturemodels.jl:15 overwritten at /home/runner/.julia/packages/Expectations/7lvqQ/src/mixturemodels.jl:34.
ERROR: LoadError: Method overwriting is not permitted during Module precompile.
Stacktrace:
 [1] top-level scope
   @ ~/.julia/packages/Expectations/7lvqQ/src/mixturemodels.jl:30
 [2] include(mod::Module, _path::String)
   @ Base ./Base.jl:497
 [3] include(x::String)
   @ Expectations ~/.julia/packages/Expectations/7lvqQ/src/Expectations.jl:3
 [4] top-level scope
   @ ~/.julia/packages/Expectations/7lvqQ/src/Expectations.jl:14
 [5] include
   @ Base ./Base.jl:497 [inlined]
 [6] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::String)
   @ Base ./loading.jl:2311
 [7] top-level scope
   @ stdin:3
in expression starting at /home/runner/.julia/packages/Expectations/7lvqQ/src/mixturemodels.jl:30
in expression starting at /home/runner/.julia/packages/Expectations/7lvqQ/src/Expectations.jl:3
in expression starting at stdin:3

See the full log here: https://github.com/lrnv/WilliamsonTransform.jl/actions/runs/6654122587/job/18081547724

Gauss-Legendre quadrature for arbitrary bounded continuous univariate distributions

For this, write the Gaussian algorithm for the ContinuousUnivariateDistribution fallback to use Gauss-Legendre quadrature. This works for any bounded-support distribution, even if we want to do further specializations.

Note that if we set up the types correctly, then concrete subtypes of ContinuousUnivariateDistribution (or whatever it is called in Distributions.jl) would pick up their own specializations. In particular, Normal, LogNormal, etc.

For the quadrature approach itself:

  • Need to throw a method-error if the support of the distribution is unbounded.
  • See http://quantecon.github.io/QuantEcon.jl/latest/api/QuantEcon.html#QuantEcon.qnwlege-Tuple{Int64,Real,Real} for the call in QuantEcon, which can be swapped out later.

It may make sense to use this as the default algorithm for all continuous univariate distributions, and not even implement the adaptive quadrature (a sketch follows).
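A minimal sketch of that fallback, written against FastGaussQuadrature.jl rather than the QuantEcon call mentioned above (legendre_expectation is hypothetical):

using Distributions, FastGaussQuadrature, LinearAlgebra

# E[f(X)] for any bounded-support continuous X, by rescaling
# Gauss-Legendre nodes from [-1, 1] to [a, b]
function legendre_expectation(f, d::ContinuousUnivariateDistribution; n = 50)
    a, b = minimum(d), maximum(d)
    (isfinite(a) && isfinite(b)) || throw(MethodError(legendre_expectation, (f, d)))
    x, w = gausslegendre(n)               # nodes/weights on [-1, 1]
    t = @. (b - a) / 2 * x + (a + b) / 2  # rescale nodes to [a, b]
    return (b - a) / 2 * dot(w, f.(t) .* pdf.(d, t))
end

legendre_expectation(x -> x^2, Beta(2, 3)) # ≈ 0.2, the second moment of Beta(2, 3)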

Remove deprecations from the unit tests in preparation for v1.0

I don't think there are any deprecations in the current /src, but there seems to be at least one in the tests. Running Pkg.test("Expectations") gives output that includes:

WARNING: importing deprecated binding Base.e into Expectations.
┌ Warning: `linspace(start, stop, length::Integer)` is deprecated, use `range(start, stop=stop, length=length)` instead.
│   caller = top-level scope at none:0
└ @ Core none:0
┌ Warning: `linspace(start, stop, length::Integer)` is deprecated, use `range(start, stop=stop, length=length)` instead.
│   caller = top-level scope at none:0
└ @ Core none:0
Test Summary:          | Pass  Total
Iterable distributions |   38     38
   Testing Expectations tests passed

I am not sure where the `e` is used, but the `linspace` is easy enough to replace, using Compat as required.

Generic types for the `AbstractIterableExpectation` type and `OrdinalRange` for regular grids.

This is useful for #6 primarily, but also #4.

I did a little research for the abstract types we would want for the weights and nodes, and the ranges used in some of this.

The AbstractRange type provides the function step(r).

The abstract type OrdinalRange seems to be what we are thinking about for representing ranges for grids. The functions we can rely on in its interface are:

  • first(r) to get the first element of r
  • last(r) to get the last element of r
  • length(r) to get the number of elements in r.
  • iterate(r) to iterate it (which is typically just used in loops or higher order functions using loops).

However, there is also LinRange <: AbstractRange, which seems to have the same set of methods but would be excluded if we dispatched only on OrdinalRange.

So, my suggestion is that dispatching relies on AbstractRange for now, but assumes these all exist, and is likely to change to OrdinalRange if LinRange is deprecated or changed.

What this means is that if we are designing the interface with a fixed grid (e.g. Trapezoidal), then we can just pass in that range, which encompasses all information about the grid (see the sketch after this list). For example,

  • expectation(d, 0.0:0.1:1.0) for some d a univariate, continuous distribution or expectation(d, linspace(0.0, 1.0, 10))
  • This would dispatch on the AbstractRange for the 2nd parameter, with the Trapezoidal as the default.
  • We know that this is a constant-step-size algorithm, so we can use the trapezoidal rule for a regular grid (i.e., the 1, 2, 2, ..., 1 weighting). It should throw an ArgumentError if `first(r) != minimum(d)` or `last(r) != maximum(d)`. We could also define Simpson's 3/8 rule and other versions relatively easily.
  • On the other hand, if we were passed expectation(d, [0.0, 0.1, 0.2, 0.33, 0.4, 0.8, 1.0]) then we could dispatch on AbstractArray instead and use the irregular trapezoidal rules. And if the user wrote expectation(d, collect(0.0:0.1:1.0)) then it would become an array, and there would be no way to turn this back into a regular grid.
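A minimal sketch of that dispatch (trapezoidal_weights is hypothetical; the package's eventual rules may differ):

using Distributions, LinearAlgebra

# Regular grid: constant step, so the 1, 2, 2, ..., 2, 1 weighting
function trapezoidal_weights(d::ContinuousUnivariateDistribution, r::AbstractRange)
    (first(r) == minimum(d) && last(r) == maximum(d)) ||
        throw(ArgumentError("grid endpoints must match the support of d"))
    w = fill(step(r), length(r))
    w[1] /= 2; w[end] /= 2
    return w .* pdf.(d, r)
end

# Irregular grid: per-interval trapezoid weights
function trapezoidal_weights(d::ContinuousUnivariateDistribution, x::AbstractVector)
    w = zeros(length(x))
    for i in 1:length(x)-1
        h = x[i+1] - x[i]
        w[i] += h / 2; w[i+1] += h / 2
    end
    return w .* pdf.(d, x)
end

w = trapezoidal_weights(Uniform(0, 1), 0.0:0.1:1.0)
dot(w, (0.0:0.1:1.0) .^ 2) # ≈ E[X^2] = 1/3 (trapezoid error gives ≈ 0.335)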

Add the convenience call with a function

One small thing first: the idiom in Julia is generally for simple functions to avoid `function`, `return`, and `end`. That is, instead of

function expectation(dist::D, alg::Type{FiniteDiscrete} = FiniteDiscrete; kwargs...) where {D <: DiscreteUnivariateDistribution}
    return _expectation(dist, alg; kwargs...)
end

you would do something like

expectation(dist::D, alg::Type{FiniteDiscrete} = FiniteDiscrete; kwargs...) where {D <: DiscreteUnivariateDistribution} = _expectation(dist, alg; kwargs...)

Now, I think we can add the overload of expectation which takes in a function. Basically, this just builds the operator and immediately applies it to the function.

I believe you will need one of these for every expectation method, since otherwise the default parameters would not be handled correctly. Not a big deal. For example, for the method above it is something like:

expectation(f::Function, dist::D, alg::Type{FiniteDiscrete} = FiniteDiscrete; kwargs...) where {D <: DiscreteUnivariateDistribution} = expectation(dist, alg; kwargs...)(f)

Design the basic interface with swapping out algorithms and parameters

cc: @Nosferican @sglyon

The idea is to create a simple library that will generate a higher-order function for calculating expectations (i.e., mapping to the expectation operator in the math). This could become a very complicated library if it (efficiently) dealt with all sorts of conditional and multivariate expectations, so let's keep things simple and start with univariate ones.

A starting point for the implementation is https://github.com/jlperla/ECON407_2018/blob/master/notebooks/expectations_integration.ipynb, which I used in rewriting some of the QuantEcon Julia lecture notes (e.g., my optimal growth rewrite in https://github.com/jlperla/ECON407_2018/blob/master/notebooks/optgrowth.ipynb).

As an example, the current implementation is something like

using QuantEcon, Distributions, QuadGK, NamedTuples, BenchmarkTools
function expectation(dist::Distributions.Normal{T}; n=20, kws...) where {T}
    nodes, weights = qnwnorm(n, [mean(dist)],[var(dist)])
    return f -> dot(f.(nodes), weights)  #Returns a function (of a function f)
end

#Then, for arbitrary distributions with a pdf function, we can use Gauss-Kronrod adaptive quadrature.
#This is very precise, but does more calculations by default
function expectation(dist::D; kws...) where {D <: ContinuousUnivariateDistribution}
    #Uses Gauss-Kronrod adaptive quadrature
    return f ->
        quadgk( x-> f(x) * pdf(dist,x) , minimum(dist), maximum(dist); kws...)[1] 
end

#For a discretely valued expectation, could just do the sums as a dot-product
function expectation(dist::D; kws...) where {D <: DiscreteUnivariateDistribution}
    @assert hasfinitesupport(dist) #Otherwise might need to modify to use a truncated pdf.    
    distsupport = support(dist)
    return f -> dot(pdf.(dist, distsupport), f.(distsupport))  
 end

#For convenience, version that takes a function as the first parameter
#Less useful if you want to cache the nodes/weights, etc., but notationally simple.
function expectation(f, dist::D; kws...) where {D <: UnivariateDistribution}
    E = expectation(dist; kws...) #Gets the appropriate expectation operator for dist
    return E(f) #calls the expectation with f
end

To use it, the current code looks like this.

#Can get an E and use it
dist = Normal(0.1, 2)
E = expectation(dist)

meanx = E(x -> x)
@show meanx

#Can define and pass in functions as well, not just anonymous
g(x) = x

varx = E(x -> x^2) - E(g)^2 #i.e. calculate the variance
@show varx

#Or can call `expectation` directly with a function as the first parameter
meanx2 = expectation(x -> x, dist)
@show meanx2

The problem with this simple design is that it makes it difficult to separate out the quadrature algorithm from the distribution itself. What about this design instead for the interface?

abstract type QuadratureAlgorithm end
struct AdaptiveQuadrature <: QuadratureAlgorithm end  # e.g. Gauss-Kronrod
struct GaussianQuadrature <: QuadratureAlgorithm end  # e.g. n-point Gaussian, Gauss-Laguerre, etc.
struct FiniteDiscrete <: QuadratureAlgorithm end      # baseline for discrete, finite distributions

Then, we can use the variable argument keywords for parameters, providing defaults as required. For example, something like

function expectation(dist::D, q::AdaptiveQuadrature; kws...) where {D <: ContinuousUnivariateDistribution}
    #Uses Gauss-Kronrod adaptive quadrature
    return f ->
        quadgk( x-> f(x) * pdf(dist,x) , minimum(dist), maximum(dist); kws...)[1] 
end

function expectation(dist::Distributions.Normal{T}, q::GaussianQuadrature; n=20, kws...) where {T}
    nodes, weights = qnwnorm(n, [mean(dist)],[var(dist)])
    return f -> dot(f.(nodes), weights)
end
function expectation(dist::Distributions.Normal{T}, q::Q = GaussianQuadrature(); kwargs...) where {T, Q <: QuadratureAlgorithm} #default
    expectation(dist, q; kwargs...)
end

etc.

Should the Newton-Cotes rules work with an infinite-support distribution?

The docs had the following, which I removed:

If nodes are given, it will calculate using Newton-Cotes quadrature (e.g. trapezoidal):

dist = Normal() # the docs example used a Normal
x = -10:0.2:10
f(x) = x^2
E = expectation(dist, x)
3 * E(f) == 3 * E * f.(x)

But is that valid? The dist in that case was a Normal. If it does work, then I think we need to be more careful and reject nodes whose bounds don't match the support, etc.

JuliaCon Comments

From @theogf:

Expectations.jl being a (very nice) wrapper around quadrature methods, it would feel quite natural to add Monte Carlo integration as well. My point is that for D > 3, quadrature methods become very inefficient. Having a unified approach for expectations that contains both MC and quadrature would be awesome in my opinion.

From Chris:

make sure to check out the DiffEqUncertainty talk about probabilistic optimization

Feature request: Addition of expectations

This is a follow-up to #42. I just realized that MixtureExpectation implements addition of expectations:

julia> e1 = expectation(Normal(1.0))
julia> e2 = expectation(Uniform())
julia> e1pe2 = MixtureExpectation([e1,e2],[1.0,1.0])
julia> e1pe2(identity) == e1(identity) + e2(identity)
true

So adding methods for +(e1::Expectation, e2::Expectation) would be both elegant from a mathematical perspective (especially since * is already implemented) and also quite practical/useful!
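A minimal sketch of the proposed method, delegating to MixtureExpectation with unit weights exactly as above (assuming both operands are IterableExpectations):

using Expectations, Distributions
import Base: +

# Hypothetical method, not yet in the package
+(e1::Expectations.IterableExpectation, e2::Expectations.IterableExpectation) =
    MixtureExpectation([e1, e2], [1.0, 1.0])

e1 = expectation(Normal(1.0))
e2 = expectation(Uniform())
(e1 + e2)(identity) ≈ e1(identity) + e2(identity) # true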

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Convert to using FastGaussQuadrature.jl

See https://github.com/ajt60gaibb/FastGaussQuadrature.jl

The idea is to remove the QuantEcon dependency when possible (and when the underlying tests are passing).

https://github.com/pkofod/DistQuads.jl/blob/master/src/DistQuads.jl has some of the algebra completed, which you can copy. The key ones to replace in QuantEcon are:

Conditional expectation with markov chains

See https://lectures.quantecon.org/jl/ifp.html

A key type of discrete expectation is something like
E(f(X_{t+1}) | X_t = x_t) for a discrete distribution.

My thought is that this can be efficiently handled in the interface by something along the lines of

P = # ... get the Markov chain in one way or another
E = conditional_expectation(P)

# Now this can be called as
# E(f(X_{t+1}) | X_t = 2)
E(x -> f(x), 2)
# Or, if you have the vector of nodes `x` matching the iteration:
E(2) * f.(x)

It may also be possible to hook into the MarkovChain features in QuantEcon, such as https://github.com/QuantEcon/QuantEcon.jl/blob/785c71b99b04c49c8638d499591584c1e33f3745/src/markov/mc_tools.jl#L19-L30, for the "state vector" instead of giving the ordinal position as above.
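A minimal sketch of the proposed interface (conditional_expectation and ConditionalExpectation are hypothetical; row i of the transition matrix P is the conditional distribution given X_t = i, and the operator form below returns the conditional expectation for every state at once):

using LinearAlgebra

struct ConditionalExpectation{TP <: AbstractMatrix}
    P::TP
end

conditional_expectation(P) = ConditionalExpectation(P)

# E(f(X_{t+1}) | X_t = i), with states identified by ordinal position
(CE::ConditionalExpectation)(f, i::Integer) = dot(CE.P[i, :], f.(1:size(CE.P, 2)))

# Linear-operator form against precomputed function values f.(x)
Base.:*(CE::ConditionalExpectation, fvals::AbstractVector) = CE.P * fvals

P = [0.9 0.1; 0.2 0.8]
E = conditional_expectation(P)
E(x -> x^2, 2)  # 0.2 * 1 + 0.8 * 4 = 3.4
E * [1.0, 4.0]  # the same value for state 2, plus state 1's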

Test Case Pass

@jlperla I took a stab at implementing the test case you set up on Discourse. This is pretty rough, but gets the job done. Let me know what you think.

using Test # `Base.Test` before Julia 0.7

# Type defs. 
abstract type AbstractValue end 
struct Value1 <: AbstractValue end 
struct Value2 <: AbstractValue end 

abstract type AbstractAlgorithm end
struct Algorithm1 <: AbstractAlgorithm end
struct Algorithm2 <: AbstractAlgorithm end

# Initialize objects. 
v1 = Value1()
v2 = Value2()
a1 = Algorithm1()
a2 = Algorithm2()

# Write function methods
function f(v::Value1; alg = a2, kwargs...)
    return f(v, alg; kwargs...)
end

function f(v::Value1, alg::Algorithm2; N = 5, kwargs...)
end 

function f(v::Value1, alg::Algorithm1; N = 10, kwargs...)
end

function f(v::Value2; alg = a1, kwargs...)
    return f(v, alg; kwargs...)
end 

function f(v::Value2, alg::Algorithm1; N = 10, kwargs...)
end 

function f(v::Value2, alg::Algorithm2; kwargs...)
    throw(MethodError(f, (v, alg))) # this combination is ill-defined
end

# Run tests. 
@testset "Dispatch Behavior" begin 
    @test f(v1) == f(v1, a2, N=5) #i.e. defaults to Algorithm 2 with N=5 as the default
    @test f(v1, N = 7) == f(v1, a2, N=7) #Can change the default value
    @test f(v1, a1) == f(v1, a1, N=10) #can use Algorithm1 (with the different default)
    @test f(v2) == f(v2, a1, N=10) #i.e. uses algorithm 1 by default
    @test f(v2, N = 8) == f(v2, a1, N=8) #Can change the default value
    @test_throws MethodError f(v2, a2) #Not defined! Should throw
end 

performance of IterableExpectation

I find the performance of IterableExpectation leaves quite some room for improvement.

using Expectations, Distributions, BenchmarkTools
g(a) = a^2
E = expectation(Normal(), Gaussian; n = 10)
@btime g(1.0) #   0.015 ns (0 allocations: 0 bytes)
@btime $E($g) # 247.751 ns (6 allocations: 256 bytes)

it's >1000x slower than it should be.

The source code of expectation is as follows:

function (e::IterableExpectation{NT,WT})(f::Function; kwargs...) where {NT,WT}
    applicable(f, rand(nodes(e))) || throw(ArgumentError("The function doesn't accept elements from the distribution's support."))
    return dot(f.(nodes(e)), weights(e))
end

That is the implementation in the source. I guess we could at least kill the unnecessary allocation by using a loop instead of broadcasting into a temporary array. Also, since it is often called in the innermost loop of performance-sensitive code, it is probably better not to check whether f is applicable on every call; at the least we could avoid the rand.

Here is my own version used in projects:

function (e::IterableExpectation{NT,WT})(f::Function, ::Nothing) where {NT,WT}
    nvec, wvec = nodes(e), weights(e)
    E = f(nvec[1]) * wvec[1]
    @inbounds for i in 2:length(nvec)
        n, w = nvec[i], wvec[i]
        E += f(n) * w
    end
    return E
end

@btime $E($g, nothing) # 9.722 ns (0 allocations: 0 bytes)

The performance makes much more sense now.

Pkg warning about LinearAlgebra dependency

I get the following warning when running using Expectations:

julia> using Expectations
[ Info: Precompiling Expectations [2fe49d83-0758-5602-8f54-1f90ad0d522b]
┌ Warning: Package Expectations does not have LinearAlgebra in its dependencies:
│ - If you have Expectations checked out for development and have
│   added LinearAlgebra as a dependency but haven't updated your primary
│   environment's manifest file, try `Pkg.resolve()`.
│ - Otherwise you may need to report an issue with Expectations
└ Loading LinearAlgebra into Expectations from project dependency, future warnings for Expectations are suppressed.

I think it's because LinearAlgebra is listed under [extras] rather than under [deps]:

[extras]
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"

It seems a simple fix to add the dependency -- is there any reason why this shouldn't be done?

Erlang

The Erlang distribution is a special case of the Gamma distribution with an integer first (shape) parameter. The existing implementation that works for Gamma should then also work for Erlang, correct? Right now it just fails.

I naively encountered this because the sum of Exponential draws has an Erlang distribution.
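In the meantime, a minimal sketch of the workaround (erlang_expectation is hypothetical, and the keyword forwarding is assumed):

using Expectations, Distributions

# Erlang(α, θ) has the same density as Gamma(α, θ), so reuse the Gamma rule
function erlang_expectation(f, d::Erlang; kwargs...)
    α, θ = params(d)
    return expectation(Gamma(α, θ); kwargs...)(f)
end

erlang_expectation(identity, Erlang(3, 2.0)) # ≈ mean(Erlang(3, 2.0)) = 6.0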

Multiplication by an expectation object as inner product?

Think through a common example with #6

using Distributions
x_max = 2.0
d = Truncated(Exponential(0.1), 0.0, x_max) #truncated exponential
x = [linspace(0.0, 1.0, 10); linspace(1.01, x_max, 20)] #match support
E = expectation(d, x, :trapezoidal)

#Calculate expectation E(x^2)
h(x) = x^2
val = E(h)

#Or using the weights yourself.
val = dot(weights(E), h.(x))

Now, in algorithms we often have the h(x) function pre-calculated on the set of nodes x. In that case, consider the notation

hvals = h.(x) #presumably calculated elsewhere in the algorithm
E_val = E * hvals
E_val == dot(weights(E), h.(x))

To do this for an expectation operator (subtype of AbstractExpectation), you just need to add something like

Base.:*(E::AbstractExpectation, v::AbstractVector) = dot(weights(E), v)

Way to get the weights and nodes for an expectation operator?

It is very convenient to do the following

E = expectation(Normal(0.0, 1.0))
E(x-> x^2)

However, sometimes you would prefer to instead directly get the nodes and weights for the expectation operator (which is then a linear operator).

Because of this, instead of expectation(...) returning a function, perhaps it should return an operator "type" which is callable (and hence can act as a function) but has other methods. In particular,

E = expectation(Normal(0.0, 1.0))
E(x-> x^2)
nodes(E) #returns a vector of the underlying nodes associated with the operator
weights(E) #returns the weights.

Benchmark qnwdist implementations

There was an idea of writing this function as linspace(0.0, 1.0, n+2)[2:n+1] versus linspace(q0, qN, n). There are also two different methods in the QuantEcon.jl library for the spacing. At some point, we should benchmark these and see how they perform.

JOSS Reviewer Feedback

  • Add overview of Python, R equivalents.
  • Add documentation (and potentially methods) for generic fallback on various Distributions.jl extensions (e.g. TuringLang/Bijectors.jl)
  • Contextualize inside world of PPLs.

Also:

  • Add Measurements.jl etc. style stuff to paper.

Feature request: Finite mixture distributions

I really love this package!

One feature that would be great to have would be to extend expectation to mixture models. For example, the following cannot be integrated by this package right now:

 MixtureModel([Uniform(), Normal()])

But each individual component can be integrated, so it is possible to manually average the two (a sketch follows). It would be great if this could be automated through the expectation operator.
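A minimal sketch of that manual averaging, using the components and probs accessors from Distributions.jl (mixture support was in fact added to the package later; see above):

using Expectations, Distributions, LinearAlgebra

d = MixtureModel([Uniform(), Normal()])
Es = expectation.(components(d))             # one operator per component
E_mix(f) = dot(probs(d), [E(f) for E in Es]) # probability-weighted average
E_mix(identity) # 0.5 * 0.5 + 0.5 * 0.0 = 0.25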

Some readings on Gaussian quadrature

https://en.wikipedia.org/wiki/Gaussian_quadrature

There are a few key points about Gaussian quadrature:

  • These are able to integrate functions with infinite support on one or both ends. This is important when finding the expectations of distributions defined on $(-\infty, \infty)$, $[0, \infty)$, etc., so you will need to take care in choosing the right one.
  • See https://juliastats.github.io/Distributions.jl/latest/univariate.html and in particular the maximum and minimum of the support
  • Sometimes the weights of the PDF and the weights of the quadrature nodes are easily calculated together (or the libraries do it all at once). What is important for calculating the expectations is the ability to cache the weighting vector and nodes, so you can use the appropriate code/library to do it (see the sketch below).
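For instance, a minimal sketch of caching a Gauss-Hermite rule for a Normal via the standard change of variables (not the package's internals):

using FastGaussQuadrature, LinearAlgebra

z, w = gausshermite(20)         # rule for the weight function e^(-x^2)
μ, σ = 1.0, 2.0
nodes = @. μ + sqrt(2) * σ * z  # change of variables for N(μ, σ²)
wts = w ./ sqrt(π)              # normalized so the weights sum to 1

E(f) = dot(wts, f.(nodes))      # cached operator: reuse for many f
E(identity)  # ≈ μ = 1.0
E(x -> x^2)  # ≈ μ² + σ² = 5.0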

[feature request] Generic fallback for non-supported distributions?

It would be neat if there were a generic fallback for non-supported distributions. I can come up with a simple solution for continuous distributions:

# fallback for ContinuousUnivariateDistribution
using Cubature # pquadrature is Cubature.jl's one-dimensional routine
pquadrature(x -> f(x) * Distributions.pdf(X, x), minimum(X), maximum(X))[1]

but I need to think a bit more for discrete infinite supports.

Edit: There is a generic fallback in Distributions.jl, implemented here: https://github.com/JuliaStats/Distributions.jl/blob/master/src/functionals.jl

So maybe this is just an interface issue: Expectations.jl should not throw when it does not know the distribution, but simply use the Distributions.jl one. Thanks @devmotion for the link.

Bug when computing simple expectations

I have encountered a bug today:

using Distributions, Expectations
E = expectation(Normal(1, 1))
E(x -> x)

julia> E(x -> x)
5.778544334733716e-17

where it clearly should be 1. I can have a look tomorrow to figure out what is happening.

Expectation with pre-existing grid and the trapezoidal rule

Something that comes up a lot in macro is the need to calculate expectations where a (often irregular) grid is predefined. For example, something like

using Distributions
x_max = 2.0
d = Truncated(Exponential(0.1), 0.0, x_max) #truncated exponential
x = [linspace(0.0, 1.0, 10); linspace(1.01, x_max, 20)] #match support
E = expectation(d, x, :trapezoidal)

#Calculate expectation E(x^2)
h(x) = x^2
val = E(h)

#Or using the weights yourself.
val = dot(weights(E), h.(x))

Note that there could be dispatching in expectation(d, x, :trapezoidal) depending on whether x is an AbstractVector vs. a StepRangeLen or UnitRange. In the latter case, it could provide the (simpler) regular-grid trapezoidal rule, whose weights should probably be a lazy iterator rather than an actual vector.

AppVeyor Support

It would be nice to configure AppVeyor on this repository for Windows CI. Not urgent by any means, but then we would have coverage for all three OSes.

Ensure coverage of QuantEcon lecture notes expectations

@Nosferican I tried to go through all of the expectations used in the QuantEcon lectures, so here is a list of the things that @arnavsood could target to make sure the library can handle lecture note rewrites.

A few clear priorities:

  • Arbitrary univariate discrete distributions (relatively easy, with an ArgumentError if unbounded)
  • univariate normal distribution with Gaussian quadrature (I think that you can start with the QuantEcon quadrature nodes/etc.)
  • univariate lognormal distribution with Gaussian quadrature
  • univariate quadrature on an arbitrary distribution with bounded support (i.e. Gauss-Legendre in general). See #11
  • univariate quadrature for a beta distribution (i.e. a specialization instead of relying on #11). See http://quantecon.github.io/QuantEcon.jl/latest/api/QuantEcon.html#QuantEcon.qnwbeta-Tuple{Int64,Real,Real}, which provides weights and nodes for the beta distribution.
  • conditional distributions from a Markov chain, i.e. see #12, although not worth it quite yet.

I looked in the lectures for key places that QuantEcon expectations come up, to make sure these can be fulfilled:

Compat.LinearAlgebra is deprecated

Got this warning when using Expectations for the first time (at precompilation):

WARNING: Compat.LinearAlgebra is deprecated, use LinearAlgebra instead.
  likely near ~/.julia/packages/Expectations/RkwWa/src/Expectations.jl:7

Uniform Distribution

This will be the vanilla rule for compact domains, e.g. Gauss-Legendre quadrature.

Convenience expectation on functions is not working with Normals?

On v0.7

julia> using Expectations

julia> dist = Normal()
Normal{Float64}(μ=0.0, σ=1.0)

julia> E = expectation(dist);
julia> E(x-> x)
2.889272167366858e-17

julia> expectation(x->x, dist)
ERROR: MethodError: no method matching expectation(::getfield(Main, Symbol("##23#24")), ::Normal{Float64})
Closest candidates are:
  expectation(::D<:Distribution{Univariate,Continuous}, ::NT) where {D<:Distribution{Univariate,Continuous}, NT} at C:\Users\jlperla\.julia\packages\Expectations\062z\src\iterable.jl:133
  expectation(::D<:Distribution{Univariate,Continuous}, ::NT, ::Type{#s13} where #s13<:ExplicitQuadratureAlgorithm; kwargs...) where {D<:Distribution{Univariate,Continuous}, NT} at C:\Users\jlperla\.julia\packages\Expectations\062z\src\iterable.jl:133
Stacktrace:
 [1] top-level scope at none:0

I would add these sorts of things to the tests. Should https://github.com/econtoolkit/Expectations.jl/blob/master/src/iterable.jl#L273 perhaps be calling _expectation instead of expectation?

I would remove the _expectation export from https://github.com/econtoolkit/Expectations.jl/blob/master/src/Expectations.jl since it is a private function. Finally, are you sure we want to do the following? If the tests pass, I think it may be better to let the users of the library deal with their namespaces.

using Reexport
@reexport using Distributions

Improve Test Coverage

We could do better on this. Maybe a good RA project would be to write tests (I think for the trapezoidal methods, and the algorithm dispatch?) and check them into the repo through PR.

cc @jlperla

[feature request] The Pareto case

The Pareto case could be handled by using the fact that if $X$ is Pareto-distributed with minimal value $m$ and shape index $\alpha$, then

$$Y = \log(X) - \log(m)$$ is exponentially distributed with rate parameter $\alpha$. Therefore, for any function $f$,

$$\mathbb{E}(f(X)) = \mathbb{E}(f(m e^Y))$$

and we can re-use the quadrature scheme for $Y$, which follows an $\mathrm{Exponential}(\alpha)$ distribution.
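A minimal sketch of that substitution (pareto_expectation is hypothetical; note that Distributions.jl parameterizes Exponential by its scale, i.e. 1/α):

using Expectations, Distributions

# E[f(X)] for X ~ Pareto(α, m) via Y = log(X) - log(m) ~ Exponential with rate α
function pareto_expectation(f, α, m; n = 32)
    E = expectation(Exponential(1 / α); n = n) # reuse the Exponential rule
    return E(y -> f(m * exp(y)))
end

pareto_expectation(identity, 3.0, 1.0) # ≈ mean(Pareto(3.0, 1.0)) = 1.5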

Finish up linearity steps

To finish #7 we need to add scalar multiplication and vector addition. I think these all just apply the corresponding operation directly to the weights (a minimal sketch follows).
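A minimal sketch of the scalar-multiplication half, assuming an IterableExpectation can be rebuilt from its nodes and weights (the package's eventual implementation may differ):

using Expectations, Distributions
import Base: *

# Hypothetical method: scalar multiplication just scales the weights
*(c::Real, E::Expectations.IterableExpectation) =
    Expectations.IterableExpectation(nodes(E), c .* weights(E))

E = expectation(Normal())
(2 * E)(x -> x^2) # ≈ 2.0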

Update Readme

Add using Expectations etc.
General syntax: expectation(x->f(x), Distribution(θ))

using Expectations, Distributions;
E=expectation;
const μ=0.0; const σ=1.0;
E(x->x, Normal(μ,σ))
E(x->exp(x), Normal(μ,σ))
exp(μ+(σ^2)/2.) # analytic check: E[e^X] = e^(μ+σ²/2)
E(x->x^2, Normal(μ,σ))
E(x->x^2+exp(x), Normal(μ,σ))
#
D = [Uniform(), Normal(), Gamma()];
d = MixtureModel(D);
E.(x->x, D)
E(x->x, d)     #expectation of a mixture
( E(x->x,D[1]) + E(x->x,D[2]) + E(x->x,D[3]) )/3
#
E(x->x^2, d) #expectation of a transformation of a mixture

# truncated dist
d=truncated(LogNormal(μ,σ),0.0,5.4)
E(x->x, d)
E(x->x^2, d)
# truncated mixture
d=truncated(MixtureModel(D),0.0,5.4)
E(x->x, d)
E(x->x^2, d)

If you agree, I can submit a PR.
