
KFS smoothed variance bug (?) — kfas (closed)


Comments (3)

helske commented on August 20, 2024

The variable V_mu is not the variance of the observation y_t but the variance of the mean, i.e. in the Gaussian case Var(Z_t alpha_t | Y), so there is no bug. But to be honest the documentation of KFS is a bit poor regarding that; I have to rewrite it more clearly.

Instead of manipulating the model after creating it with SSModel, you can just add NAs to the observations when constructing it via SSModel:

# append 12 missing observations so the smoother produces the 12-step-ahead forecasts
model_kfas_pred <- SSModel(c(Y, rep(NA, 12)) ~ -1 +
  SSMcustom(Z = Z, T = T, R = R, Q = Q, a1 = a1, P1 = P1), H = H)

out <- KFS(model_kfas_pred)
# variance of y_t given the data: Var(Z_t alpha_t | Y) + H
c(out$V_mu) + H
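
For example, a minimal sketch of turning this into rough 95% forecast intervals for the 12 appended NA time points, assuming a univariate Gaussian model with scalar H and that the smoothed mean is available as out$muhat:

fc_mean <- c(out$muhat)[101:112]        # smoothed mean equals the forecast mean at the NA time points
fc_var  <- c(out$V_mu)[101:112] + H     # forecast variance of y_t: Var(Z_t alpha_t | Y) + H
cbind(fit = fc_mean,
      lwr = fc_mean - qnorm(0.975) * sqrt(fc_var),
      upr = fc_mean + qnorm(0.975) * sqrt(fc_var))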

But based on your code (dlmForecast) it looks like you are trying to make predictions, so you don't actually need the smoothed variances. You can get the prediction error variances from the F component:

out$F[101:112]
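
For the appended NA time points there is no later data to condition on, so the smoothed and predicted moments coincide and, assuming F stores Z_t P_t Z_t' + H there (as suggested above), this should match the V_mu-based variance, e.g.:

all.equal(c(out$F)[101:112], c(out$V_mu)[101:112] + H)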

And you can also use the predict method for SSModel objects (I am going to add a similar method for KFS objects soon too), for example:

predict(model_kfas, n.ahead = 12, se.fit = TRUE, interval = "prediction")
Time Series:
Start = 101 
End = 112 
Frequency = 1 
          fit       lwr      upr   se.fit
101 0.3848579 -2.704122 3.473838 1.218154
102 0.3463721 -3.160791 3.853535 1.483900
103 0.3117349 -3.500678 4.124148 1.668408
104 0.2805614 -3.762244 4.323367 1.804080
105 0.2525053 -3.967708 4.472718 1.906911
106 0.2272547 -4.131369 4.585878 1.986306
107 0.2045293 -4.263064 4.672123 2.048361
108 0.1840763 -4.369872 4.738024 2.097280
109 0.1656687 -4.457044 4.788381 2.136084
110 0.1491018 -4.528570 4.826773 2.167005
111 0.1341916 -4.587527 4.855911 2.191732
112 0.1207725 -4.636326 4.877871 2.211558
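
As a small sanity check (a sketch, assuming H is the scalar observation noise variance, which appears to be 1 in this example): se.fit is the standard error of the mean, sqrt(V_mu), while the prediction interval also accounts for the observation noise, so its half-width should be about qnorm(0.975) * sqrt(se.fit^2 + H):

pr <- predict(model_kfas, n.ahead = 12, se.fit = TRUE, interval = "prediction")
# interval half-width vs. qnorm(0.975) * sqrt(se.fit^2 + H), with H = 1 assumed
all.equal(c(pr[, "upr"] - pr[, "fit"]),
          c(qnorm(0.975) * sqrt(pr[, "se.fit"]^2 + 1)))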


gragusa commented on August 20, 2024

The documentation says it is the variance of the link function, so I thought it was the prediction variance.

I am implementing univariate filtering in Julia and I am using both KFAS and dlm (but since they are both GPL, I can't look at the code to see what is going on).

A suggestion: in many applications it is useful to have the variance-covariance matrix of the prediction. It should be easy enough to make KFS return the full F matrix and not only the diagonal.

I am closing this.

helske commented on August 20, 2024

Yes, that is actually already possible with the GitHub version of the package; there is now a function mvInnovations which computes the usual multivariate prediction errors and their variances.
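
A minimal usage sketch, assuming out is a KFS output object as above and a KFAS version that includes mvInnovations:

inn <- mvInnovations(out)
str(inn$v)   # multivariate one-step prediction errors
str(inn$F)   # full prediction error covariance matrices, not only the diagonal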

