
Comments (2)

L-M-Sherlock commented on June 25, 2024

But the default difficulty decreased monotonically in your case. I will test your collection tomorrow.


kuroahna commented on June 25, 2024

Hm, actually, you might be right. A large majority of my cards have ~250% ease.


And looking at the interval history if I keep pressing Good:

first rating: 3
rating history: 3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3
interval history: 0,4,10,25,56,119,240,460,845,1493,2551,4227,6812,10707,16451,24757
difficulty history: 0,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2,1.2

The change between each interval is:

0 -> 4 = N/A
4 -> 10 = 10 / 4 = 2.5 = 250%
10 -> 25 = 25 / 10 = 2.5 = 250%
25 -> 56 = 56 / 25 = 2.24 = 224%
56 -> 119 = 119 / 56 = 2.125 = 212.5%
119 -> 240 = 240 / 119 = 2.017 = 201.7%
...

This seems correct, since most of my cards have 250% ease and I have about an 88% retention rate for my young cards. The FSRS4Anki algorithm starts with a 250% increase in the interval and then slowly decreases the growth factor as I keep pressing Good.
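For reference, here is a quick sketch that reproduces those percentages from an interval history. It is nothing from the scheduler itself, just the ratio of consecutive intervals:

// Ratio of each interval to the previous one, as a percentage.
// slice(2) skips the 0-day entry and the first real interval,
// which has no predecessor to compare against (the "N/A" row above).
const intervalGrowth = (intervals) =>
    intervals.slice(2).map((ivl, i) => (ivl / intervals[i + 1] * 100).toFixed(1) + "%");

console.log(intervalGrowth([0, 4, 10, 25, 56, 119, 240]));
// [ '250.0%', '250.0%', '224.0%', '212.5%', '201.7%' ]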

With the FSRS4Anki Optimizer v1.4.3, however, I got these results:

let defaultDifficulty = 3.835;
let defaultStability = 3.2191;
let difficultyDecay = -0.9131;
let stabilityDecay = -0.0872;
let retrievabilityFactor = 1.2571;
let increaseFactor = 3.2025;
let lapsesBase = -0.0493;

stability, difficulty, lapses
     3.22        3.83       0
     5.77        3.84       0
    10.62        3.84       0
    19.06        3.83       0
    32.91        3.83       0
    55.86        3.83       0
    93.04        3.83       0
   152.10        3.83       0
   244.58        3.83       0
   387.58        3.83       0
   605.15        3.83       0
   931.47        3.83       0
  1415.08        3.83       0
  2123.76        3.83       0
  3150.51        3.83       0
interval history: 0,3,6,11,19,33,56,93,152,245,388,605,931,1415,2124,3151
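Side note: if I read the scheduler right, those intervals are just the stabilities rounded to whole days, since with the R = 0.9^(t/S) forgetting curve the next interval is stability * ln(requestRetention) / ln(0.9), which equals the stability at the default requestRetention of 0.9. A quick check, assuming that relation holds:

// Assumed relation: nextInterval = stability * ln(requestRetention) / ln(0.9),
// which reduces to round(stability) when requestRetention is 0.9.
const requestRetention = 0.9;
const nextInterval = (stability) =>
    Math.round(stability * Math.log(requestRetention) / Math.log(0.9));

console.log([3.22, 5.77, 10.62, 19.06, 32.91].map(nextInterval));
// [ 3, 6, 11, 19, 33 ] -- matches the interval history above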

The change between each interval here is:

0 -> 3 = N/A
3 -> 6 = 6 / 3 = 2 = 200%
6 -> 11 = 11 / 6 = 1.83 = 183%
11 -> 19 = 19 / 11 = 1.73 = 173%
19 -> 33 = 33 / 19 = 1.74 = 174%
33 -> 56 = 56 / 33 = 1.70 = 170%
...

With these parameters, the interval growth starts at 200%, not 250%. It does seem like these new parameters are correct; I guess the larger learning rate allowed the optimizer to converge faster.
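Running the same ratio check from above on this interval history confirms the 200% start:

console.log(intervalGrowth([0, 3, 6, 11, 19, 33, 56]));
// [ '200.0%', '183.3%', '172.7%', '173.7%', '169.7%' ]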

