
Comments (23)

kewlcoder commented on September 28, 2024

Hi Maciej,
I am getting the same error - "ValueError: Not all estimated parameters are finite, your model may have diverged. Try decreasing the learning rate."

I have tried all of these values for the learning rate - [0.05, 0.025, 0.01, 0.001, 0.0001, 0.00001, 0.000001, 0.0000001] - but it still gives the same error.
Also, it only occurs when I add the item_features; it works fine with interactions + user_features data.

Please help!

maciejkula commented on September 28, 2024

What are the values in the feature matrices? Have you tried normalizing them (for example to be between 0 and 1)?
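A quick way to check the range of the stored values (assuming scipy sparse matrices, whose nonzero entries live in the .data attribute; the matrix names here are placeholders for your own data):

# Print the min/max of the stored (nonzero) values in each feature matrix.
print(items_features.data.min(), items_features.data.max())
print(members_features.data.min(), members_features.data.max())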

lesterlitch commented on September 28, 2024

Thanks for the response!

I have, yes, using sklearn's normalize function - i.e. normalize(raw_member_features_matrix, axis=1, norm='l1')
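For completeness, a self-contained version of that step (the raw matrix here is a random placeholder standing in for my real data):

import numpy as np
import scipy.sparse as sp
from sklearn.preprocessing import normalize

# Placeholder for the real (n_members x n_features) csr feature matrix.
raw_member_features_matrix = sp.random(100, 50, density=0.05,
                                       format='csr', dtype=np.float32)

# L1-normalize each row so a member's nonzero feature weights sum to 1.
members_features = normalize(raw_member_features_matrix, axis=1, norm='l1')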

Here are the first rows from the item and user feature csr matrices:

items_features[0,items_features[0].nonzero()[1]].todense()

matrix([[ 0.2189845 ,  0.18301879,  0.29823944,  0.19250721,  0.62113589,
          0.28761694,  0.4387733 ,  0.15228976,  0.32908452]], dtype=float32)

members_features[0,members_features[0].nonzero()[1]].todense()

matrix([[ 0.01500955,  0.00687691,  0.00488463,  0.03807613,  0.01714612,
          0.06524359,  0.01370857,  0.0203032 ,  0.0091073 ,  0.01899276,
          0.0170573 ,  0.03180252,  0.03951597,  0.03765749,  0.02067481,
          0.00863998,  0.03003284,  0.010614  ,  0.01699004,  0.02135187,
          0.02568188,  0.02606232,  0.01938645,  0.06161183,  0.0126634 ,
          0.01294042,  0.00720311,  0.030777  ,  0.01884086,  0.01178526,
          0.05592889,  0.02763181,  0.00907691,  0.01116292,  0.01343661,
          0.01717991,  0.01464464,  0.00726902,  0.01353738,  0.00541887,
          0.01728139,  0.01083446,  0.04138919,  0.01978991,  0.05642271,
          0.00835726]], dtype=float32)

maciejkula commented on September 28, 2024

Just for my understanding: what does your indexing items_features[0,items_features[0].nonzero()[1]] accomplish?

Are you expecting these matrices to be dense?

lesterlitch commented on September 28, 2024

That's right. What I'm trying to do with that is go from items_features (a csr matrix of shape (n_items, n_features)) to a dense view of the values in the first row, just to show that the non-zero values are normalized between 0 and 1.

items_features[0] gives:

<1x2790 sparse matrix of type '<type 'numpy.float32'>'
with 9 stored elements in Compressed Sparse Row format>

maciejkula commented on September 28, 2024

Cool, I understand.

Can you try reducing the learning rate and/or reducing the scale of the nonzero items even further?

maciejkula commented on September 28, 2024

You could try turning off regularization as well to try to narrow the problem down.

lesterlitch commented on September 28, 2024

Cool, will try those and come back to you. Cheers.

maciejkula commented on September 28, 2024

You're also using a lot of parallelism: this may cause problems if a lot of your users or items have the same features.

Let me know what you find!
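Putting those suggestions together, something like this (values are illustrative, not recommendations; the matrix names are placeholders for your own data, and the loss is just an example):

from lightfm import LightFM

# Conservative settings while debugging divergence: a small learning rate,
# regularization switched off, and a single thread to rule out parallelism.
model = LightFM(no_components=30,
                loss='warp',
                learning_rate=0.001,
                user_alpha=0.0,  # no user feature regularization
                item_alpha=0.0)  # no item feature regularization

model.fit(interaction_matrix,
          user_features=members_features,
          item_features=items_features,
          epochs=10,
          num_threads=1)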

maciejkula commented on September 28, 2024

Any luck?

lesterlitch commented on September 28, 2024

Unfortunately not. I tried reducing the scale further (0-0.1), reducing the learning rate (several values), removing regularisation, and running with only 4 threads. Strangely, I can get a result with either only user features or only item features, but not both. I'm not sure if this is a factor, but I have around 10x more users than items.

maciejkula commented on September 28, 2024

Can you try the newest version (1.12)? It has numerical stability improvements which may resolve your problem.
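Upgrading is just:

pip install --upgrade lightfm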

lesterlitch commented on September 28, 2024

In the new version I get:

ValueError: Not all estimated parameters are finite, your model may have diverged. Try decreasing the learning rate.

The learning rate is 0.001, and I have tried values down to 0.00001. I have normalized the features between 0 and 1, but have also tried 0-0.1 and 0-0.01.

My datasets look like:

items_features
<39267x2790 sparse matrix of type '<type 'numpy.float32'>'
with 335801 stored elements in Compressed Sparse Row format>

members_features
<305803x2790 sparse matrix of type '<type 'numpy.float32'>'
with 14772846 stored elements in Compressed Sparse Row format>

interaction_matrix
<305803x39267 sparse matrix of type '<type 'numpy.float32'>'
with 3767965 stored elements in COOrdinate format>

maciejkula commented on September 28, 2024

Hmm. I may have to have a look at your code and data. Can you email me at [email protected]?

maciejkula commented on September 28, 2024

It would be best if you could reproduce the problem using synthetic data (or a subset of your data that you don't mind sharing).
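For example, random matrices matching the shapes and element counts you reported above would be a good starting point (a sketch; this is plain uniform noise, so adjust it if your values are structured):

import numpy as np
import scipy.sparse as sp

rng = np.random.RandomState(42)

# Synthetic stand-ins matching the reported shapes and element counts.
items_features = sp.random(39267, 2790, density=335801.0 / (39267 * 2790),
                           format='csr', dtype=np.float32, random_state=rng)
members_features = sp.random(305803, 2790, density=14772846.0 / (305803 * 2790),
                             format='csr', dtype=np.float32, random_state=rng)
interaction_matrix = sp.random(305803, 39267, density=3767965.0 / (305803 * 39267),
                               format='coo', dtype=np.float32, random_state=rng)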

maciejkula commented on September 28, 2024

Is this still a problem? I'd really like to help if it is!

lesterlitch commented on September 28, 2024

Just had a chance to revisit this.

When I recreated my matrices with random floats and ints (same value scale, sparseness, and shapes), I didn't encounter the same problem.

After investigating, I discovered a bunch of empty rows in my member/item feature matrices. It seems the model can handle a few, but in my case there were 700 or so, and that was enough to push the parameters to infinity.
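For anyone else hitting this, the empty rows are easy to find (getnnz(axis=1) returns per-row stored-element counts on a csr matrix):

import numpy as np

# Indices of rows with no stored elements, i.e. members/items with no features.
empty_member_rows = np.flatnonzero(members_features.getnnz(axis=1) == 0)
empty_item_rows = np.flatnonzero(items_features.getnnz(axis=1) == 0)
print(len(empty_member_rows), len(empty_item_rows))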

Is this expected behavior?

maciejkula commented on September 28, 2024

No, this shouldn't be the case. My first suspicion was that I don't zero the representation buffers, but they are zeroed.

maciejkula commented on September 28, 2024

If you can construct a minimal test case that manifests this problem, I would be happy to have a look and solve this.

lesterlitch commented on September 28, 2024

Hi, sorry I was so slow on this. I've done a bunch more testing and found that when the user and item features are very sparse, the learning rate needs to be very small to prevent divergence. This was worse previously, possibly because of the numerical stability issues you mentioned. Anyway, after upgrading and retesting, I can get the model to fit by adjusting the learning rate. Thanks!
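In case it's useful to others, I settled on a workable rate with a simple sweep (a sketch, reusing the matrix names from above; fit raises ValueError when the parameters go non-finite):

from lightfm import LightFM

# Try progressively smaller learning rates until the fit stops diverging.
for lr in [0.05, 0.01, 0.001, 0.0001, 0.00001]:
    model = LightFM(learning_rate=lr)
    try:
        model.fit(interaction_matrix,
                  user_features=members_features,
                  item_features=items_features,
                  epochs=5)
        print('converged with learning_rate =', lr)
        break
    except ValueError:
        print('diverged with learning_rate =', lr)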

maciejkula commented on September 28, 2024

No worries, glad to see you found a solution.

dwy904 commented on September 28, 2024

Why do I get all zeros in both of the embedding matrices (user and item)?

ness001 commented on September 28, 2024

Me too. Even with an extremely small learning rate, the error still pops up. I can't figure out why.
