gpu running out of memory · about release · CLOSED (9 comments)

isayev commented on August 25, 2024
gpu running out of memory

Comments (9)

Mariewelt commented on August 25, 2024

Hi @jamel-mes

My guess is that your GPU doesn't have enough memory to store the model. What is your GPU model and memory?

UPD: you can check this by running the nvidia-smi command in the terminal.
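
If you prefer to do this check from Python, here is a minimal sketch using PyTorch's standard CUDA utilities (assuming a reasonably recent PyTorch build with a visible CUDA device):

import torch

# Print the GPU model and its free/total memory in GB.
if torch.cuda.is_available():
    free_bytes, total_bytes = torch.cuda.mem_get_info(0)
    print(torch.cuda.get_device_name(0))
    print("free: %.1f GB / total: %.1f GB" % (free_bytes / 1e9, total_bytes / 1e9))
else:
    print("No CUDA device is visible to PyTorch")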

jamel-mes commented on August 25, 2024

I have a GTX 1080 with 8 GB.

Mariewelt commented on August 25, 2024

The model takes ~9 GB, which is why you are getting the out-of-memory error. You can reduce the number of parameters for the generator, which is defined in this block:

hidden_size = 1500
stack_width = 1500
stack_depth = 200

but in this case you will need to train the generative model from scratch, as we provide the pre-trained model only for the configuration above.
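
For illustration only, a reduced configuration might look like the block below; the values are hypothetical, and any change here means the supplied checkpoint no longer matches, so the generator has to be retrained from scratch:

# Hypothetical smaller generator configuration (illustrative values only);
# the pre-trained weights are provided solely for the 1500/1500/200 setup above.
hidden_size = 750
stack_width = 750
stack_depth = 100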

jamel-mes commented on August 25, 2024

Great, thank you for your help!

Mariewelt commented on August 25, 2024

@jamel-mes

I think there is another thing you can try in order to squeeze into your 8 GB of memory without changing the generator. Try reducing the batch size in the "Policy gradient with experience replay" and "Policy gradient without experience replay" steps from the default 10 to 5:

# Policy gradient with experience replay, with a reduced batch size of 5
for _ in range(n_policy_replay):
    rewards.append(RL.policy_gradient_replay(gen_data, replay, threshold=threshold, n_batch=5))

# Policy gradient without experience replay, with a reduced batch size of 5
for _ in range(n_policy):
    rewards.append(RL.policy_gradient(gen_data, threshold=threshold, n_batch=5))

With this batch size, the model took 6 GB of memory on my machine.

jamel-mes commented on August 25, 2024

Decreasing the batch size does the trick!

gmseabra commented on August 25, 2024

The model takes ~9 GB, which is why you are getting the out-of-memory error

Is there a way to estimate the memory needed beforehand?

Mariewelt commented on August 25, 2024

@gmseabra Technically yes: the values are stored as float32, so you can get a rough estimate from the number of parameters times 4 bytes. That said, the easiest way to reduce memory usage is just decreasing the batch size, as we discussed above. In that scenario, you can keep using the pre-trained model and simply try several batch sizes to see what fits into your GPU memory.
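
As a rough sketch of that estimate (my_generator below is a placeholder name for the PyTorch module holding the generative model, not an identifier from the repo), the parameter footprint is the parameter count times 4 bytes per float32 value; actual usage during training is higher because of activations, gradients, optimizer state and the replay buffer:

import torch

# Rough lower bound: number of float32 parameters * 4 bytes each.
# my_generator is a placeholder for the module holding the generative model.
n_params = sum(p.numel() for p in my_generator.parameters())
print("parameters alone: %.2f GB (float32)" % (n_params * 4 / 1e9))

# Peak memory actually allocated by tensors so far; run after a few training
# steps to capture gradients and optimizer state as well.
print("peak allocated: %.2f GB" % (torch.cuda.max_memory_allocated() / 1e9))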

gmseabra commented on August 25, 2024

I was actually thinking about the possibility of checking the available GPU memory and adjusting n_batch on the fly...

But yes, reducing the batch size works for me too (on a GTX 1060 with 6 GB of memory).
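
A minimal sketch of that "adjust n_batch on the fly" idea, assuming a recent PyTorch that provides torch.cuda.mem_get_info and treating ~9 GB as the rough requirement for the default batch of 10 (the helper name and the linear scaling rule are hypothetical):

import torch

def pick_n_batch(default=10, minimum=2, full_gb=9.0):
    # Hypothetical heuristic: scale the batch size linearly with free GPU
    # memory, treating full_gb as what the default batch size needs.
    free_bytes, _ = torch.cuda.mem_get_info(0)
    return max(minimum, min(default, int(default * (free_bytes / 1e9) / full_gb)))

n_batch = pick_n_batch()

The chosen value would then be passed as n_batch to RL.policy_gradient and RL.policy_gradient_replay in the loops shown earlier in the thread.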
