HDF reading error about vamigaweb HOT 5 CLOSED

Vweber73 commented on July 22, 2024
HDF reading error


Comments (5)

mithrendal commented on July 22, 2024

That is because I compiled it with a fixed memory size for efficiency reasons, and the fixed heap/stack memory is too small.

I just changed the memory settings to mem=1280MB and stack=256MB, and now this HDF boots.

Pushed it to the UAT.
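
For reference, the change described above is made at link time; a minimal sketch, assuming a plain emcc invocation (the actual vAmigaWeb build script may differ, and older Emscripten releases spell the stack option -s TOTAL_STACK instead of -s STACK_SIZE):

# hypothetical link step illustrating the fixed-size settings above
emcc vamiga.o -o vAmiga.html -s INITIAL_MEMORY=1280MB -s STACK_SIZE=256MB

With fixed sizes, the whole 1280 MB is reserved when the page loads, whether or not a large HDF is ever inserted.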


Vweber73 commented on July 22, 2024

Great, works fine now, many thanks!


mithrendal commented on July 22, 2024

Hm, in an older Firefox (version 78) it gives me this:
Uncaught (in promise) RuntimeError: Aborted(CompileError: wasm validation error: at offset 24835: initial memory size too big). Build with -s ASSERTIONS=1 for more info.

I think older browsers have some sort of limit ... maybe 1 GB? Unfortunately, for the system.hdf from the provided link, which is 209 MB, vAmigaWeb wants exactly 1042 MB, only a bit above that limit ...
BTW: why is this HDF so big? Where is the upper limit? I have set the stack size to 256 MB now...

Also strange that the stack size has to be exactly the size of the HDF... maybe I did something wrong here, some sort of passing the big file via the stack?

extern "C" const char* wasm_loadFile(char* name, Uint8 *blob, long len)
{
...
  if (HDFFile::isCompatible(name)) {
    printf("is hdf\n");
    wrapper->amiga->powerOff();
    HDFFile hdf{blob, len};   // construct an HDF image from the in-memory blob
    wrapper->amiga->configure(OPT_HDC_CONNECT, /*hd drive*/ 0, /*enable*/ true);
    wrapper->amiga->hd0.init(hdf);   // attach the image to hard drive 0
    wrapper->amiga->powerOn();
    return "";
  }

@dirkwhoffmann did I maybe screw it up here in my code above, so that it passes the big HDF via the stack?
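
For context, a minimal sketch of how the JavaScript side typically hands a picked file to such an export, assuming Module._malloc/_free are exported (the real vAmigaWeb glue code and names other than wasm_loadFile may differ); the blob is copied into the wasm heap, so the file data itself does not start out on the stack:

// hypothetical TypeScript caller
async function loadFileIntoEmulator(module: any, file: File): Promise<void> {
  const bytes = new Uint8Array(await file.arrayBuffer());
  const ptr = module._malloc(bytes.length);   // allocate on the wasm heap
  module.HEAPU8.set(bytes, ptr);              // copy the file into wasm memory
  const err = module.ccall('wasm_loadFile', 'string',
      ['string', 'number', 'number'], [file.name, ptr, bytes.length]);
  module._free(ptr);                          // release the temporary copy
  if (err) console.warn(err);
}

If the data arrives this way, any large stack usage would have to come from a copy made further down the C++ call chain, which is exactly the question raised above.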

I just pushed a new version to the UAT which starts with a 512 MB heap and automatically grows memory when needed, reducing the memory footprint for smaller disks ... Unfortunately the stack size can not auto-grow...

This wasm auto-grow feature is said to have a performance impact ... I will test a little bit and report back here.
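
For reference, growth is also a link-time switch; a sketch under the same assumptions as the flags above. Note that with growth enabled, Emscripten caps the heap at 2 GB by default unless -s MAXIMUM_MEMORY is raised:

# hypothetical flags for the auto-grow experiment described above
emcc vamiga.o -o vAmiga.html -s INITIAL_MEMORY=512MB -s ALLOW_MEMORY_GROWTH=1 -s STACK_SIZE=256MB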


mithrendal commented on July 22, 2024

UPDATE: auto-grow has no noticeable impact on performance.

Then I think it is best to stick with 512 MB initial memory and allow it to grow when more is needed... the stack unfortunately does not grow ... so I go with a fixed 256 MB stack, which means HDFs bigger than 256 MB will run into the error again...

Maybe @dirkwhoffmann knows more about how to avoid the stack usage when inserting an HDF file. If it can be avoided, even larger HDFs would be possible, and at the same time we could keep the memory consumption at a minimum of 256 MB when no HDF is used...


mithrendal commented on July 22, 2024

We reworked the memory handling and avoided the stack usage. The memory footprint is now down to 320 MB. The stack is reduced to 32 MB and could probably be set even lower now, but let's stick with 32 MB for now.

Memory grows as needed, with no upper boundary anymore, i.e. no limit on HDF size anymore.
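
The final setup described above corresponds roughly to flags like these (again a sketch, not the verbatim build script; -s MAXIMUM_MEMORY=4GB is the wasm32 ceiling and is what "no upper boundary" amounts to in practice):

# hypothetical final configuration: small fixed stack, growing heap
emcc vamiga.o -o vAmiga.html -s INITIAL_MEMORY=320MB -s STACK_SIZE=32MB -s ALLOW_MEMORY_GROWTH=1 -s MAXIMUM_MEMORY=4GB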

