Comments (3)

FriedaSmith commented on June 8, 2024

I ran the Python command and it also reported an error. The dataset path is as follows:

(MedNeXt) zw@T640:~/myData/Code/study/MedNeXt$ ls "/home/zw/myData/Code/study/nnUNet_raw_data_base/nnUNet_raw_data/Dataset001_Tr"
dataset.json  imagesTr  imagesTs  infers  labelsTr
(MedNeXt) zw@T640:~/myData/Code/study/MedNeXt$ python "/home/zw/myData/Code/study/MedNeXt/nnunet_mednext/experiment_planning/nnUNet_plan_and_preprocess.py" -t 1
Traceback (most recent call last):
  File "/home/zw/myData/Code/study/MedNeXt/nnunet_mednext/experiment_planning/nnUNet_plan_and_preprocess.py", line 170, in <module>
    main()
  File "/home/zw/myData/Code/study/MedNeXt/nnunet_mednext/experiment_planning/nnUNet_plan_and_preprocess.py", line 102, in main
    task_name = convert_id_to_task_name(i)
  File "/home/zw/myData/Code/study/MedNeXt/nnunet_mednext/utilities/task_name_id_conversion.py", line 51, in convert_id_to_task_name
    raise RuntimeError("Could not find a task with the ID %d. Make sure the requested task ID exists and that "
RuntimeError: Could not find a task with the ID 1. Make sure the requested task ID exists and that nnU-Net knows where raw and preprocessed data are located (see Documentation - Installation). Here are your currently defined folders:
nnUNet_preprocessed=/home/zw/myData/Code/study/nnUNet_preprocessed
RESULTS_FOLDER=/home/zw/myData/Code/study/nnUNet_trained_models
nnUNet_raw_data_base=/home/zw/myData/Code/study/nnUNet_raw_data_base
If something is not right, adapt your environemnt variables.
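
For reference, the traceback points at convert_id_to_task_name in nnunet_mednext/utilities/task_name_id_conversion.py. Below is a minimal sketch of what that lookup roughly does, using the paths from the log above (the real implementation also looks in the other folders listed in the error message and may differ in details):

import os

# nnU-Net v1 style lookup: find a raw-data folder whose name starts with "Task" + the zero-padded ID.
nnUNet_raw_data = "/home/zw/myData/Code/study/nnUNet_raw_data_base/nnUNet_raw_data"
task_id = 1
prefix = "Task%03d" % task_id  # -> "Task001"

candidates = [d for d in os.listdir(nnUNet_raw_data)
              if os.path.isdir(os.path.join(nnUNet_raw_data, d)) and d.startswith(prefix)]
print(candidates)  # empty here, even though the dataset folder listed above exists

So the RuntimeError is raised because nothing under nnUNet_raw_data matches the Task001 prefix.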

saikat-roy commented on June 8, 2024

Hey @FriedaSmith. I haven't had the time to clean up the preprocessing or nnUNet training yet, or even release all the code needed to do the preprocessing.

But maybe I can add something I noticed in your setup. Your dataset seems to be organized in the nnUNet (v2) format, which is what the main branch of nnUNet supports at the moment. My model was trained with nnUNet (v1), which you will find in a branch of the nnUNet repo (https://github.com/MIC-DKFZ/nnUNet/tree/nnunetv1).
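
If the folder naming is the issue, the sketch below shows the rename that would bring it in line with the v1 TaskXXX_NAME convention. The target name Task001_Tr is only an example, and a rename alone may not be enough, since the dataset.json format also differs between v1 and v2:

import os

# Rename the v2-style raw-data folder to the v1-style name that the task-ID lookup expects.
raw_base = "/home/zw/myData/Code/study/nnUNet_raw_data_base/nnUNet_raw_data"
src = os.path.join(raw_base, "Dataset001_Tr")  # v2-style name from the log above
dst = os.path.join(raw_base, "Task001_Tr")     # v1-style TaskXXX_NAME convention (example name)

# Only rename if the source exists and the target does not, to avoid clobbering anything.
if os.path.isdir(src) and not os.path.exists(dst):
    os.rename(src, dst)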

You are free to use my model inside nnUNet v2 if you have experience with replacing the model inside an nnUNetTrainer. Or you can wait a little longer for me to clean up and release the data preparation and training code as part of this repository.
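
For the first option, the general pattern is sketched below against the v1-style trainer API from the nnunetv1 branch linked above (nnUNetTrainerV2.initialize_network); the v2 trainer exposes an analogous hook. The stand-in network and its constructor are placeholders for illustration only; you would substitute the actual MedNeXt class and its arguments:

import torch
import torch.nn as nn
from nnunet.training.network_training.nnUNetTrainerV2 import nnUNetTrainerV2
from nnunet.utilities.nd_softmax import softmax_helper


class TinyStandInNet(nn.Module):
    """Placeholder for the real MedNeXt network, just to show where it plugs in."""
    def __init__(self, in_channels, num_classes):
        super().__init__()
        self.conv = nn.Conv3d(in_channels, num_classes, kernel_size=1)
        self.do_ds = False  # nnUNetTrainerV2 toggles deep supervision through this attribute

    def forward(self, x):
        out = self.conv(x)
        # nnUNetTrainerV2 expects a list of outputs when deep supervision is on.
        return [out] if self.do_ds else out


class nnUNetTrainerV2_CustomNet(nnUNetTrainerV2):
    def initialize_network(self):
        # Swap the default Generic_UNet for your own architecture (e.g. MedNeXt).
        self.network = TinyStandInNet(self.num_input_channels, self.num_classes)
        if torch.cuda.is_available():
            self.network.cuda()
        # Mirrors the stock initialize_network so sliding-window inference applies softmax.
        self.network.inference_apply_nonlin = softmax_helper

Note that the v1 command-line tools find trainer classes by name inside nnunet.training.network_training, so a custom trainer has to live somewhere that search can reach.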

saikat-roy commented on June 8, 2024

Code and instructions to recreate the MICCAI 2023 experiments in nnUNet (v1) are now available. Please reopen this issue if you have more questions.
