rl4m / pcrlv2
An official implementation of PCRLv2 (pre-training and fine-tuning code are included).
License: MIT License
Hello author, thank you for this great article. I could not find the metric used to evaluate accuracy while running the code; did I miss it? Also, my cos_loss and local_loss are both negative. Is this normal? How can I evaluate my trained model and reproduce the metrics reported in the paper?
Dear author, do you still have the pretrained weights for MG [49], TransVW [16], Cube++ [35], and 3D-CPC [34] on BraTS in Table 7?
If so, could you send them to me via email?
My email address is [email protected]. I look forward to hearing from you.
/home/lcl/anaconda3/envs/PCRLv2/bin/python3.9 /home/lcl/PCRLv2/main.py --data path_to_processedLUNA --model pcrlv2 --b 32 --epochs 240 --lr 1e-3 --output saved_dir --n luna --d 3 --gpus 0,1 --ratio 1.0 --amp
Namespace(data='path_to_processedLUNA', model='pcrlv2', phase='pretask', b=32, epochs=240, lr=0.001, output='saved_dir', n='luna', d=3, workers=4, gpus='0,1', ratio=1.0, momentum=0.9, weight_decay=0.0001, seed=42, amp=True)
pcrlv2_luna_pretask
using the reverse_aug pretrain on luna
total train images 9968, valid images 4240
Traceback (most recent call last):
File "/home/lcl/PCRLv2/main.py", line 50, in <module>
train_pcrlv2_3d(args, data_loader)
File "/home/lcl/PCRLv2/train_3d.py", line 53, in train_pcrlv2_3d
model, optimizer = amp.initialize(model, optimizer, opt_level='O1')
NameError: name 'amp' is not defined
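The NameError suggests that train_3d.py calls NVIDIA Apex's amp without importing it (or Apex is not installed in the environment). A minimal sketch of a guarded import with a full-precision fallback; the helper name `maybe_initialize_amp` and the fallback behavior are my assumptions, not the repo's actual code:

```python
# Guarded import: NVIDIA Apex is optional, so fall back to full precision
# when it is absent instead of crashing with a NameError.
try:
    from apex import amp  # mixed-precision utilities from NVIDIA Apex
    HAS_APEX = True
except ImportError:
    amp = None
    HAS_APEX = False


def maybe_initialize_amp(model, optimizer, use_amp):
    """Wrap model/optimizer with Apex AMP only when it is available.

    Returns the (model, optimizer) pair unchanged when AMP is disabled
    or Apex is not installed.
    """
    if use_amp and HAS_APEX:
        return amp.initialize(model, optimizer, opt_level='O1')
    return model, optimizer
```

Note that Apex's `amp` is deprecated; newer PyTorch versions ship native mixed precision (`torch.cuda.amp.autocast` and `GradScaler`), which would avoid the external dependency entirely.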
Dear author, is there no code for calculating Dice coefficients in this repository?
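Since the repository does not appear to ship a Dice script, here is a minimal, framework-agnostic Dice similarity coefficient for flat binary masks; the real evaluation would apply this to the model's predicted segmentation volumes, and this sketch is not the authors' implementation:

```python
def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2*|P∩T| / (|P| + |T|) for flat binary masks.

    `pred` and `target` are equal-length sequences of 0/1 values;
    `eps` avoids division by zero when both masks are empty.
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)
```

For multi-class BraTS evaluation, the same formula is typically computed per tumor sub-region (whole tumor, tumor core, enhancing tumor) and averaged over cases.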
Namespace(data='/home/lcl/PCRLv2/dataset/BraTS_2018/MICCAI_BraTS_2018_Data_Training', model='pcrlv2', phase='finetune', b=4, epochs=100, lr=0.0001, output='./brats_finetune_weight', n='brats', d=3, workers=4, gpus='0,1', ratio=1.0, momentum=0.9, weight_decay='./pretrained_weight/simance_multi_crop_luna_pretask_1.0_240.pt', seed=42, amp=False)
pcrlv2_brats_finetune
Traceback (most recent call last):
File "/home/lcl/PCRLv2/main.py", line 46, in <module>
data_loader = get_dataloader(args)
File "/home/lcl/PCRLv2/main.py", line 17, in get_dataloader
dataloader = getattr(generator, loader_name)()
AttributeError: 'DataGenerator' object has no attribute 'pcrlv2_brats_finetune'
Process finished with exit code 1
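For context, the AttributeError comes from the getattr-based dispatch in main.py: it builds a loader method name from the model, task, and phase flags and looks it up on DataGenerator, which evidently has no `pcrlv2_brats_finetune` method. A hypothetical sketch of that dispatch with a clearer failure message; the DataGenerator method shown is an assumption for illustration:

```python
class DataGenerator:
    """Stand-in for the repo's generator; only one loader is sketched."""

    def pcrlv2_luna_pretask(self):
        return "luna_pretask_loader"


def get_dataloader(generator, model, task, phase):
    """Resolve a loader method named '<model>_<task>_<phase>' on the generator."""
    loader_name = f"{model}_{task}_{phase}"  # e.g. 'pcrlv2_brats_finetune'
    loader_fn = getattr(generator, loader_name, None)
    if loader_fn is None:
        available = [m for m in dir(generator) if not m.startswith('_')]
        raise AttributeError(
            f"DataGenerator has no loader '{loader_name}'; available: {available}")
    return loader_fn()
```

Under this reading, the fix is either to add the missing `pcrlv2_brats_finetune` method to DataGenerator or to pass flag values whose combination matches an existing loader.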
Thank you for the excellent work and for releasing it.
I wonder how the train txt is generated for LUNA, because in the code you already divide the train/val set by different "subset".
In addition, I notice that some hyperparameters differ between the paper and the code (e.g., the batch size for 2D and the weight decay), and the "patience" parameter in the fine-tuning code is missing. Could you confirm their values? Thanks!
The files needed to create the dataset for finetuning on LUNA are missing (luna_finetune_test.txt and luna_finetune_val.txt). Is it possible to upload them?
There is code for fine-tuning (train + val) on BraTS, but there is no testing code to evaluate the model on the test set using the Dice similarity coefficient, as done in the paper. Could the authors please provide it?
Thanks for your nice work!
How can we conduct semi-supervised finetuning on the LUNA dataset?
I get the following error when running the command:
python main.py --data /mnt/5C5C25FB5C25D116/data/BraTS2018 --model pcrlv2 --phase finetune --lr 1e-4 --output ./brats_finetune_weight --weight ./pretrained_weight/simance_multi_crop_luna_pretask_1.0_240.pt --n brats --d 3 --gpus 0,1,2,3 --b 4 --ratio 1.0
Thanks for your excellent work! How can we perform semi-supervised fine-tuning on the LUNA dataset? And how can we obtain the evaluation metrics after fine-tuning?
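One plausible mechanism for semi-supervised fine-tuning is to keep only a fraction of the labeled training files; this is my assumption about what the repo's `--ratio` flag controls, not confirmed behavior, and the helper below is illustrative only:

```python
import random


def subsample_labeled(train_list, ratio, seed=42):
    """Deterministically keep roughly `ratio` of the labeled training files.

    A fixed seed makes the subset reproducible across runs; at least one
    sample is always kept so the loader never receives an empty list.
    """
    rng = random.Random(seed)
    k = max(1, int(len(train_list) * ratio))
    return rng.sample(train_list, k)
```

Under this assumption, `--ratio 0.1` would fine-tune on 10% of the labeled scans while the pre-trained weights supply the unsupervised signal.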
Dear authors:
This is great work; I have learned a lot from it.
However, I found some differences between the paper and the source code.
Specifically, the paper says the initial learning rate is 1e-2, while in the code it is 1e-3.
Which one is correct?
Thank you in advance.
Thanks for your impressive work!
After running the 3D pre-training on the LUNA dataset, would you mind providing the code for the LUNA classification task used in evaluation?