Hello, I'm trying to train BRISQUE, but I haven't been successful.
Since the repository has no BRISQUE training example, I used options/train/train_general_iqa.yml as a base. When I run it, I get the error "ValueError: optimizer got an empty parameter list". I would like to understand whether I need to change something in the training settings so that the script can collect the optim_params.
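For context, here is a minimal reproduction of the PyTorch error itself, independent of pyiqa (a sketch; `NoParamNet` is a hypothetical module standing in for a network with no learnable weights, which matches the "with parameters: 0" line in my log):

```python
import torch

# Hypothetical module with no learnable weights, mirroring the
# "Network: BRISQUE, with parameters: 0" line in the log below.
class NoParamNet(torch.nn.Module):
    def forward(self, x):
        return x

net = NoParamNet()
trainable = [p for p in net.parameters() if p.requires_grad]
print(len(trainable))  # 0

# Constructing any torch optimizer from an empty parameter list
# raises the same ValueError as in my traceback.
try:
    torch.optim.SGD(trainable, lr=0.001)
except ValueError as e:
    print(e)
```

So it looks like the BRISQUE network exposes no trainable parameters for the optimizer to pick up.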
My log:
IQA-PyTorch >>> python pyiqa/train.py -opt options/train/BRISQUE/train_brisque_livec.yml
Disable distributed.
Path already exists. Rename it to /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA_archived_20221215_141200
2022-12-15 14:12:00,387 INFO:
Version Information:
PyTorch: 1.13.0+cu117
TorchVision: 0.14.0+cu117
2022-12-15 14:12:00,387 INFO:
name: debug_BRISQUE_NR_IQA
model_type: GeneralIQAModel
num_gpu: 1
manual_seed: 123
save_final_results_path: ./experiments/BRISQUE_results.txt
datasets:[
train:[
name: livechallenge
type: LIVEChallengeDataset
dataroot_target: ./datasets/LIVEC/
meta_info_file: ./datasets/meta_info/meta_info_LIVEChallengeDataset.csv
split_file: ./datasets/train_split_info/livechallenge_seed123.pkl
split_index: 2
augment:[
hflip: True
random_crop: 448
]
img_range: 1
use_shuffle: True
num_worker_per_gpu: 4
batch_size_per_gpu: 8
dataset_enlarge_ratio: 1
prefetch_mode: None
phase: train
]
val:[
name: livechallenge
type: LIVEChallengeDataset
dataroot_target: ./datasets/LIVEC
meta_info_file: ./datasets/meta_info/meta_info_LIVEChallengeDataset.csv
split_file: ./datasets/train_split_info/livechallenge_seed123.pkl
split_index: 2
phase: val
]
]
network:[
type: BRISQUE
]
path:[
pretrain_network_g: None
strict_load_g: True
resume_state: None
experiments_root: /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA
models: /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA/models
training_states: /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA/training_states
log: /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA
visualization: /home/bilekig/Documents/Projects/IQA-PyTorch/experiments/debug_BRISQUE_NR_IQA/visualization
]
train:[
optim:[
type: SGD
lr: 0.001
momentum: 0.9
weight_decay: 0.0005
]
optim_finetune:[
type: Adam
lr: 1e-05
weight_decay: 0.0005
]
scheduler:[
type: MultiStepLR
milestones: [1000]
gamma: 1
]
scheduler_finetune:[
type: MultiStepLR
milestones: [1000]
gamma: 1
]
total_iter: 12000
finetune_start_iter: 6000
warmup_iter: -1
mos_loss_opt:[
type: MSELoss
loss_weight: 1.0
reduction: mean
]
]
val:[
val_freq: 7
save_img: False
pbar: True
key_metric: srcc
metrics:[
srcc:[
type: calculate_srcc
]
plcc:[
type: calculate_plcc
]
krcc:[
type: calculate_krcc
]
]
]
logger:[
print_freq: 1
save_checkpoint_freq: 7
save_latest_freq: 500.0
use_tb_logger: True
wandb:[
project: None
resume_id: None
]
]
dist_params:[
backend: nccl
port: 29500
]
dist: False
rank: 0
world_size: 1
auto_resume: False
is_train: True
root_path: /home/bilekig/Documents/Projects/IQA-PyTorch
2022-12-15 14:12:00,389 INFO: Dataset [LIVEChallengeDataset] - livechallenge is built.
2022-12-15 14:12:00,390 INFO: Training statistics:
Number of train images: 930
Dataset enlarge ratio: 1
Batch size per gpu: 8
World size (gpu number): 1
Require iter number per epoch: 117
Total epochs: 103; iters: 12000.
2022-12-15 14:12:00,392 INFO: Dataset [LIVEChallengeDataset] - livechallenge is built.
2022-12-15 14:12:00,392 INFO: Number of val images/folders in livechallenge: 232
2022-12-15 14:12:00,392 INFO: Network [BRISQUE] is created.
2022-12-15 14:12:00,392 INFO: Network: BRISQUE, with parameters: 0
2022-12-15 14:12:00,392 INFO: BRISQUE()
2022-12-15 14:12:00,392 INFO: Network [BRISQUE] is created.
2022-12-15 14:12:00,392 INFO: Loss [MSELoss] is created.
Traceback (most recent call last):
File "pyiqa/train.py", line 242, in <module>
train_pipeline(root_path)
File "pyiqa/train.py", line 135, in train_pipeline
model = build_model(opt)
File "/home/bilekig/Documents/Projects/IQA-PyTorch/pyiqa/models/__init__.py", line 27, in build_model
model = MODEL_REGISTRY.get(opt['model_type'])(opt)
File "/home/bilekig/Documents/Projects/IQA-PyTorch/pyiqa/models/general_iqa_model.py", line 33, in __init__
self.init_training_settings()
File "/home/bilekig/Documents/Projects/IQA-PyTorch/pyiqa/models/general_iqa_model.py", line 54, in init_training_settings
self.setup_optimizers()
File "/home/bilekig/Documents/Projects/IQA-PyTorch/pyiqa/models/general_iqa_model.py", line 69, in setup_optimizers
self.optimizer = self.get_optimizer(optim_type, optim_params, **train_opt['optim'])
File "/home/bilekig/Documents/Projects/IQA-PyTorch/pyiqa/models/base_model.py", line 122, in get_optimizer
optimizer = optim_class(params, lr, **kwargs)
File "/home/bilekig/.virtualenvs/test_pyiqa/lib/python3.8/site-packages/torch/optim/sgd.py", line 109, in __init__
super(SGD, self).__init__(params, defaults)
File "/home/bilekig/.virtualenvs/test_pyiqa/lib/python3.8/site-packages/torch/optim/optimizer.py", line 61, in __init__
raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list