During the evaluation stage on the development dataset, I am intermittently hitting the error below. Have you ever faced this issue, and if so, how did you resolve it?
Traceback (most recent call last):
  File "coref.py", line 693, in <module>
    trainer.train(150)
  File "coref.py", line 459, in train
    self.train_epoch(epoch, *args, **kwargs)
  File "coref.py", line 490, in train_epoch
    corefs_found, total_corefs, corefs_chosen = self.train_doc(doc)
  File "coref.py", line 523, in train_doc
    spans, probs = self.model(document)
  File "/home/rupimanoj/anaconda3/envs/project/lib/python3.7/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "coref.py", line 424, in forward
    states, embeds = self.encoder(doc)
  File "/home/rupimanoj/anaconda3/envs/project/lib/python3.7/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "coref.py", line 206, in forward
    packed, reorder = pack(embeds)
  File "/home/rupimanoj/coref/coreference-resolution/src/utils.py", line 74, in pack
    packed = pack_sequence(sorted_tensors)
  File "/home/rupimanoj/anaconda3/envs/project/lib/python3.7/site-packages/torch/nn/utils/rnn.py", line 353, in pack_sequence
    return pack_padded_sequence(pad_sequence(sequences), [v.size(0) for v in sequences])
  File "/home/rupimanoj/anaconda3/envs/project/lib/python3.7/site-packages/torch/nn/utils/rnn.py", line 311, in pad_sequence
    max_size = sequences[0].size()
IndexError: list index out of range
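For what it's worth, the traceback shows `pad_sequence` indexing `sequences[0]` on an empty list, so `pack()` in `utils.py` is apparently being called with zero sequences, i.e. a document (or batch) that produced no sentence embeddings. A minimal sketch of a defensive wrapper is below; `safe_pack` is a hypothetical name, and the sort-then-pack body is an assumption about what the repo's `pack()` helper does, not a copy of it:

```python
import torch
from torch.nn.utils.rnn import pack_sequence

def safe_pack(tensors):
    """Guard around a pack()-style helper (hypothetical): drop zero-length
    tensors and fail with a clear message instead of an IndexError deep
    inside pad_sequence when the list is empty."""
    tensors = [t for t in tensors if t.size(0) > 0]
    if not tensors:
        raise ValueError(
            "safe_pack() received no non-empty sequences; "
            "check for empty documents/sentences in the dev set"
        )
    # pack_sequence expects sequences sorted by length, longest first,
    # in older PyTorch versions (and with enforce_sorted=True in newer ones)
    sorted_tensors = sorted(tensors, key=lambda t: t.size(0), reverse=True)
    return pack_sequence(sorted_tensors)
```

If the guard trips during evaluation, that points at an empty document slipping through preprocessing, which would also explain why the error is intermittent (it only occurs for the documents that happen to be empty).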