rgqa's Issues
How is the conflict with transformers resolved?
The requirements for sentence-transformers state: "We recommend Python 3.6 or higher, PyTorch 1.6.0 or higher and transformers v4.6.0 or higher." How is the conflict with the transformers version pinned in this repo's requirements resolved, please?
Excuse me, I want to know why we need to add three new tokens: [demo], [tgr], and [sep_arg]. What will happen if I remove them?
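Not the repo's actual code, but a toy greedy tokenizer (`toy_tokenize`, `base_vocab` are made-up names) sketching why markers like these are typically added to the vocabulary: without a dedicated entry, a subword tokenizer shatters a marker into pieces, so the model never sees "[tgr]" as one atomic boundary symbol.

```python
# Toy illustration (not RGQA's actual code): why marker tokens such as
# [demo], [tgr] and [sep_arg] need their own vocabulary entries.

def toy_tokenize(text, vocab):
    """Greedy longest-match tokenizer over a toy vocabulary."""
    tokens, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:  # take the longest piece in the vocab
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # character fallback
            i += 1
    return tokens

base_vocab = {"[", "]", "tg", "r", " ", "attack"}
print(toy_tokenize("[tgr] attack", base_vocab))
# → ['[', 'tg', 'r', ']', ' ', 'attack']  (the marker shatters)

extended_vocab = base_vocab | {"[demo]", "[tgr]", "[sep_arg]"}
print(toy_tokenize("[tgr] attack", extended_vocab))
# → ['[tgr]', ' ', 'attack']  (the marker stays atomic)
```

With a real Hugging Face tokenizer the same effect is achieved via `tokenizer.add_tokens([...])` followed by `model.resize_token_embeddings(len(tokenizer))`; removing the tokens would leave the markers fragmented into subwords and make them much harder for the model to use as structural cues.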
TypeError: prepare_inputs_for_generation() missing 1 required positional argument: 'past'
When running the test step, I get the following error:
```
Traceback (most recent call last):
  File "train.py", line 220, in <module>
    main()
  File "train.py", line 211, in main
    trainer.test(model, datamodule=dm)  # also loads training dataloader
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 718, in test
    results = self.__test_given_model(model, test_dataloaders)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 783, in __test_given_model
    results = self.fit(model)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 444, in fit
    results = self.accelerator_backend.train()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 63, in train
    results = self.train_or_test()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/accelerator.py", line 72, in train_or_test
    results = self.trainer.run_test()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 627, in run_test
    eval_loop_results, _ = self.run_evaluation(test_mode=True)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 578, in run_evaluation
    output = self.evaluation_loop.evaluation_step(test_mode, batch, batch_idx, dataloader_idx)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/evaluation_loop.py", line 169, in evaluation_step
    output = self.trainer.accelerator_backend.test_step(args)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 103, in test_step
    output = self.__test_step(args)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 111, in __test_step
    output = self.trainer.model.test_step(*args)
  File "/root/autodl-tmp/RGQA-master/src/genie/model.py", line 114, in test_step
    sample_output = self.model.generate(batch['input_token_ids'], do_sample=False,
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
    return func(*args, **kwargs)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/transformers/generation_utils.py", line 970, in generate
    return self.greedy_search(
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/transformers/generation_utils.py", line 1270, in greedy_search
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
TypeError: prepare_inputs_for_generation() missing 1 required positional argument: 'past'
```
Is it the transformers version? The repo's requirements pin 3.1.0, but sentence-transformers needs at least 4.6.0. What is the problem?
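A likely cause (an assumption, not confirmed by the repo) is exactly that version mismatch: an override of `prepare_inputs_for_generation` written against transformers 3.x takes the decoder cache as a required positional `past` argument, while newer 4.x releases invoke the hook with the keyword `past_key_values`, leaving `past` unfilled. The toy classes below (`OldStyleModel` and `FixedModel` are hypothetical names, not the repo's classes) reproduce the mismatch and sketch a defensive signature that accepts both conventions:

```python
# Toy reproduction of the TypeError (hypothetical classes, not RGQA's code).

class OldStyleModel:
    # Signature written against transformers 3.x: 'past' is positional.
    def prepare_inputs_for_generation(self, input_ids, past, **kwargs):
        return {"input_ids": input_ids, "past_key_values": past}

try:
    # Newer transformers calls the hook with past_key_values=..., so the
    # required positional 'past' is never supplied.
    OldStyleModel().prepare_inputs_for_generation([0], past_key_values=None)
except TypeError as e:
    print(e)  # message ends with: missing 1 required positional argument: 'past'

class FixedModel:
    # Defensive signature: default 'past' and also accept 'past_key_values',
    # so both the old and new calling conventions work.
    def prepare_inputs_for_generation(self, input_ids, past=None,
                                      past_key_values=None, **kwargs):
        cache = past if past is not None else past_key_values
        return {"input_ids": input_ids, "past_key_values": cache}

out = FixedModel().prepare_inputs_for_generation([0], past_key_values="cache")
print(out["past_key_values"])  # → cache
```

The simpler fix, of course, is to install the transformers version the repo pins (3.1.0) rather than the 4.6.0+ that sentence-transformers recommends.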
Excuse me, I have some questions about the demonstration store.
I have recently become interested in the few-shot EAE task, and thank you for your contribution. However, the paper only briefly describes the demonstration store. Reading the source code, and taking few-shot EAE on ACE as an example, you use the sampled ACE dataset as the training file and the full ACE dataset as the demonstration store, correct? So if I don't have a full demonstration store in the same domain, does that mean I have to use a dataset from another domain, such as Wiki or RAMS, as the demonstration store to run transfer learning on the few-shot EAE task? I also wonder what "ir" means; its description is "if augment ir result".