An easy-to-use framework for BERT models, with trainers, various NLP tasks, and detailed annotations.
You can implement various NLP tasks conveniently with built-in features such as adversarial training, fp16, gradient clipping, R-Drop, and early stopping.
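To illustrate two of the techniques listed above, here is a minimal, plain-Python sketch of gradient clipping by global norm and an early-stopping check. This is not FantasyBert's actual API; the function and class names (`clip_grad_norm`, `EarlyStopping`) are hypothetical and only show the underlying logic.

```python
import math

def clip_grad_norm(grads, max_norm):
    """Scale a list of gradient values so that their global L2 norm
    does not exceed max_norm (illustrative plain-Python version;
    in practice use torch.nn.utils.clip_grad_norm_)."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads

class EarlyStopping:
    """Signal a stop when the monitored metric (higher is better)
    has not improved for `patience` consecutive evaluations."""
    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = None
        self.bad_rounds = 0

    def step(self, metric):
        """Record one evaluation; return True if training should stop."""
        if self.best is None or metric > self.best + self.min_delta:
            self.best = metric
            self.bad_rounds = 0
        else:
            self.bad_rounds += 1
        return self.bad_rounds >= self.patience
```

In a real trainer these checks run once per evaluation step, with the gradient clipping applied between the backward pass and the optimizer update.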
pip install fantasybert
The latest version is 0.1.3.
Semantic text similarity
The datasets are downloaded from CLUE.
| Model | TNEWS | IFLYTEK | AFQMC | CMNLI | CMRC | CLUENER |
|---|---|---|---|---|---|---|
| BERT_pub | 56.84 | 59.43 | 74.07 | 80.42 | 73.95 | 78.82 |
| BERT_our | 56.63 | .. | 72.75 | .. | .. | 79.669 |
For detailed results, please see here.
BERT_pub denotes the public results reported by CLUE; BERT_our denotes the results obtained with FantasyBert. Both use the chinese-bert-wwm-ext pretrained model, and the test-set predictions are evaluated on CLUE; more public CLUE results are here.
Some code is adapted from transformers (tokenization) and fastNLP (trainer); I simplified it and added some new functions.
The models part directly uses the pretrained models from transformers. I tried writing BERT models as in bert4pytorch, but due to time limits and my limited ability, they failed to match the quality and efficiency of transformers.
This project is not for commercial use; it is for private use only.