Topic: bart (GitHub)
A list of GitHub repositories tagged with the "bart" topic, most of them related to the BART sequence-to-sequence Transformer model.
bart,Abstractive and Extractive Text summarization using Transformers.
User: aj-naik
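The entry above covers both abstractive and extractive summarization. As background for the extractive side, here is a minimal frequency-based sentence scorer in plain Python; this is a generic sketch of the technique, not code from the listed repository:

```python
import re
from collections import Counter

def extractive_summary(text: str, num_sentences: int = 2) -> str:
    """Toy extractive summarizer: score each sentence by the
    corpus frequency of its words, keep the top-scoring sentences,
    and return them in their original document order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scored = sorted(
        range(len(sentences)),
        key=lambda i: sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
        reverse=True,
    )
    keep = sorted(scored[:num_sentences])  # restore document order
    return " ".join(sentences[i] for i in keep)
```

Real systems (including the repositories listed here) use pre-trained models such as BART rather than raw word counts, but the select-and-reorder structure is the same.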
bart,Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
Organization: amazon-science
bart,Code for Findings of EMNLP 2022 short paper "CDGP: Automatic Cloze Distractor Generation based on Pre-trained Language Model".
User: andychiangsh
Home Page: https://cdgp-demo.nlpnchu.org/
bart,Multilingual/multidomain question generation datasets, models, and python library for question generation.
User: asahi417
Home Page: https://www.autoqg.net
bart,JAX implementation of the bart-base model
User: ayaka14732
Home Page: https://arxiv.org/abs/1910.13461
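The linked arXiv page (1910.13461) is the original BART paper, whose pre-training corrupts text with noise such as text infilling and trains the model to reconstruct it. Purely as an illustrative sketch of that corruption step (not code from the JAX repository; span lengths here are drawn from an exponential rather than the paper's Poisson distribution):

```python
import random

def text_infilling(tokens, mask_token="<mask>", mask_ratio=0.3, mean_span=3, seed=0):
    """Sketch of BART-style text infilling: replace random spans of
    tokens with a single mask token each, corrupting roughly
    mask_ratio of the sequence."""
    rng = random.Random(seed)
    out, i, n = [], 0, len(tokens)
    budget = int(n * mask_ratio)  # roughly how many tokens to corrupt
    while i < n:
        if budget > 0 and rng.random() < mask_ratio:
            span = max(1, min(budget, int(rng.expovariate(1 / mean_span))))
            out.append(mask_token)  # the whole span becomes one mask token
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out
```

The decoder is then trained to emit the original, uncorrupted sequence, which is what makes BART effective for generation tasks like summarization and translation.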
bart,An English-to-Cantonese machine translation model
User: ayaka14732
bart,TrAVis: Visualise BERT attention in your browser
User: ayaka14732
Home Page: https://ayaka14732.github.io/TrAVis/
bart,A project to improve the quality and accuracy of Vietnamese text.
User: bmd1905
bart,LightSeq: A High Performance Library for Sequence Processing and Generation
Organization: bytedance
bart,Script to pre-train Hugging Face Transformers BART with TensorFlow 2
User: cosmoquester
bart,Implementation and helper scripts for the BART-TL model - https://www.aclweb.org/anthology/2021.eacl-main.121/
User: cristianviorelpopa
bart,Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Organization: dbiir
Home Page: https://github.com/dbiir/UER-py/wiki
bart,MinT: Minimal Transformer Library and Tutorials
User: dpressel
bart,A general purpose Gaussian process regression module
User: gattocrucco
Home Page: https://gattocrucco.github.io/lsqfitgp/docs
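lsqfitgp above is a general-purpose Gaussian process regression module. As background (a generic NumPy sketch of GP prediction, not lsqfitgp's actual API), posterior inference with a squared-exponential kernel reduces to a few linear-algebra steps:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D point sets."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2, length=1.0):
    """Posterior mean and variance of a zero-mean GP with RBF kernel."""
    K = rbf(x_train, x_train, length) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train, length)
    K_ss = rbf(x_test, x_test, length)
    alpha = np.linalg.solve(K, y_train)   # K^{-1} y
    mean = K_s @ alpha                    # posterior mean
    v = np.linalg.solve(K, K_s.T)
    var = np.diag(K_ss - K_s @ v)         # posterior variance (diagonal)
    return mean, var
```

A production library adds numerically stable Cholesky solves, hyperparameter fitting, and richer kernel algebra on top of this core.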
bart,Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
User: guillaume-be
Home Page: https://docs.rs/crate/rust-bert
bart,A tool to automatically summarize documents abstractively using the BART or PreSumm Machine Learning Model.
User: hhousen
Home Page: https://haydenhousen.com/projects/docsum/
bart,Fine-tuning BART on COVID Dialogue Dataset
User: huangxt39
bart,[KBS] PCAE: A Framework of Plug-in Conditional Auto-Encoder for Controllable Text Generation PyTorch Implementation
User: imkett
bart,The first-ever vast natural language generation benchmark for Indonesian, Sundanese, and Javanese. We provide multiple downstream tasks, pre-trained IndoGPT and IndoBART models, and a starter code! (EMNLP 2021)
Organization: indonlp
bart,Automated Categorization: Utilizing the power of neural networks, this project offers an automated solution to categorize bank descriptions, reducing manual effort and enhancing efficiency while maintaining privacy.
User: j-convey
bart,Code for the paper "NASH: A Simple Unified Framework of Structured Pruning for Accelerating Encoder-Decoder Language Models" (EMNLP 2023 Findings)
User: jongwooko
Home Page: https://arxiv.org/abs/2310.10054
bart,Code for EMNLP 2021 paper "Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization"
User: junpliu
bart,Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
User: laihuiyuan
Home Page: https://arxiv.org/abs/2105.06947
bart,Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
User: mikewangwzhl
bart,Extract molecular SMILES embeddings from language models pre-trained with various objectives and architectures.
Organization: moleculetransformers
bart,Cybertron: the home planet of the Transformers in Go
Organization: nlpodyssey
bart,Self-contained Machine Learning and Natural Language Processing library in Go
Organization: nlpodyssey
bart,Abstractive text summarization by fine-tuning seq2seq models.
User: nsi319
bart,Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
User: p208p2002
Home Page: https://huggingface.co/p208p2002/bart-squad-qg-hl
bart,Source code and dataset for "Call for Customized Conversation: Customized Conversation Grounding Persona and Knowledge"
User: pkchat-focus
bart,An application that summarizes and classifies long text using zero-shot transfer-learning hosted on Huggingface spaces.
User: pleonova
Home Page: https://huggingface.co/spaces/pleonova/multi-label-summary-text
bart,Point-and-click bartCause analysis and causal inference education
Organization: priism-center
Home Page: https://apsta.shinyapps.io/thinkCausal/
bart,Code for our paper "Summaformers @ LaySumm 20, LongSumm 20" at the EMNLP 2020 Scholarly Document Processing Workshop
User: sayarghoshroy
bart,KorQuAD Korean domain Question Generation module based on KoBART
User: seoneun
bart,TextGen: Implementation of text generation models, including LLaMA, BLOOM, GPT2, BART, T5, SongNet, and more. Provides ready-to-use training and inference for LLaMA, ChatGLM, BLOOM, GPT2, Seq2Seq, BART, T5, UDA, and other models.
User: shibing624
bart,PyTorch implementation of baseline models for KQA Pro, a large-scale dataset for complex question answering over knowledge bases.
User: shijx12
Home Page: http://thukeg.gitee.io/kqa-pro/
bart,Comparing state-of-the-art models for text summary generation
User: singhsidhukuldeep
bart,Build and train state-of-the-art natural language processing models using BERT
User: sudharsan13296
Home Page: https://www.amazon.com/gp/product/B08LLDF377/ref=dbs_a_def_rwt_bibl_vppi_i5
bart,Train on your own dataset with the GENIUS text generation model.
User: taishan1994
bart,NAACL 2021 - Progressive Generation of Long Text
User: tanyuqian
bart,Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Organization: tencent
Home Page: https://github.com/Tencent/TencentPretrain/wiki
bart,Code for our paper "JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs" (ACL 2021 Findings)
Organization: thu-coai
bart,The French summarization dataset introduced in "BARThez: a Skilled Pretrained French Sequence-to-Sequence Model".
User: tixierae
Home Page: https://arxiv.org/pdf/2010.12321.pdf
bart,Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
User: varunkumar-dev
bart,BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese (INTERSPEECH 2022)
Organization: vinairesearch
bart,Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
User: vipulraheja