Topic: attention
A collection of GitHub repositories related to attention mechanisms in deep learning.
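The entries below implement many variants of the attention mechanism. As orientation, here is a minimal sketch of scaled dot-product attention (Vaswani et al., 2017), the core operation behind most of these repositories, assuming NumPy and single-batch 2-D inputs:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention (Vaswani et al., 2017).

    q, k: (seq_q, d_k) / (seq_k, d_k); v: (seq_k, d_v).
    Returns the attended values and the attention weights.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # subtract max for stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v, weights
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query attends to each key.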
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
User: anicolson
A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4 with PASCAL VOC and COCO.
User: argusswift
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
User: bentrevett
Chinese entity relation extraction in PyTorch, using BiLSTM + attention.
User: buppt
Julia implementation of Transformer models.
User: chengchingwen
🚀🚀🚀 A collection of awesome public YOLO object detection projects.
User: codingonion
Machine learning, in NumPy.
User: ddbourgin
Home Page: https://numpy-ml.readthedocs.io/
Generative Adversarial Transformers.
User: dorarad
Seq2Seq, BERT, Transformer, and WaveNet for time series prediction.
User: evilpsycho
Residual Attention Network for image classification.
User: fwang91
Long Range Arena for benchmarking efficient Transformers.
Organization: google-research
Scenic: A JAX library for computer vision research and beyond.
Organization: google-research
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
User: gordicaleksa
Home Page: https://youtube.com/c/TheAIEpiphany
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
User: gordicaleksa
Home Page: https://youtube.com/c/TheAIEpiphany
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Organization: graphdeeplearning
Home Page: https://arxiv.org/abs/2012.09699
Natural Language Processing tutorial for deep learning researchers.
User: graykode
Multi-label text classification: multi-label, classifier, text classification, BERT, seq2seq, attention, multi-label-classification.
User: hellonlp
End-to-end ASR/LM implementation with PyTorch.
User: hirofumi0810
Code for our CVPR 2021 paper on coordinate attention.
User: houqb
Transformer: PyTorch implementation of "Attention Is All You Need".
User: hyunwoongko
TensorFlow implementation of attention mechanisms for text classification tasks.
User: ilivans
🔥🔥🔥 Focused on improved YOLOv5, YOLOv7, YOLOv8, and YOLOv9 models; supports improving the backbone, neck, head, loss, IoU, NMS, and other modules 🚀
User: iscyy
Home Page: https://github.com/iscyy/yoloair
A PyTorch implementation of the Transformer model in "Attention Is All You Need".
User: jadore801120
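The Transformer implementations listed here build multi-head attention by running scaled dot-product attention over several projected subspaces in parallel. A minimal single-batch sketch, assuming NumPy and hypothetical weight matrices `w_q`, `w_k`, `w_v`, `w_o` (not from any specific repository), is:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # stabilized exponentials
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, n_heads):
    """Single-batch multi-head self-attention.

    x: (seq, d_model); each weight matrix: (d_model, d_model).
    """
    seq, d_model = x.shape
    d_head = d_model // n_heads

    def split(t):  # (seq, d_model) -> (n_heads, seq, d_head)
        return t.reshape(seq, n_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # per-head similarities
    heads = softmax(scores) @ v                          # (n_heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq, d_model)
    return concat @ w_o                                  # final output projection
```

Splitting `d_model` across heads keeps the total compute roughly constant while letting each head attend to a different representation subspace.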
Official PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation", MICCAI 2021.
User: jeya-maria-jose
Bilinear attention networks for visual question answering.
User: jnhwkim
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM, and ACAM based VAD. We also provide our directly recorded dataset.
User: jtkim-kaist
A PyTorch implementation of Speech Transformer, an end-to-end ASR with a Transformer network on Mandarin Chinese.
User: kaituoxu
A Structured Self-attentive Sentence Embedding.
User: kaushalshetty
A TensorFlow implementation of Spatial Transformer Networks.
User: kevinzakka
Reading notes on top-conference papers relevant to NLP algorithm engineers.
User: km1994
Plug-and-play implementation of the attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens".
User: kyegomez
Home Page: https://discord.gg/qUtxnK2NMf
🧑‍🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Organization: labmlai
Home Page: https://nn.labml.ai
Keras implementations of beit, caformer, CMT, CoAtNet, convnext, davit, dino, efficientdet, edgenext, efficientformer, efficientnet, eva, fasternet, fastervit, fastvit, flexivit, gcvit, ghostnet, gpvit, hornet, hiera, iformer, inceptionnext, lcnet, levit, maxvit, mobilevit, moganet, nat, nfnets, pvt, swin, tinynet, tinyvit, uniformer, volo, vanillanet, yolor, yolov7, yolov8, yolox, gpt2, llama2; alias kecam.
User: leondgarse
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute.
User: lucidrains
An implementation of Performer, a linear attention-based Transformer, in PyTorch.
User: lucidrains
Gathers machine learning and TensorFlow deep learning models for NLP problems; 1.13 < TensorFlow < 2.0.
Organization: mesolitica
Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting (NeurIPS 2019).
User: mlpotter
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
User: morvanzhou
Home Page: https://mofanpy.com
A bidirectional recurrent neural network model with an attention mechanism for restoring missing punctuation in unsegmented text.
User: ottokart
Home Page: http://bark.phon.ioc.ee/punctuator
Replications of simple CV projects including attention, classification, detection, keypoint detection, etc.
User: pprp
A list of efficient attention modules.
User: separius
Aspect-Based Sentiment Analysis, PyTorch implementations.
User: songyouwei
PyTorch implementations of various attention mechanisms for deep learning researchers.
User: sooftware
Implementation of the paper "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018).
Organization: stanfordnlp
Improving Convolutional Networks via Attention Transfer (ICLR 2017).
User: szagoruyko
Home Page: https://arxiv.org/abs/1612.03928
Implementations of various self-attention mechanisms focused on computer vision. Ongoing repository.
Organization: the-ai-summer
Home Page: https://theaisummer.com/
Implementations of papers for the text classification task on DBpedia.
User: tobiaslee
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolutions, helpful for further understanding papers. ⭐⭐⭐
User: xmu-xiaoma666
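Attention-mechanism collections like the ones above typically also cover additive (Bahdanau-style) attention, the variant behind the earlier seq2seq and punctuation-restoration entries. A minimal sketch, assuming NumPy and hypothetical learned parameters `w_q`, `w_k`, `v` (not tied to any listed repository), is:

```python
import numpy as np

def additive_attention(query, keys, w_q, w_k, v):
    """Additive (Bahdanau-style) attention over encoder states.

    query: (d_q,) decoder state; keys: (seq, d_k) encoder states.
    w_q: (d_q, d_a), w_k: (d_k, d_a), v: (d_a,) are learned parameters.
    Returns the context vector and attention weights.
    """
    # score_i = v . tanh(W_q q + W_k k_i), scored by a small feed-forward net
    scores = np.tanh(query @ w_q + keys @ w_k) @ v   # (seq,)
    scores -= scores.max()                           # stable softmax
    weights = np.exp(scores)
    weights /= weights.sum()
    context = weights @ keys                         # weighted sum of encoder states
    return context, weights
```

Unlike the dot-product form, the score here comes from a small learned network, so query and key dimensions need not match.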
Implementation of MusicLM, a text-to-music model published by Google Research, with a few modifications.
User: zhvng
Home Page: https://arxiv.org/abs/2301.11325