dumpmemory
Type: User
Pruning channels for model acceleration
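A common channel-pruning criterion (a minimal, framework-free sketch, not necessarily this repository's method) ranks a layer's output channels by the L1 norm of their weights and drops the lowest-ranked ones; the function names here are illustrative:

```python
def l1_channel_scores(weights):
    """Score each output channel by the L1 norm of its weights.

    `weights` is a list of channels, each channel a flat list of floats
    (e.g. a conv weight tensor flattened per output channel).
    """
    return [sum(abs(w) for w in channel) for channel in weights]

def prune_channels(weights, keep_ratio=0.5):
    """Keep the top `keep_ratio` fraction of channels by L1 norm."""
    scores = l1_channel_scores(weights)
    n_keep = max(1, int(len(weights) * keep_ratio))
    # Indices of the highest-scoring channels, restored to original order.
    keep = sorted(sorted(range(len(weights)), key=lambda i: -scores[i])[:n_keep])
    return [weights[i] for i in keep], keep

weights = [[0.1, -0.1], [2.0, 1.0], [0.0, 0.05], [-1.5, 0.5]]
pruned, kept = prune_channels(weights, keep_ratio=0.5)
print(kept)  # -> [1, 3]: the two channels with the largest L1 norms
```

Real pruning pipelines then rewire the following layer's input channels to match and usually fine-tune to recover accuracy.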
A Python-level JIT compiler designed to make unmodified PyTorch programs faster.
A LARS implementation in PyTorch
Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
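The idea behind runtime shape checking can be sketched without any library: a decorator compares a declared shape against the actual one, with `None` as a wildcard dimension (a toy illustration, not this project's API):

```python
def check_shape(expected):
    """Decorator: assert the returned nested list has shape `expected`.

    `None` in `expected` means "any size" for that dimension.
    """
    def shape_of(x):
        dims = []
        while isinstance(x, list):
            dims.append(len(x))
            x = x[0] if x else None
        return tuple(dims)

    def decorator(fn):
        def wrapper(*args, **kwargs):
            out = fn(*args, **kwargs)
            actual = shape_of(out)
            ok = len(actual) == len(expected) and all(
                e is None or e == a for e, a in zip(expected, actual)
            )
            if not ok:
                raise TypeError(f"expected shape {expected}, got {actual}")
            return out
        return wrapper
    return decorator

@check_shape((None, 2))  # any number of rows, each of width 2
def make_pairs(n):
    return [[i, i * i] for i in range(n)]

print(make_pairs(3))  # -> [[0, 0], [1, 1], [2, 4]]
```

Libraries in this space attach the expected shape (and dtype, dimension names, etc.) to the type annotation itself and hook into a type checker, but the runtime check amounts to the same comparison.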
Asynchronous kernels, as fast as the wind!
Full-featured BitTorrent client package and utilities
⚡️ A magnet link scraper for streaming videos (movies, tv shows, anime, porn) along with subtitles.
Source code for NAACL 2021 paper "TR-BERT: Dynamic Token Reduction for Accelerating BERT Inference"
🎈 Updated daily! A list of popular BitTorrent trackers!
A framework for automated, strategy-driven cryptocurrency exchange trading
Benchmark for network quality and proxy services
Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations
Easy and fast file sharing from the command-line.
Deploy optimized transformer-based models in production
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer based networks.
Recent Transformer-based CV and related works.
Official implementation of Long-Short Transformer in PyTorch.
Official PyTorch implementation for Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Including examples for DETR, VQA.
Handwritten text recognition using transformers.
The code for two papers: Feedback Transformer and Expire-Span.
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Train 🤗transformers with DeepSpeed: ZeRO-2, ZeRO-3
A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
Generic sorted map for Go with red-black tree under the hood
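A sorted map keeps its keys ordered, so iteration and range queries come out sorted; a red-black tree gives O(log n) inserts and lookups. The same interface can be sketched in a few lines of Python with the standard-library `bisect` module (O(n) inserts, unlike the tree, and purely illustrative of the interface, not this Go library's API):

```python
import bisect

class SortedMap:
    """Minimal sorted map: a sorted key list plus a dict for values."""

    def __init__(self):
        self._keys = []
        self._values = {}

    def put(self, key, value):
        if key not in self._values:
            bisect.insort(self._keys, key)  # keep keys sorted on insert
        self._values[key] = value

    def get(self, key, default=None):
        return self._values.get(key, default)

    def items(self):
        """Return (key, value) pairs in ascending key order."""
        return [(k, self._values[k]) for k in self._keys]

m = SortedMap()
for k, v in [("pear", 3), ("apple", 1), ("mango", 2)]:
    m.put(k, v)
print(m.items())  # -> [('apple', 1), ('mango', 2), ('pear', 3)]
```

A balanced tree earns its complexity once maps grow large or see interleaved inserts and ordered scans; for small maps the array-based version is often faster in practice.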
Training Language Models with Memory Augmentation https://arxiv.org/abs/2205.12674
Development repository for the Triton language and compiler
Implementation of a Transformer, but completely in Triton