lucidrains Goto Github PK
Name: Phil Wang
Type: User
Bio: Working with Attention. It's all we need
Location: San Francisco
Blog: lucidrains.github.io
An implementation of Phasic Policy Gradient, a proposed improvement over Proximal Policy Optimization (PPO), in Pytorch
Implementation of Phenaki Video, which uses MaskGIT to produce text-guided videos of up to 2 minutes in length, in Pytorch
Implementation of π-GAN, for 3d-aware image synthesis, in Pytorch
Implementation of Pixel-level Contrastive Learning, proposed in the paper "Propagate Yourself", in Pytorch
Implementation of the Point Transformer layer, in Pytorch
Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper
Standalone Product Key Memory module in Pytorch - for augmenting Transformer models
Implementation and replication of ProGen, Language Modeling for Protein Generation, in Jax
Implementation of ProteinBERT in Pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
torch-optimizer -- a collection of optimizers for Pytorch
RaveForce - An OpenAI Gym style toolkit for music generation experiments.
Deep Recurrent Neural Networks and LSTMs in JavaScript. More generally, also arbitrary expression graphs with automatic differentiation.
Reformer, the efficient Transformer, in Pytorch
Implementation of a Transformer using ReLA (Rectified Linear Attention) from https://arxiv.org/abs/2104.07012
Implementation of the Remixer Block from the Remixer paper, in Pytorch
Implementation of ResMLP, an all MLP solution to image classification, in Pytorch
The correct way to resize images or tensors. For Numpy or Pytorch (differentiable).
Implementation of Retrieval-Augmented Denoising Diffusion Probabilistic Models in Pytorch
Implementation of RETRO, DeepMind's retrieval-based attention net, in Pytorch
Framework for creating (partially) reversible neural networks with PyTorch
Replication attempt for the Protein Folding Model described in https://www.biorxiv.org/content/10.1101/2021.08.02.454840v1
RITA is a family of autoregressive protein models, developed by LightOn in collaboration with the OATML group at Oxford and the Debora Marks Lab at Harvard.
Implementation of Rotary Embeddings, from the Roformer paper, in Pytorch
Fully featured implementation of Routing Transformer
Implementation of RQ Transformer, proposed in the paper "Autoregressive Image Generation using Residual Quantization"
Implementation of Scattering Compositional Learner in Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Implementation of Segformer, Attention + MLP neural network for segmentation, in Pytorch
Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
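Several entries above (the Roformer / rotary-embedding repo in particular) center on rotary position embeddings. As a rough illustration of the idea, here is a minimal NumPy sketch of RoPE: pairs of feature dimensions are rotated by position-dependent angles with geometrically spaced frequencies. This is an assumption-laden simplification for reading purposes, not the repo's actual PyTorch API.

```python
# Minimal sketch of rotary position embeddings (RoPE), the scheme from the
# RoFormer paper implemented by the rotary-embedding repo listed above.
# Illustrative NumPy only; the real repo is a Pytorch module.
import numpy as np

def rotary_embed(x, base=10000.0):
    """Rotate feature-dimension pairs of x by position-dependent angles.

    x: array of shape (seq_len, dim), dim even.
    """
    seq_len, dim = x.shape
    half = dim // 2
    # Frequencies decay geometrically across feature pairs (as in RoFormer).
    inv_freq = base ** (-np.arange(half) / half)
    angles = np.outer(np.arange(seq_len), inv_freq)   # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # Standard 2D rotation applied to each (x1, x2) pair.
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)
```

Because each pair is rotated by an orthogonal matrix, norms are preserved and position 0 (zero angle) is left unchanged, which is why RoPE composes cleanly with attention dot products.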