lucidrains Goto Github PK
Name: Phil Wang
Type: User
Bio: Working with Attention. It's all we need
Location: San Francisco
Blog: lucidrains.github.io
Deep learning operations reinvented (for pytorch, tensorflow, jax and others)
Implementation of some personal helper functions for Einops, my favorite tensor manipulation library ❤️
A simple and working implementation of Electra, the fastest way to pretrain language models from scratch, in Pytorch
A simple way to keep track of an Exponential Moving Average (EMA) version of your pytorch model
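The EMA tracker above follows the standard update rule: each shadow parameter is blended toward the live parameter by a decay factor. A minimal pure-Python sketch of that rule, with parameters modeled as a dict of floats (the class name and API here are illustrative, not the repo's actual interface):

```python
class EMA:
    """Track an exponential moving average of a dict of parameters."""

    def __init__(self, params, decay=0.99):
        self.decay = decay
        # the shadow copy starts as a snapshot of the current parameters
        self.shadow = dict(params)

    def update(self, params):
        # per-parameter update: shadow = decay * shadow + (1 - decay) * current
        for name, value in params.items():
            self.shadow[name] = self.decay * self.shadow[name] + (1 - self.decay) * value
        return self.shadow


ema = EMA({"w": 0.0}, decay=0.9)
ema.update({"w": 1.0})  # shadow["w"] is now 0.9 * 0.0 + 0.1 * 1.0 = 0.1
```

In practice the same loop runs over a model's `state_dict()` after each optimizer step, and the shadow weights are swapped in for evaluation.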
Implementation of E(n)-Transformer, which incorporates attention mechanisms into Welling's E(n)-Equivariant Graph Neural Network
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
The full training script for Enformer - Tensorflow Sonnet
Implementation of Denoising Diffusion for protein design, but using the new Equiformer (successor to SE3 Transformers) with some additional improvements
Implementation of the Equiformer, SE3/E3 equivariant attention network that reaches new SOTA, and adopted for use by EquiFold for protein folding
Callable PyTrees and filtered JIT/grad transformations => neural networks in JAX.
Usable implementation of Emerging Symbol Binding Network (ESBN), in Pytorch
An attempt to merge ESBN with Transformers, to endow Transformers with the ability to emergently bind symbols
Implementation of ETSformer, state of the art time-series Transformer, in Pytorch
Implementation of Fast Transformer in Pytorch
Implementation of Feedback Transformer in Pytorch
FFCV: Fast Forward Computer Vision (and other ML workloads!)
Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch
Fast and memory-efficient exact attention
Implementation of Flash Attention in Jax
Implementation of fused cosine similarity attention in the same style as Flash Attention
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
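The attention repos above (Flash Attention, the fused cosine-similarity variant, and the linear-time Transformer) all build on the same core operation. A pure-Python reference sketch of plain scaled dot-product attention, for orientation only — not the fused or memory-efficient kernels those projects implement:

```python
import math

def attention(q, k, v):
    # scaled dot-product attention over lists of vectors:
    # scores = q . k / sqrt(d), weights = softmax(scores), out = weights . v
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        # numerically stable softmax over the scores
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # weighted sum of the value vectors
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out


# a zero query scores all keys equally, so the output averages the values
attention([[0.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]], [[2.0, 0.0], [0.0, 2.0]])
# -> [[1.0, 1.0]]
```

The memory-efficient implementations compute the same result while tiling the softmax so the full scores matrix is never materialized.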
Implementation of the video diffusion model and training scheme presented in the paper, Flexible Diffusion Modeling of Long Videos, in Pytorch
GPT, but made only out of MLPs
Implementation of gMLP, an all-MLP replacement for Transformers, in Pytorch
Implementation of Gated State Spaces, from the paper "Long Range Language Modeling via Gated State Spaces", in Pytorch
Implementation of Geometric Vector Perceptron, a simple circuit for 3d rotation equivariance for learning over large biomolecules, in Pytorch. Idea proposed and accepted at ICLR 2021
Implementation of GigaGAN, new SOTA GAN out of Adobe. Culmination of nearly a decade of research into GANs
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
An attempt at the implementation of Glom, Geoffrey Hinton's new idea that integrates concepts from neural fields, top-down-bottom-up processing, and attention (consensus between columns), for emergent part-whole hierarchies from data
Implementation of Graph Transformer in Pytorch, for potential use in replicating Alphafold2