Agent Attention - Pytorch

Implementation of Agent Attention in Pytorch.

This work appears to be an elegant simplification of the ISAB architecture from the Set Transformers paper, requiring only one attention block rather than two. While ISAB works, I have found it to be a bit unstable, so I am curious whether the simplification in this work resolves that issue.
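For intuition, here is a minimal single-head sketch of the core mechanism (my own paraphrase, not the repository's exact code): the agent tokens first attend to the keys to summarize the values, then the queries attend to the agents to read that summary back, giving cost linear in sequence length. The paper additionally adds a depthwise-convolution bias on the values, omitted here.

import torch

def agent_attention(q, k, v, agents):
    # q, k, v: (batch, seq, dim); agents: (batch, num_agents, dim)
    # two softmax attentions, each linear in the sequence length
    scale = q.shape[-1] ** -0.5

    # 1. agent aggregation: agents attend to the keys, summarizing the values
    agent_attn = (agents @ k.transpose(-2, -1) * scale).softmax(dim = -1)   # (b, m, n)
    agent_values = agent_attn @ v                                           # (b, m, d)

    # 2. agent broadcast: queries attend to the agents to read the summary back
    query_attn = (q @ agents.transpose(-2, -1) * scale).softmax(dim = -1)   # (b, n, m)
    return query_attn @ agent_values                                        # (b, n, d)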

This repository adds support for variable sequence lengths (masking), as demonstrated in the usage examples below, and post-softmax talking heads (sketched next).
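Post-softmax talking heads (Shazeer et al.) mixes the attention weights across heads with a small learned matrix after the softmax. A rough illustration, assuming attention weights of shape (batch, heads, queries, keys); the identity initialization is an assumption on my part, not necessarily what this repository does:

import torch

b, h, i, j = 2, 8, 1024, 1024
attn = torch.randn(b, h, i, j).softmax(dim = -1)   # post-softmax attention weights

# hypothetical learned head-mixing matrix, initialized to the identity
# so the layer starts out as standard multi-head attention
mix = torch.nn.Parameter(torch.eye(h))

# mix information across the head dimension (h -> g)
attn = torch.einsum('b h i j, h g -> b g i j', attn, mix)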

Install

$ pip install agent-attention-pytorch

Usage

import torch
from agent_attention_pytorch import AgentSelfAttention

attn = AgentSelfAttention(
    dim = 512,
    num_agent_tokens = 256,       # number of "agent" tokens
    dim_head = 64,                # attention head dimension
    heads = 8                     # number of heads
)

x = torch.randn(2, 65536, 512)        # (batch, seq, dim)
mask = torch.ones(2, 65536).bool()    # True marks positions to attend to

out = attn(x, mask = mask)

assert out.shape == x.shape
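For genuinely variable-length batches, one way to build the mask from per-example lengths (the lengths here are hypothetical, and True-means-valid is the convention assumed above):

import torch

lengths = torch.tensor([65536, 40000])                    # hypothetical true lengths
mask = torch.arange(65536)[None, :] < lengths[:, None]    # (2, 65536), True where valid

out = attn(x, mask = mask)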

For a full-fledged linear transformer based on agent tokens, just import AgentTransformer:

import torch
from agent_attention_pytorch import AgentTransformer

transformer = AgentTransformer(
    dim = 512,
    depth = 6,
    num_agent_tokens = 128,
    dim_head = 64,
    heads = 8
)

x = torch.randn(2, 65536, 512)
mask = torch.ones(2, 65536).bool()

out, agent_tokens = transformer(x, mask = mask)

# (2, 65536, 512), (2, 128, 512)
assert out.shape == x.shape
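The returned agent tokens have attended over the entire sequence, so they can serve as a compact learned summary of it. A hypothetical downstream use (the pooling and classifier here are illustrative, not part of the library):

import torch

pooled = agent_tokens.mean(dim = 1)    # (2, 512), mean-pool the agent summaries
classifier = torch.nn.Linear(512, 10)  # hypothetical 10-class head
logits = classifier(pooled)            # (2, 10)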

Citations

@inproceedings{Han2023AgentAO,
    title   = {Agent Attention: On the Integration of Softmax and Linear Attention},
    author  = {Dongchen Han and Tianzhu Ye and Yizeng Han and Zhuofan Xia and Shiji Song and Gao Huang},
    year    = {2023},
    url     = {https://api.semanticscholar.org/CorpusID:266210414}
}
@misc{shazeer2020talkingheads,
    title   = {Talking-Heads Attention}, 
    author  = {Noam Shazeer and Zhenzhong Lan and Youlong Cheng and Nan Ding and Le Hou},
    year    = {2020},
    eprint  = {2003.02436},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}
@article{Bondarenko2023QuantizableTR,
    title   = {Quantizable Transformers: Removing Outliers by Helping Attention Heads Do Nothing},
    author  = {Yelysei Bondarenko and Markus Nagel and Tijmen Blankevoort},
    journal = {ArXiv},
    year    = {2023},
    volume  = {abs/2306.12929},
    url     = {https://api.semanticscholar.org/CorpusID:259224568}
}
@article{Wang2022FoundationT,
    title   = {Foundation Transformers},
    author  = {Hongyu Wang and Shuming Ma and Shaohan Huang and Li Dong and Wenhui Wang and Zhiliang Peng and Yu Wu and Payal Bajaj and Saksham Singhal and Alon Benhaim and Barun Patra and Zhun Liu and Vishrav Chaudhary and Xia Song and Furu Wei},
    journal = {ArXiv},
    year    = {2022},
    volume  = {abs/2210.06423},
    url     = {https://api.semanticscholar.org/CorpusID:252846241}
}


Issues

ISAB performance comparisons

Hello,

In the description of this repo, you mention that you hope agent attention is more stable than ISAB. Have you done any experiments to test whether or not that's the case?
