
Awesome Attention-based GNNs

  • A collection of resources on attention-based graph neural networks.

  • Contributions are welcome: please submit a pull request to add more papers.

Entry format: - [x] [venue] [model] paper_title [[paper]](link) [[code]](link)

Table of Contents

  • Survey
  • GRANs (GRU Attention, LSTM Attention)
  • GATs
  • GraphTransformers
  • Citation

Survey

  • [TKDD2019] [survey] Attention Models in Graphs: A Survey [paper]

GRANs

GRU Attention

  • [ICLR2021] [GR-GAT] Gated Relational Graph Attention Networks [paper]
  • [T-SP2020] [GRNN] Gated Graph Recurrent Neural Networks [paper]
  • [NeurIPS2019] [GRAN] Efficient Graph Generation with Graph Recurrent Attention Networks [paper] [code]
  • [UAI2018] [GaAN] GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs [paper] [code]
  • [ICML2018] [GraphRNN] GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models [paper] [code]
  • [ICLR2016] [GGNN] Gated Graph Sequence Neural Networks [paper] [code] (the gated update is sketched below)
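The common thread in the gated entries above (most directly GGNN and GRNN) is that each node first aggregates messages from its neighbors and then updates its state through a recurrent gating unit instead of a plain sum; the remaining entries differ mainly in how the gate or the aggregation weights are computed (e.g., GaAN gates individual attention heads). Below is a minimal sketch of that update in plain PyTorch, assuming a dense adjacency matrix and a single linear message transform; names and shapes are illustrative rather than taken from any cited codebase.

```python
import torch
import torch.nn as nn

class GatedGraphLayer(nn.Module):
    """One GGNN-style propagation step: aggregate neighbor messages, then a GRU update.

    Minimal sketch, not the reference implementation of any paper listed above.
    """
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.message = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, hidden_dim) node states
        # adj: (num_nodes, num_nodes) dense adjacency matrix
        m = adj @ self.message(h)   # sum of transformed neighbor states
        return self.gru(m, h)       # gated update of each node state

# toy usage: 5 nodes on a ring, 16-dimensional states
n, d = 5, 16
adj = torch.zeros(n, n)
for i in range(n):
    adj[i, (i + 1) % n] = 1.0
    adj[i, (i - 1) % n] = 1.0
h = torch.randn(n, d)
layer = GatedGraphLayer(d)
for _ in range(3):                  # a few propagation steps
    h = layer(h, adj)
print(h.shape)                      # torch.Size([5, 16])
```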

LSTM Attention

  • [AAAI2019] [GeniePath] GeniePath: Graph Neural Networks with Adaptive Receptive Paths [paper]
  • [ICML2018] [JK-Net] Representation Learning on Graphs with Jumping Knowledge Networks [paper] (its LSTM layer-aggregation is sketched after this list)
  • [KDD2018] [GAM] Graph Classification using Structural Attention [paper] [code]
  • [NeurIPS2017] [GraphSage] Inductive representation learning on large graphs [paper] [code]
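Several entries above use an LSTM as the attention machinery itself. JK-Net, for example, lets each node attend over its own representations from different GNN layers, with a bidirectional LSTM producing the per-layer scores. The snippet below is a simplified sketch of that layer-aggregation idea, assuming the per-layer representations are already stacked into a single tensor; it is not the authors' implementation.

```python
import torch
import torch.nn as nn

class LSTMLayerAttention(nn.Module):
    """JK-Net-style aggregation: attend over a node's representations from different GNN layers.

    Simplified sketch: a bidirectional LSTM reads the per-layer representations of each node,
    produces one score per layer, and the softmax-weighted sum becomes the final embedding.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.lstm = nn.LSTM(dim, dim, bidirectional=True, batch_first=True)
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, layer_reps: torch.Tensor) -> torch.Tensor:
        # layer_reps: (num_nodes, num_layers, dim) stacked outputs of the GNN layers
        states, _ = self.lstm(layer_reps)                  # (num_nodes, num_layers, 2*dim)
        alpha = torch.softmax(self.score(states), dim=1)   # (num_nodes, num_layers, 1) layer weights
        return (alpha * layer_reps).sum(dim=1)             # (num_nodes, dim) attended embedding

# toy usage: 10 nodes, 3 GNN layers, 16-dimensional representations
reps = torch.randn(10, 3, 16)
print(LSTMLayerAttention(16)(reps).shape)                  # torch.Size([10, 16])
```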

GATs

  • [ICLR2018] [GAT] Graph Attention Networks [paper] [code] (the core attention mechanism is sketched after this list)
  • [NeurIPS2019] [C-GAT] Improving Graph Attention Networks with Large Margin-based Constraints [paper]
  • [ICLR2022] [GATv2] How Attentive are Graph Attention Networks? [paper] [code]
  • [NeurIPS2021] [DMP] Diverse Message Passing for Attribute with Heterophily [paper]
  • [KDD2021] [Simple-HGN] Are we really making much progress? Revisiting, benchmarking and refining heterogeneous graph neural networks [paper] [code]
  • [ICLR2022] [PPRGAT] Personalized PageRank Meets Graph Attention Networks [paper]
  • [ICLR2021] [SuperGAT] How to Find Your Friendly Neighborhood: Graph Attention Design with Self-Supervision [paper] [code]
  • [KDD2019] [GANet] Graph Representation Learning via Hard and Channel-Wise Attention Networks [paper]
  • [NeurIPS2021] [CAT] Learning Conjoint Attentions for Graph Neural Nets [paper] [code]
  • [T-NNLS2021] [MSNA] Neighborhood Attention Networks With Adversarial Learning for Link Prediction [paper]
  • [arXiv2018] [GGAT] Deeply learning molecular structure-property relationships using attention- and gate-augmented graph convolutional network [paper]
  • [NeurIPS2019] [HGCN] Hyperbolic Graph Convolutional Neural Networks [paper]
  • [TBD2021] [HAT] Hyperbolic graph attention network [paper]
  • [IJCAI2020] [Hype-HAN] Hype-HAN: Hyperbolic Hierarchical Attention Network for Semantic Embedding [paper]
  • [ICLR2019] [] Hyperbolic Attention Networks [paper]
  • [ICLR2018] [AGNN] Attention-based Graph Neural Network for Semi-supervised Learning [paper]
  • [KDD2018] [HTNE] Embedding Temporal Network via Neighborhood Formation [paper] [code]
  • [CVPR2020] [DKGAT] Distilling Knowledge from Graph Convolutional Networks [paper] [code]
  • [ICCV2021] [DAGL] Dynamic Attentive Graph Learning for Image Restoration [paper] [code]
  • [AAAI2021] [SCGA] Structured Co-reference Graph Attention for Video-grounded Dialogue [paper]
  • [AAAI2021] [Co-GAT] Co-GAT: A Co-Interactive Graph Attention Network for Joint Dialog Act Recognition and Sentiment Classification [paper]
  • [ACL2020] [ED-GAT] Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification [paper]
  • [EMNLP2019] [TD-GAT] Syntax-Aware Aspect Level Sentiment Classification with Graph Attention Networks [paper]
  • [WWW2020] [GATON] Graph Attention Topic Modeling Network [paper]
  • [KDD2017] [GRAM] GRAM: Graph-based attention model for healthcare representation learning [paper] [code]
  • [IJCAI2019] [SPAGAN] SPAGAN: Shortest Path Graph Attention Network [paper]
  • [PKDD2021] [PaGNN] Inductive Link Prediction with Interactive Structure Learning on Attributed Graph [paper]
  • [arXiv2019] [DeepLinker] Link Prediction via Graph Attention Network [paper]
  • [IJCNN2020] [CGAT] Heterogeneous Information Network Embedding with Convolutional Graph Attention Networks [paper]
  • [ICLR2020] [ADSF] Adaptive Structural Fingerprints for Graph Attention Networks [paper]
  • [KDD2021] [T-GAP] Learning to Walk across Time for Interpretable Temporal Knowledge Graph Completion [paper] [code]
  • [NeurIPS2018] [MAF] Modeling Attention Flow on Graphs [paper]
  • [IJCAI2021] [MAGNA] Multi-hop Attention Graph Neural Network [paper] [code]
  • [AAAI2020] [SNEA] Learning Signed Network Embedding via Graph Attention [paper]
  • [ICANN2019] [SiGAT] Signed Graph Attention Networks [paper]
  • [ICLR2019] [RGAT] Relational Graph Attention Networks [paper] [code]
  • [arXiv2018] [EAGCN] Edge attention-based multi-relational graph convolutional networks [paper] [code]
  • [KDD2021] [WRGNN] Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns [paper] [code]
  • [AAAI2020] [HetSANN] An Attention-based Graph Neural Network for Heterogeneous Structural Learning [paper] [code]
  • [AAAI2020] [TALP] Type-Aware Anchor Link Prediction across Heterogeneous Networks Based on Graph Attention Network [paper]
  • [KDD2019] [KGAT] KGAT: Knowledge Graph Attention Network for Recommendation [paper] [code]
  • [KDD2019] [GATNE] Representation Learning for Attributed Multiplex Heterogeneous Network [paper] [code]
  • [AAAI2021] [RelGNN] Relation-aware Graph Attention Model With Adaptive Self-adversarial Training [paper]
  • [KDD2020] [CGAT] Graph Attention Networks over Edge Content-Based Channels [paper]
  • [ACL2019] [AFE] Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs [paper] [code]
  • [CIKM2021] [DisenKGAT] DisenKGAT: Knowledge Graph Embedding with Disentangled Graph Attention Network [paper]
  • [AAAI2021] [GTAN] Graph-Based Tri-Attention Network for Answer Ranking in CQA [paper]
  • [ACL2020] [R-GAT] Relational Graph Attention Network for Aspect-based Sentiment Analysis [paper] [code]
  • [ICCV2019] [ReGAT] Relation-Aware Graph Attention Network for Visual Question Answering [paper] [code]
  • [AAAI2021] [AD-GAT] Modeling the Momentum Spillover Effect for Stock Prediction via Attribute-Driven Graph Attention Networks [paper]
  • [NeurIPS2018] [GAW] Watch your step: learning node embeddings via graph attention [paper]
  • [TPAMI2021] [NLGAT] Non-Local Graph Neural Networks [paper]
  • [NeurIPS2019] [ChebyGIN] Understanding Attention and Generalization in Graph Neural Networks [paper] [code]
  • [ICML2019] [SAGPool] Self-Attention Graph Pooling [paper] [code]
  • [ICCV2019] [Attpool] Attpool: Towards hierarchical feature representation in graph convolutional networks via attention mechanism [paper]
  • [WWW2019] [HAN] Heterogeneous Graph Attention Network [paper] [code]
  • [NC2022] [PSHGAN] Heterogeneous graph embedding by aggregating meta-path and meta-structure through attention mechanism [paper]
  • [IJCAI2017] [PRML] Link prediction via ranking metric dual-level attention network learning [paper]
  • [WSDM2022] [GraphHAM] Graph Embedding with Hierarchical Attentive Membership [paper]
  • [AAAI2020] [RGHAT] Relational Graph Neural Network with Hierarchical Attention for Knowledge Graph Completion [paper]
  • [AAAI2019] [LAN] Logic Attention Based Neighborhood Aggregation for Inductive Knowledge Graph Embedding [paper] [code]
  • [ICLR2022] [EFEGAT] Learning to Solve an Order Fulfillment Problem in Milliseconds with Edge-Feature-Embedded Graph Attention [paper]
  • [WWW2019] [DANSER] Dual Graph Attention Networks for Deep Latent Representation of Multifaceted Social Effects in Recommender Systems [paper] [code]
  • [WWW2019] [UVCAN] User-Video Co-Attention Network for Personalized Micro-video Recommendation [paper]
  • [KBS2020] [HAGERec] HAGERec: Hierarchical Attention Graph Convolutional Network Incorporating Knowledge Graph for Explainable Recommendation [paper]
  • [Bio2021] [GCATSL] Graph contextualized attention network for predicting synthetic lethality in human cancers [paper] [code]
  • [ICMR2020] [DAGC] DAGC: Employing Dual Attention and Graph Convolution for Point Cloud based Place Recognition [paper] [code]
  • [AAAI2020] [AGCN] Graph Attention Based Proposal 3D ConvNets for Action Detection [paper]
  • [EMNLP2019] [HGAT] HGAT: Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification [paper] [code]
  • [ICLR2020] [Hyper-SAGNN] Hyper-SAGNN: A Self-Attention based Graph Neural Network for Hypergraphs [paper] [code]
  • [CIKM2021] [HHGR] Double-Scale Self-Supervised Hypergraph Learning for Group Recommendation [paper] [code]
  • [ICDM2021] [HyperTeNet] HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation [paper] [code]
  • [PR2020] [Hyper-GAT] Hypergraph Convolution and Hypergraph Attention [paper] [code]
  • [CVPR2020] [] Hypergraph attention networks for multimodal learning [paper]
  • [KDD2020] [DAGNN] Towards Deeper Graph Neural Networks [paper] [code]
  • [T-NNLS2020] [AP-GCN] Adaptive propagation graph convolutional network [paper] [code]
  • [CIKM2021] [TDGNN] Tree Decomposed Graph Neural Network [paper] [code]
  • [ICLR2022] [GAMLP] Graph Attention Multi-layer Perceptron [paper] [code]
  • [AAAI2021] [FAGCN] Beyond Low-frequency Information in Graph Convolutional Networks [paper] [code]
  • [arXiv2021] [ACM] Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node Classification? [paper]
  • [KDD2020] [AM-GCN] AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [paper] [code]
  • [AAAI2021] [UAG] Uncertainty-aware Attention Graph Neural Network for Defending Adversarial Attacks [paper]
  • [CIKM2021] [MV-GNN] Semi-Supervised and Self-Supervised Classification with Multi-View Graph Neural Networks [paper]
  • [KSEM2021] [GENet] Graph Ensemble Networks for Semi-supervised Embedding Learning [paper]
  • [NN2020] [MGAT] MGAT: Multi-view Graph Attention Networks [paper]
  • [CIKM2017] [MVE] An Attention-based Collaboration Framework for Multi-View Network Representation Learning [paper]
  • [NC2021] [EAGCN] Multi-view spectral graph convolution with consistent edge attention for molecular modeling [paper]
  • [SIGIR2020] [GCE-GNN] Global Context Enhanced Graph Neural Networks for Session-based Recommendation [paper]
  • [ICLR2019] [DySAT] Dynamic Graph Representation Learning via Self-Attention Networks [paper] [code]
  • [PAKDD2020] [TemporalGAT] TemporalGAT: Attention-Based Dynamic Graph Representation Learning [paper]
  • [IJCAI2021] [GAEN] GAEN: Graph Attention Evolving Networks [paper] [code]
  • [CIKM2019] [MMDNE] Temporal Network Embedding with Micro- and Macro-dynamics [paper] [code]
  • [ICLR2020] [TGAT] Inductive Representation Learning on Temporal Graphs [paper] [code]
  • [ICLR2022] [TR-GAT] Time-Aware Relational Graph Attention Network for Temporal Knowledge Graph Embeddings [paper]
  • [CVPR2022] [T-GNN] Adaptive Trajectory Prediction via Transferable GNN [paper]
  • [AAAI2018] [ST-GCN] Spatial Temporal Graph Convolutional Networks for Skeleton-Based Action Recognition [paper] [code]
  • [AAAI2020] [GMAN] GMAN: A Graph Multi-Attention Network for Traffic Prediction [paper]
  • [AAAI2019] [ASTGCN] Attention Based Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting [paper] [code]
  • [KDD2020] [ConSTGAT] ConSTGAT: Contextual Spatial-Temporal Graph Attention Network for Travel Time Estimation at Baidu Maps [paper]
  • [ICLR2022] [RainDrop] Graph-Guided Network for Irregularly Sampled Multivariate Time Series [paper]
  • [ICLR2021] [mTAND] Multi-Time Attention Networks for Irregularly Sampled Time Series [paper] [code]
  • [ICDM2020] [MTAD-GAT] Multivariate Time-series Anomaly Detection via Graph Attention Network [paper]
  • [WWW2020] [GACNN] Towards Fine-grained Flow Forecasting: A Graph Attention Approach for Bike Sharing Systems [paper]
  • [IJCAI2018] [GeoMAN] GeoMAN: Multi-level Attention Networks for Geo-sensory Time Series Prediction [paper]
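As a reference point for the long GAT family above, the scoring step introduced by the original GAT (ICLR 2018) can be written in a few lines: attention logits e_ij = LeakyReLU(a^T [W h_i || W h_j]) are computed for every edge and normalized with a softmax over each node's neighborhood. The snippet below is a minimal single-head sketch in plain PyTorch, assuming a dense adjacency matrix with self-loops; it is not the authors' released code, and class and variable names are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SingleHeadGAT(nn.Module):
    """Single-head GAT layer: e_ij = LeakyReLU(a^T [W h_i || W h_j]), softmax over neighbors.

    Minimal sketch of the ICLR 2018 mechanism; dense adjacency, no multi-head concat, no dropout.
    """
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a_src = nn.Linear(out_dim, 1, bias=False)    # a split as [a_src ; a_dst]
        self.a_dst = nn.Linear(out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features, adj: (N, N) adjacency with self-loops
        h = self.W(x)                                                                # (N, out_dim)
        logits = F.leaky_relu(self.a_src(h) + self.a_dst(h).T, negative_slope=0.2)   # (N, N)
        logits = logits.masked_fill(adj == 0, float("-inf"))   # restrict attention to edges
        alpha = torch.softmax(logits, dim=-1)                  # normalize over each neighborhood
        return alpha @ h                                       # weighted aggregation of neighbors

# toy usage on a 4-node path graph with self-loops
adj = torch.tensor([[1., 1., 0., 0.],
                    [1., 1., 1., 0.],
                    [0., 1., 1., 1.],
                    [0., 0., 1., 1.]])
x = torch.randn(4, 8)
print(SingleHeadGAT(8, 16)(x, adj).shape)                      # torch.Size([4, 16])
```

Multi-head attention concatenates or averages several such layers; roughly speaking, many of the variants listed above (e.g., GATv2, SuperGAT, SPAGAN) change how these logits are computed or which node pairs are allowed to attend to each other.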

GraphTransformers

  • [arXiv2019] [GTR] Graph transformer [paper]
  • [arXiv2020] [U2GNN] Universal Self-Attention Network for Graph Classification [paper] [code]
  • [WWW2022] [UGformer] Universal Graph Transformer Self-Attention Networks [paper] [code]
  • [ICLR2021] [GMT] Accurate learning of graph representations with graph multiset pooling [paper] [code]
  • [NeurIPS2021] [Graphormer] Do Transformers Really Perform Bad for Graph Representation? [paper] [code] (its attention-bias idea is sketched after this list)
  • [NeurIPS2021] [HOT] Transformers Generalize DeepSets and Can be Extended to Graphs and Hypergraphs [paper] [code]
  • [NeurIPS2020] [GROVER] Self-Supervised Graph Transformer on Large-Scale Molecular Data [paper] [code]
  • [ICML(Workshop)2019] [PAGAT] Path-augmented graph transformer network [paper] [code]
  • [AAAI2021] [GTA] GTA: Graph Truncated Attention for Retrosynthesis [paper]
  • [AAAI2021] [GT] A generalization of transformer networks to graphs [paper] [code]
  • [NeurIPS2021] [SAN] Rethinking graph transformers with spectral attention [paper] [code]
  • [arXiv2020] [Graph-Bert] Graph-Bert: Only Attention is Needed for Learning Graph Representations [paper] [code]
  • [ICML2021] [] Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks [paper]
  • [IJCAI2021] [UniMP] Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification [paper] [code]
  • [NeurIPS2019] [GTN] Graph Transformer Networks [paper] [code]
  • [KDD2020] [TagGen] A Data-Driven Graph Generative Model for Temporal Interaction Networks [paper]
  • [NeurIPS2021] [GraphFormers] GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [paper]
  • [WWW2020] [HGT] Heterogeneous Graph Transformer [paper]
  • [AAAI2020] [GTOS] Graph transformer for graph-to-sequence learning [paper] [code]
  • [NAACL2019] [GraphWriter] Text Generation from Knowledge Graphs with Graph Transformers [paper] [code]
  • [AAAI2021] [KHGT] Knowledge-Enhanced Hierarchical Graph Transformer Network for Multi-Behavior Recommendation [paper] [code]
  • [AAAI2021] [GATE] GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction [paper]
  • [NeurIPS2021] [STAGIN] Learning Dynamic Graph Representation of Brain Connectome with Spatio-Temporal Attention [paper] [code]
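Most of the graph Transformers above keep standard scaled dot-product self-attention over all node pairs and inject graph structure either through positional encodings (e.g., GT, SAN) or through an additive bias on the attention logits (e.g., Graphormer's shortest-path-based spatial encoding). The sketch below illustrates the additive-bias flavor in plain PyTorch, assuming a precomputed num_nodes x num_nodes bias matrix; it is a single-head simplification for illustration, not any paper's released implementation.

```python
import math
import torch
import torch.nn as nn

class BiasedGraphSelfAttention(nn.Module):
    """Single-head scaled dot-product attention over all nodes, with an additive structural bias.

    Sketch of the Graphormer-style idea: attn(i, j) = softmax_j(q_i . k_j / sqrt(d) + bias[i, j]).
    The bias matrix (e.g., derived from shortest-path distances) is assumed to be precomputed.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, bias: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features, bias: (N, N) additive attention bias
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.T / math.sqrt(q.size(-1)) + bias   # (N, N) biased attention logits
        return torch.softmax(scores, dim=-1) @ v          # (N, dim) updated node features

# toy usage: a bias that favors adjacent nodes and penalizes non-adjacent ones
adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
bias = torch.where(adj > 0, torch.tensor(0.0), torch.tensor(-2.0)).fill_diagonal_(0.0)
x = torch.randn(3, 8)
print(BiasedGraphSelfAttention(8)(x, bias).shape)         # torch.Size([3, 8])
```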

Next

Citation
