
Fast Dynamic Graph Attention with Initial Residual and Identity Mapping

License: MIT

This repository contains a PyTorch implementation of "FDGATII: Fast Dynamic Graph Attention with Initial Residual and Identity Mapping" (https://arxiv.org/abs/2110.11464).

FDGATII combines three main enhancements over GAT. First, a default (safety-net) initial representation is provided and used when no similar class or node is present in the neighbourhood, so that the graph attention mechanism does not fail. Second, a more expressive universal dynamic attention is used. Finally, strong regularization, controlled by the hyperparameter β, is applied.

Figure: FDGATII concepts.
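
In propagation terms, this keeps a GCNII-style update H^(l+1) = σ(((1−α) Â H^(l) + α H^(0)) ((1−β_l) I + β_l W^(l))), with the fixed propagation matrix replaced by dynamic (GATv2-style) attention. The layer below is a hedged, dense, single-head illustration of that combination only; the class and names (FDGATIILayerSketch, W_att, adj_mask) are ours and differ from the repo's sparse implementation.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class FDGATIILayerSketch(nn.Module):
    """Hedged sketch only: dense, single-head illustration of dynamic attention
    plus initial residual and identity mapping; not the repo's sparse code."""

    def __init__(self, dim, alpha=0.2, lamda=1.0, layer_idx=1):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)          # identity-mapped weight W^(l)
        self.W_att = nn.Linear(2 * dim, dim, bias=False)  # pairwise transform for attention
        self.a = nn.Linear(dim, 1, bias=False)            # attention scoring vector
        self.alpha = alpha                                 # initial-residual weight
        self.beta = math.log(lamda / layer_idx + 1)        # identity-mapping weight beta_l

    def forward(self, h, h0, adj_mask):
        # h: (n, dim) current features, h0: (n, dim) initial representation H^(0),
        # adj_mask: (n, n) boolean adjacency, assumed to include self-loops.
        n = h.size(0)
        # Dynamic (GATv2-style) attention: the nonlinearity sits between the
        # pairwise transform and the scoring vector, so scores depend jointly
        # on both endpoints instead of giving a fixed ranking over neighbours.
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = self.a(F.leaky_relu(self.W_att(pairs))).squeeze(-1)
        att = torch.softmax(e.masked_fill(~adj_mask, float("-inf")), dim=-1)
        # Initial residual: mix attention-propagated features with the H^(0) safety net.
        support = (1 - self.alpha) * (att @ h) + self.alpha * h0
        # Identity mapping: support @ ((1 - beta) * I + beta * W).
        return F.relu((1 - self.beta) * support + self.beta * self.W(support))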

The repository, including the data and data splits used for the 10 iterations, was initially forked from GCNII. We use the sparse (static) GATv1 attention code from pyGAT, modified to dynamic attention as in GATv2.

Dependencies

  • CUDA 11.3.0
  • python 3.6.9
  • pytorch 1.3.1

Note: FDGATII can run without a GPU if the GPU timing code is commented out, in which case CUDA is not required.
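
A generic device-fallback pattern (a sketch, not the repo's exact code) looks like this:

import torch

# Generic sketch: use the GPU when available, otherwise fall back to the CPU.
# CUDA-based timing (the GPU timing code mentioned above) only applies on the
# "cuda" branch and is simply skipped in a CPU-only run.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(16, 7).to(device)   # placeholder model for illustration
x = torch.randn(8, 16, device=device)
print(device, model(x).shape)               # works identically on CPU and GPU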

Datasets

The data folder contains three benchmark datasets (Cora, Citeseer, Pubmed), and the new_data folder contains four datasets (Chameleon, Cornell, Texas, Wisconsin) from Geom-GCN. We use the same full-supervised setting and data splits as Geom-GCN and GCNII.
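
A minimal sketch of reading one such split is shown below; the .npz file naming and mask keys follow the usual Geom-GCN convention and are assumptions about this repo's layout, not its documented interface.

import numpy as np

# Hedged sketch: Geom-GCN-style splits are commonly shipped as .npz files with
# boolean train/val/test node masks, one file per split index 0-9. The path
# pattern and key names here are assumptions for illustration only.
def load_split(dataset, split_id, splits_dir="splits"):
    path = f"{splits_dir}/{dataset}_split_0.6_0.2_{split_id}.npz"  # assumed naming
    with np.load(path) as masks:
        return masks["train_mask"], masks["val_mask"], masks["test_mask"]

train_mask, val_mask, test_mask = load_split("cora", 0)
print(train_mask.sum(), val_mask.sum(), test_mask.sum())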

Results

Test accuracy is summarized below. We used the 10 standard data splits (Split 0 to 9) and report the average accuracy and standard deviation.

| Dataset | Depth | Dimensions | Accuracy | Std.D | Split 0 | Split 1 | Split 2 | Split 3 | Split 4 | Split 5 | Split 6 | Split 7 | Split 8 | Split 9 |
|---------|-------|------------|----------|-------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|
| Cora | 2 | 64 | 87.7867 | 1.149 | 87.1227 | 89.1348 | 88.7324 | 87.7264 | 87.7264 | 86.3179 | 85.5131 | 89.1348 | 87.7264 | 88.7324 |
| Cite | 1 | 128 | 75.6434 | 1.8721 | 73.7237 | 75.0751 | 75.5255 | 74.9249 | 79.2453 | 73.5849 | 74.1742 | 78.979 | 75.6757 | 75.5255 |
| Pubm | 2 | 64 | 90.3524 | 0.297 | 90.5426 | 90.644 | 90.213 | 89.858 | 90.5426 | 90.8215 | 90.4412 | 90.2383 | 89.8834 | 90.3398 |
| Cham | 1 | 64 | 65.1754 | 1.8105 | 67.1053 | 66.2281 | 61.4035 | 63.5965 | 65.5702 | 64.4737 | 64.2544 | 64.693 | 67.9825 | 66.4474 |
| Corn | 1 | 128 | 82.4324 | 6.3095 | 67.5676 | 83.7838 | 91.8919 | 86.4865 | 86.4865 | 83.7838 | 83.7838 | 83.7838 | 75.6757 | 81.0811 |
| Texa | 2 | 64 | 82.1622 | 2.7562 | 81.0811 | 83.7838 | 81.0811 | 86.4865 | 81.0811 | 78.3784 | 83.7838 | 81.0811 | 78.3784 | 86.4865 |
| Wisc | 1 | 128 | 86.2745 | 4.4713 | 86.2745 | 80.3922 | 88.2353 | 92.1569 | 90.1961 | 82.3529 | 82.3529 | 84.3137 | 82.3529 | 94.1176 |
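
The aggregate columns can be reproduced from the per-split values; the reported Std.D corresponds to the population standard deviation. For example, for the Cora row:

import statistics

# Cora per-split accuracies from the table above.
cora = [87.1227, 89.1348, 88.7324, 87.7264, 87.7264,
        86.3179, 85.5131, 89.1348, 87.7264, 88.7324]
print(round(statistics.mean(cora), 4))    # 87.7867 -> Accuracy column
print(round(statistics.pstdev(cora), 4))  # 1.149   -> Std.D column (population std)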

Usage

  • All parameters are defined in fullSupervised_01.py.

  • To run FDGATII on Cora for 1 iteration only, use:

python -u fullSupervised_01.py --data cora --layer 2 --alpha 0.2 --weight_decay 1e-4 --epochs 1500 --iterations 1 --mode FDGATII --support 1 --verbosity 1 --model GCNII_BASE
  • To replicate the FDGATII full-supervised results, run the following script
#!/bin/bash
SCRIPT='python3.6 fullSupervised_01.py'
SETTTINGS=" --epochs 1500 --iterations 10 --mode FDGATII --verbosity 0 --model GCNII_BASE "

$SCRIPT --data cora --support 1 --layer 2 --hidden 64 --alpha 0.2 --weight_decay 1e-4 $SETTTINGS
$SCRIPT --data citeseer --support 2 --layer 1 --hidden 128  --weight_decay 5e-6 $SETTTINGS
$SCRIPT --data pubmed --support 1 --layer 2 --hidden 64  --alpha 0.1 --weight_decay 5e-6 $SETTTINGS
$SCRIPT --data chameleon --support 0 --layer 1 --hidden 64 --lamda 1.5 --alpha 0.2 --weight_decay 5e-4 $SETTTINGS
$SCRIPT --data cornell --support 2 --layer 1 --hidden  128 --lamda 1 --weight_decay 1e-3 $SETTTINGS
$SCRIPT --data texas --support 2 --layer 2 --hidden 64 --lamda 1.5 --weight_decay 1e-4 $SETTTINGS
$SCRIPT --data wisconsin --support 2 --layer 1 --hidden 128 --lamda 1 --weight_decay 5e-4 $SETTTINGS
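
For reference, the flags appearing in the commands above (all defined in fullSupervised_01.py) could be parsed with argparse roughly as follows. This is a hedged sketch based only on the flags visible here; the defaults and comments are illustrative assumptions, not the script's actual values.

import argparse

# Hedged sketch: covers only the flags visible in the commands above; defaults
# and comments are illustrative, not fullSupervised_01.py's actual values.
parser = argparse.ArgumentParser(description="FDGATII full-supervised training (sketch)")
parser.add_argument("--data", type=str, default="cora")          # dataset name
parser.add_argument("--layer", type=int, default=2)              # model depth
parser.add_argument("--hidden", type=int, default=64)            # hidden dimensions
parser.add_argument("--alpha", type=float, default=0.2)          # initial-residual weight (assumed meaning)
parser.add_argument("--lamda", type=float, default=1.0)          # identity-mapping strength (assumed meaning)
parser.add_argument("--weight_decay", type=float, default=1e-4)
parser.add_argument("--epochs", type=int, default=1500)
parser.add_argument("--iterations", type=int, default=10)        # repeated runs over the data splits
parser.add_argument("--support", type=int, default=1)
parser.add_argument("--mode", type=str, default="FDGATII")
parser.add_argument("--model", type=str, default="GCNII_BASE")
parser.add_argument("--verbosity", type=int, default=0)
args = parser.parse_args()
print(args)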

Data sources and code

The datasets and code are forked from GCNII, which uses the dataset and parts of the code from Geom-GCN. We use the sparse attention implementation from pyGAT, as described in GAT. We acknowledge and thank the authors of these works for sharing their code.

Citation

@article{kulatilleke2021fdgatii,
  title={FDGATII: Fast Dynamic Graph Attention with Initial Residual and Identity Mapping},
  author={Kulatilleke, Gayan K and Portmann, Marius and Ko, Ryan and Chandra, Shekhar S},
  journal={arXiv preprint arXiv:2110.11464},
  year={2021}
}
