Graph Transformer Networks (Authors' PyTorch implementation for the NeurIPS 19 paper)


Graph Transformer Networks

This repository is the implementation of Graph Transformer Networks (GTN).

Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim, Graph Transformer Networks, In Advances in Neural Information Processing Systems (NeurIPS 2019).

Dependencies

Data Preprocessing

We used datasets from Heterogeneous Graph Attention Network (Xiao Wang et al.) and uploaded the preprocessing code for the ACM dataset as an example.
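The preprocessing step boils down to turning a heterogeneous edge list into one adjacency matrix per edge type, since GTN consumes a stack of type-specific adjacencies rather than a single merged graph. A minimal sketch of that idea (the edge-type names and input format here are illustrative, not the repository's actual file layout):

```python
import numpy as np

# Hypothetical raw input: (src, dst, edge_type) triples from a
# heterogeneous graph such as ACM. Names are illustrative only.
edges = [(0, 1, "paper-author"),
         (1, 2, "author-paper"),
         (0, 3, "paper-subject")]
num_nodes = 4

# Build one dense adjacency matrix per edge type.
adj = {}
for src, dst, etype in edges:
    A = adj.setdefault(etype, np.zeros((num_nodes, num_nodes)))
    A[src, dst] = 1.0
```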

Running the code

$ mkdir data
$ cd data

Download the datasets (DBLP, ACM, IMDB) from this link and extract data.zip into the data folder.

$ cd ..
  • DBLP
$ python main.py --dataset DBLP --num_layers 3
  • ACM
$ python main.py --dataset ACM --num_layers 2 --adaptive_lr true
  • IMDB
$ python main_sparse.py --dataset IMDB --num_layers 3 --adaptive_lr true
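At the heart of these scripts is the Graph Transformer layer from the paper: each layer softly selects adjacency matrices across edge types (a softmax-weighted combination, the paper's 1x1 convolution) and composes the selections by matrix multiplication to produce a learned meta-path adjacency. A minimal NumPy sketch of that operation; the function names, toy graph, and uniform weights are illustrative, not the repository's API:

```python
import numpy as np

def softmax(w):
    # Numerically stable softmax over edge-type weights.
    e = np.exp(w - np.max(w))
    return e / e.sum()

def gt_layer(adjs, w1, w2):
    """One Graph Transformer layer (illustrative sketch).

    adjs: list of K adjacency matrices (N x N), one per edge type.
    w1, w2: length-K weight vectors; softmax turns each into a soft
            selection over edge types.
    Returns an N x N adjacency for a learned length-2 meta-path.
    """
    a1 = sum(a * s for a, s in zip(adjs, softmax(w1)))  # soft-selected A_1
    a2 = sum(a * s for a, s in zip(adjs, softmax(w2)))  # soft-selected A_2
    return a1 @ a2  # compose into a meta-path adjacency

# Toy example: two edge types on a 3-node graph, uniform selection weights.
A_pa = np.array([[0, 1, 0], [0, 0, 0], [0, 0, 0]], float)  # e.g. paper->author
A_ac = np.array([[0, 0, 0], [0, 0, 1], [0, 0, 0]], float)  # e.g. author->conf
meta = gt_layer([A_pa, A_ac], np.zeros(2), np.zeros(2))
```

In the actual model the weight vectors are trainable, so the network learns which meta-paths matter for the downstream node-classification task instead of requiring them to be hand-designed.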

Citation

If this work is useful for your research, please cite our paper:

@inproceedings{yun2019graph,
  title={Graph Transformer Networks},
  author={Yun, Seongjun and Jeong, Minbyul and Kim, Raehyun and Kang, Jaewoo and Kim, Hyunwoo J},
  booktitle={Advances in Neural Information Processing Systems},
  pages={11960--11970},
  year={2019}
}
