This repository contains the official implementation of the paper Shadow Cones: Unveiling Partial Orders in Hyperbolic Space. Part of our data-loading code is adapted from hyperbolic_cones.
Our implementation works with Python>=3.9 and PyTorch>=1.12.1. We use HTorch for optimizing different models in hyperbolic space; please refer to the HTorch repo for installation instructions.
To install the other dependencies, run: pip install -r requirement.txt
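After installing the dependencies, a quick sanity check of the environment can be helpful. The sketch below verifies the version requirements stated above (Python>=3.9, PyTorch>=1.12.1); the `parse_version` helper is our own illustration, not part of the repo:

```python
import sys

def parse_version(v: str) -> tuple:
    # "1.12.1+cu117" -> (1, 12, 1); strips local suffixes like "+cu117"
    return tuple(int(p) for p in v.split("+")[0].split(".")[:3])

# Shadow Cones requires Python >= 3.9
assert sys.version_info >= (3, 9), "Python >= 3.9 required"

try:
    import torch
    # PyTorch >= 1.12.1 is required by our implementation
    assert parse_version(torch.__version__) >= (1, 12, 1), "PyTorch >= 1.12.1 required"
    print("PyTorch", torch.__version__, "OK")
except ImportError:
    print("PyTorch not installed; install it via requirement.txt")
```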
We provide the WordNet datasets (mammal and noun) under data_utils/data/maxn/, which are the same as those used in entailment cones.
Due to space limits, the ConceptNet and Hearst datasets are stored on Google Drive. Please download them with gdown and move them to data_utils/data/MCG and data_utils/data/hearst:
pip install gdown
gdown --no-check-certificate --folder https://drive.google.com/drive/folders/1WH2LIk2EsTe_lQ03AjCaxZ3o8fSkNt1f?usp=sharing
We use train.py to train on small datasets (e.g., mammal) with a single process, and train_hogwild_lazy.py to train on large datasets (e.g., noun, MCG, and Hearst) with multi-processing. We provide commands and hyper-parameter guidelines in run.sh for training different shadow cones on the specified datasets:
bash run.sh
- Tao Yu*, [email protected]
- Toni J.B. Liu*, [email protected]
- Albert Tseng, [email protected]
- Christopher De Sa, [email protected]
If you find our work helpful in your research, please consider citing us:
@article{yu2023shadow,
title={Shadow Cones: Unveiling Partial Orders in Hyperbolic Space},
author={Yu, Tao and Liu, Toni JB and Tseng, Albert and De Sa, Christopher},
journal={arXiv preprint arXiv:2305.15215},
year={2023}
}