Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays
Code release for the GRAIL @ MICCAI 2023 paper "Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays".
Prior-RadGraphFormer is a transformer-based network that directly generates radiology graphs from X-ray images. The generated graphs can be used for multiple downstream tasks such as free-text report generation and pathology classification.
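For illustration, a radiology graph in the RadGraph-style format pairs entities (anatomy and observation tokens, each with a certainty label) with typed relations between them. A minimal sketch of such a structure in Python; the specific tokens and labels below are illustrative examples, not actual model output:

# Illustrative sketch of a RadGraph-style radiology graph (not actual model output).
# Entities carry tokens and a label; relations are typed edges between entity indices.
radiology_graph = {
    "entities": [
        {"tokens": "opacity", "label": "OBS-DP"},            # observation, definitely present
        {"tokens": "left lower lobe", "label": "ANAT-DP"},   # anatomy, definitely present
    ],
    "relations": [
        {"subject": 0, "object": 1, "label": "located_at"},  # opacity located_at left lower lobe
    ],
}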
We recommend using Python 3.8 and the following scripts to install the required Python packages and compile the CUDA operators:
python -m venv /path/to/new/virtual/environment
source /path/to/new/virtual/environment/bin/activate
pip install -r requirements.txt
cd ./models/ops
python setup.py install
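After installation, a quick sanity check can confirm that PyTorch sees the GPU and that the compiled operators are importable. This is a minimal sketch; the extension name MultiScaleDeformableAttention follows the Deformable DETR convention inherited by Relationformer and is an assumption about this repo's CUDA ops:

# Minimal environment check (sketch).
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

try:
    # Extension name assumed; built by models/ops/setup.py.
    import MultiScaleDeformableAttention
    print("CUDA operators compiled and importable.")
except ImportError as err:
    print("CUDA operators not found:", err)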
The config file can be found at ./configs/radgraph.yaml. Make custom changes if necessary.
python train.py
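Before launching a run, the effective settings can be inspected programmatically. A minimal sketch using PyYAML, assuming the config is plain YAML at the path above:

# Sketch: load and print the training config to verify any custom changes.
import yaml

with open("./configs/radgraph.yaml") as f:
    cfg = yaml.safe_load(f)

# Print top-level sections and their values.
for key, value in cfg.items():
    print(key, ":", value)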
If you find this code helpful, please consider citing:
@misc{xiong2023priorradgraphformer,
title={Prior-RadGraphFormer: A Prior-Knowledge-Enhanced Transformer for Generating Radiology Graphs from X-Rays},
author={Yiheng Xiong and Jingsong Liu and Kamilia Zaripova and Sahand Sharifzadeh and Matthias Keicher and Nassir Navab},
year={2023},
eprint={2303.13818},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
This code borrows heavily from Relationformer and Classification by Attention. We thank the authors for their great work.
The authors gratefully acknowledge the financial support by the Federal Ministry of Education and Research of Germany (BMBF) under project DIVA (FKZ 13GW0469C). Kamilia Zaripova was partially supported by the Linde & Munich Data Science Institute, Technical University of Munich Ph.D. Fellowship.