# BEiT (ICLR'2022)
```bibtex
@article{bao2021beit,
  title={BEiT: BERT Pre-Training of Image Transformers},
  author={Bao, Hangbo and Dong, Li and Wei, Furu},
  journal={arXiv preprint arXiv:2106.08254},
  year={2021}
}
```
| Segmentor | Pretrain | Backbone | Crop Size | Schedule | Train/Eval Set | mIoU | Download |
| :-: | :-: | :-: | :-: | :-: | :-: | :-: | :-: |
| UperNet | ImageNet-22k-224x224 | BEiT-B | 640x640 | LR/POLICY/BS/EPOCH: 3e-5/poly/16/130 | train/val | 53.12% | cfg \| model \| log |
| UperNet | ImageNet-22k-224x224 | BEiT-L | 640x640 | LR/POLICY/BS/EPOCH: 3e-5/poly/16/130 | train/val | 56.52% | cfg \| model \| log |
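The `poly` entry in the Schedule column refers to the polynomial learning-rate decay policy commonly used for semantic segmentation. A minimal sketch of how such a schedule is typically computed (the `power=0.9` default and the `min_lr` floor are common conventions, not values stated in the table):

```python
def poly_lr(base_lr, cur_iter, max_iter, power=0.9, min_lr=0.0):
    """Polynomial decay: lr = base_lr * (1 - cur_iter / max_iter) ** power.

    The learning rate starts at ``base_lr`` and decays smoothly toward
    ``min_lr`` as training approaches ``max_iter``.
    """
    coeff = (1.0 - cur_iter / max_iter) ** power
    return max(base_lr * coeff, min_lr)

# With the table's base LR of 3e-5: full LR at iteration 0, decayed near the end.
lr_start = poly_lr(3e-5, cur_iter=0, max_iter=1000)      # 3e-05
lr_late = poly_lr(3e-5, cur_iter=900, max_iter=1000)     # much smaller
```

With `power=1.0` this reduces to linear decay; `power=0.9` gives a slightly slower initial drop, which is the usual choice in segmentation configs.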
You can also download the model weights from the following sources:
- BaiduNetdisk: https://pan.baidu.com/s/1gD-NJJWOtaHCtB0qHE79rA with access code s757