Hidden-danger detection for manhole covers based on KDWC-YOLOv5 (Knowledge Distillation Well Cover-YOLOv5)
To-do list:
- Model Ensembling ultralytics/yolov5#318
- Shape IOU https://github.com/malagoutou/Shape-IoU
- Multi-scale training
- Knowledge distillation https://blog.roboflow.com/what-is-knowledge-distillation/
- Keep training the model, delving into the best epoch and hyperparameter settings
- Publish datasets after the competition ends.
- Publish a web app made by my team.
- Linux
- Python>=3.7
- NVIDIA GPU (memory>=30G) + CUDA cuDNN
pip install -r requirements.txt
Our model's best checkpoint and dataset are available at the links below; you can download them freely.
Checkpoint: https://drive.google.com/file/d/1fclRgDYc_duWns63MbTeKRffmSPdP7BA/view?usp=sharing
Dataset: https://drive.google.com/file/d/16f29aRAM8zAsiaks8zuPdAzIkuEz1RKA/view?usp=sharing
If you want to get the mAP value, run the following command:
python val.py
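For example, a minimal invocation, assuming the downloaded checkpoint is saved as best.pt in the repository root and your dataset path is set in data/A30.yaml, would be:
python val.py --weights best.pt --data data/A30.yaml --img 640
The metrics are printed to the console and the results are saved under runs/val/ by default.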
If you want to get the images with bounding boxes, run the following command:
python detect.py
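For example, an illustrative command, where best.pt is the downloaded checkpoint and path/to/images is a placeholder for your own image folder, would be:
python detect.py --weights best.pt --source path/to/images --conf-thres 0.25
The annotated images are saved under runs/detect/ by default.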
If you have a GPU cluster, we also provide script files that can be submitted with sbatch; you can run them with:
sbatch yolov5_val.sh or sbatch yolov5_detect.sh
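If you prefer to adapt the script to your own cluster, the sketch below shows roughly what such a submission script could look like. The resource requests and the environment activation line are assumptions that depend on your cluster; this is not the exact content of the provided yolov5_val.sh.
#!/bin/bash
#SBATCH --job-name=yolov5_val     # job name shown in the queue
#SBATCH --gres=gpu:1              # request one GPU (adjust to your cluster)
#SBATCH --time=02:00:00           # wall-time limit (illustrative value)
source activate yolov5            # activate your Python environment (environment name is an assumption)
python val.py --weights best.pt --data data/A30.yaml --img 640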
Tips: You can also append "--" options to the command as needed. E.g., if you want to output a txt file in the order "image name, confidence, coordinates", you can run the command below:
python detect.py --save-txt --save-conf
If you want to train our model yourself, you should first specify the path of your dataset in "data/A30.yaml", and you also need to specify a pretrained model. We use yolov5m, but you can choose another pretrained model via the official link. Then run the following command:
python train.py
Tip1: You can also add the "--multi-scale" flag to the command if you want multi-scale training:
python train.py --multi-scale
Tip2: It is better to put the pretrained model under the root directory.
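Putting the tips together, a typical training command could look like the following sketch; the image size, epoch count, and batch size are illustrative values rather than our exact settings:
python train.py --data data/A30.yaml --weights yolov5m.pt --img 640 --epochs 300 --batch-size 16 --multi-scale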
Web_APP_demo.mp4
⭐ If you want to get this app's source code or have any other questions, please feel free to contact [email protected].