code for inference on custom images #8
Thanks for your interest. We have uploaded the inference scripts for single-image visualization. For videos, you need to run inference frame by frame.
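For reference, here is a rough sketch of what frame-by-frame video inference could look like. This is not part of the repository: `run_inference` stands in for whatever single-image inference/visualization call the released scripts actually perform, so treat it as a placeholder to adapt.

```python
# Sketch: frame-by-frame video inference. `run_inference` is a hypothetical
# placeholder for the repo's single-image inference/visualization call.
import cv2

def infer_video(video_path, run_inference, output_path="out.mp4"):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    writer = None
    while True:
        ok, frame_bgr = cap.read()
        if not ok:
            break
        # OpenCV reads BGR; most PyTorch pipelines expect RGB.
        frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
        vis_rgb = run_inference(frame_rgb)          # assumed to return a visualized RGB frame
        vis_bgr = cv2.cvtColor(vis_rgb, cv2.COLOR_RGB2BGR)
        if writer is None:
            h, w = vis_bgr.shape[:2]
            writer = cv2.VideoWriter(output_path,
                                     cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
        writer.write(vis_bgr)
    cap.release()
    if writer is not None:
        writer.release()
```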
Could you please explain how this script works? I put an image in Inference_Path and the annotations from the COCO 2017 dataset in EDPOSE_COCO_PATH. I tried to run the "Visualization via COCO Keypoints Format" script, but I got the following error:
and the process hangs indefinitely.
You need to use the command: export Inference_Path=/path/to/your/inference_dir
I have already done that. I also downloaded the COCO annotations and put them in EDPOSE_COCO_PATH, but it still doesn't work. Why does the inference code need the whole COCO dataset just to run on a single image?
Well, is there a simple way to just load the model and infer on a single image without using the dataloader? I tried to do this on a torch tensor, but there were some errors.
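Not an official answer, but a minimal sketch of what bypassing the dataloader might look like, assuming the model has already been built and loaded from a checkpoint the same way the repo's eval script does it. The resize size and ImageNet normalization are assumptions (a common DETR-style default), not the repo's confirmed transforms.

```python
# Sketch only: run a built ED-Pose-style model on one image without the COCO dataloader.
# Assumptions (not confirmed by the repo): DETR-style eval transforms with
# ImageNet normalization, and a model whose forward accepts a [1, 3, H, W] tensor.
import torch
import torchvision.transforms as T
from PIL import Image

_transform = T.Compose([
    T.Resize(800),
    T.ToTensor(),
    T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

@torch.no_grad()
def infer_single_image(model, image_path, device="cuda"):
    """`model` is assumed to be constructed and checkpoint-loaded exactly as the
    repo's own eval script does it; only the input pipeline is replaced here."""
    image = Image.open(image_path).convert("RGB")
    tensor = _transform(image).unsqueeze(0).to(device)  # shape [1, 3, H, W]
    model.to(device).eval()
    return model(tensor)  # raw predictions; post-process as the visualization script does
```

The output format (keypoints, boxes, scores) depends on the model's forward, so the post-processing still has to mirror whatever the visualization script does with the dataloader outputs.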
Is there code to run the model on custom images or video?