
Q value is always zero for any new model I train #43

Open
amritkochar opened this issue Jun 14, 2019 · 5 comments

@amritkochar

Hello team,

When I use the GQ-Image-Wise model, the q value generated for each grasp tracks the grasp quality well. But if I retrain or fine-tune a new model with some modifications, the q value for every grasp is always zero.

Why is this happening? Please help me solve this.

@jeffmahler
Contributor

Hello, please provide more details. Have you looked at the raw q values in the NumPy arrays output by the data generation script? It's possible that they are quite small, in which case you will need to lower the training threshold.
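One way to do that check is to scan the `.npz` tensor files for the metric and report the largest value found. This is only a sketch: the directory layout, the `robust_ferrari_canny` file-name prefix, and the array key inside each `.npz` file are assumptions based on this thread, and may differ across GQ-CNN versions.

```python
import glob
import os

import numpy as np

def max_metric_value(tensor_dir, prefix="robust_ferrari_canny"):
    """Scan the .npz tensor files written by the data generation script
    and return the largest metric value found (0.0 if no files match)."""
    max_q = 0.0
    for path in sorted(glob.glob(os.path.join(tensor_dir, prefix + "_*.npz"))):
        with np.load(path) as data:
            # The key name inside the archive may vary by version;
            # here we simply take the first array stored in the file.
            arr = data[list(data.keys())[0]]
            max_q = max(max_q, float(arr.max()))
    return max_q
```

If the value this returns is well below your `metric_thresh`, essentially no example gets a positive label and a trained model will output q = 0 everywhere.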

@amritkochar
Author

Hello Jeff,

If I understood correctly, I checked the dataset generated by running generate_gqcnn_dataset.py. In the tensors folder, when I inspect the files named robust_ferrari_canny_00XXX.npz, most of the values are very small; across all the NumPy arrays, the max value is around 0.003-0.005.

In the generate_gqcnn_dataset.yaml file, these are the parameter values:

Dataset gen params

images_per_stable_pose: 50
stable_pose_min_p: 0.0

Also, when I use the generated dataset to train, in the training.yaml file, these are the variable values:

target_metric_name: robust_ferrari_canny
metric_thresh: 0.002

Let me know what I am doing wrong.

Thanks!

@amritkochar
Author

Hello team,

Anything on this? I tried changing a few of the values, but it still doesn't solve the problem.

Thanks.

@jeffmahler
Contributor

@amrit-007 Based on the information you provided, I suspect that the grasp metrics are being computed as expected and you are using difficult objects. The max is usually about 0.005 for a dataset.

There are some issues when training with a small number of positive examples, which is why you may be seeing all zeros. Here are a few potential fixes:

  1. Increase the batch size to 256 or greater
  2. Reduce the "metric_thresh"
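A quick way to sanity-check the second fix is to count what fraction of examples a given threshold would label positive. The helper below is hypothetical (not part of GQ-CNN); it just illustrates why a threshold near the top of the metric distribution starves the classifier of positive labels.

```python
import numpy as np

def positive_fraction(metrics, metric_thresh):
    """Fraction of grasps that would be labeled positive (metric > threshold)."""
    metrics = np.asarray(metrics, dtype=float)
    return float((metrics > metric_thresh).mean())

# Illustrative metric values concentrated around 0.0005-0.005, as reported
# above; a metric_thresh of 0.002 leaves only a sliver of positives.
metrics = np.array([0.0005, 0.001, 0.0015, 0.002, 0.003, 0.005])
print(positive_fraction(metrics, 0.002))  # only 2 of 6 examples are positive
```

Lowering `metric_thresh` (or increasing the batch size so each batch is likely to contain at least a few positives) rebalances the labels the network sees during training.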

@AbdulghaniAltaweel commented Sep 13, 2019

I am trying to generate training data for GQ-CNN. I got small values of the robust_ferrari_canny metric, like 2e-6. To check where the mistake is, I ran the Dex-Net code on the example and dexnet_2 databases and recomputed the metric for some objects, and got small values there too. Any suggestions?
