Some clarification required #130
Comments
Hi @inspirit,
Hey @MisterBourbaki, you're right, and as I pointed out above, even with a negative aux_loss the training progresses as expected, with the overall reconstruction error decreasing. But we still need to carefully select the weight for the entropy part of aux_loss, since it affects codebook usage and the overall reconstruction error.
The reconstruction loss over VQ is a positive value, while LFQ returns a negative value.
Have you found a good reconstruction loss weight? I have the same issue. We are training a VQ-VAE-GAN, and as a result of the negative aux_loss, the whole generator loss is negative.
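For what it's worth, a negative total only shifts the loss value, not the gradient directions; what matters for optimisation is the relative scale between the terms. A minimal sketch of the weighting being discussed (the function name and `aux_weight` value are hypothetical, not the library's API):

```python
def generator_loss(recon_loss: float, aux_loss: float, aux_weight: float = 0.1) -> float:
    # Hypothetical combination of the two terms. A negative entropy
    # aux_loss can drag the total below zero, but gradients are
    # unaffected -- only aux_weight changes the optimisation trade-off.
    return recon_loss + aux_weight * aux_loss

# e.g. a negative aux_loss merely offsets the total:
total = generator_loss(1.0, -2.0, aux_weight=0.1)  # 1.0 - 0.2 = 0.8
```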
Hi Phil,
I was experimenting with FSQ/LFQ for a 3D-motion autoencoder and was wondering how to work through LFQ's variety of options.
With FSQ it's quite straightforward, since it does not have any losses and you just need to find suitable levels for the particular task.
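To illustrate why FSQ is loss-free: each latent dimension is just bounded and rounded to a fixed number of levels, so the only design choice is the `levels` list (effective codebook size is the product of the levels). A pure-Python sketch, not the library's actual implementation:

```python
import math

def fsq_quantize(z, levels):
    # FSQ-style quantization sketch: bound each latent dim with tanh,
    # scale to (levels[i] - 1) / 2, round to the nearest integer, and
    # rescale back to [-1, 1]. No learned codebook, no aux losses.
    out = []
    for x, n_levels in zip(z, levels):
        half = (n_levels - 1) / 2
        bounded = math.tanh(x) * half       # in (-half, half)
        out.append(round(bounded) / half)   # snapped, back in [-1, 1]
    return out

# e.g. with levels=[5, 5] there are 5 * 5 = 25 implicit codes:
codes = fsq_quantize([0.0, 10.0], [5, 5])   # -> [0.0, 1.0]
```

(In training, the round would use a straight-through estimator so gradients pass through; that detail is omitted here.)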
However, when I start training with LFQ, I immediately face a negative aux_loss, which is of course due to entropy_aux_loss.
I understand that we can combat it by decreasing diversity, but I don't like that idea; another approach would be lowering entropy_loss_weight, which I believe can lead to poor codebook utilisation...
So what are the options for tuning LFQ parameters? By the way, even with a negative aux_loss it still seems to train fine; at least the reconstruction loss goes down.
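To make the sign issue concrete: the entropy aux loss has the general shape "per-sample entropy minus batch-average entropy", and the subtracted diversity term is what can push it negative when codebook usage is well spread. A pure-Python sketch of that structure (function names and the exact weighting are assumptions, not the library's code):

```python
import math

def entropy(probs):
    # Shannon entropy of a discrete distribution (natural log).
    return -sum(p * math.log(p) for p in probs if p > 0)

def lfq_aux_loss(per_sample_probs, diversity_weight=1.0):
    # Sketch of an LFQ-style entropy aux loss: encourage confident
    # per-sample code assignments (low first term) while spreading
    # usage across the codebook (high second term). Because the
    # diversity term is subtracted, the loss goes negative once batch
    # usage is diverse -- which is why training can still progress fine.
    n = len(per_sample_probs)
    per_sample = sum(entropy(p) for p in per_sample_probs) / n
    avg = [sum(col) / n for col in zip(*per_sample_probs)]
    return per_sample - diversity_weight * entropy(avg)

# Two samples, each fully confident in a different code:
loss = lfq_aux_loss([[1.0, 0.0], [0.0, 1.0]])  # 0 - ln(2) < 0
```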