Hi,
First of all, thank you for the code that extends GibbsLDA++ with test-set perplexity calculation. It will be very useful for my work.
Looking at the perplexity calculation, it seems that you apply the logarithm before summing over all word probabilities in the document. Since the sum of logs is not equivalent to the log of a sum, and the perplexity formula involves the log of a sum, your calculation may be incorrect.
Is my observation relevant?
Cheers,
Marcelo Pita
I had the same intuition as you, but the calculation in the code is indeed correct.
You might have heard of the well-known paper "Parameter estimation for text analysis". Section 7.3 gives a detailed derivation of LDA perplexity, and it turns out that you do take the log of each word's probability individually: the sum over topics happens inside the log, and the logs are then summed over words.
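To make the order of operations concrete, here is a minimal sketch of that perplexity computation (function and variable names are my own, not from GibbsLDA++): for each word, the topic mixture `sum_k theta[d][k] * phi[k][w]` is summed *before* the log, and the resulting log word probabilities are then summed over the document.

```python
import math

def perplexity(docs, theta, phi):
    """Test-set perplexity for LDA, following the derivation in
    section 7.3 of "Parameter estimation for text analysis".

    docs  : list of documents, each a list of word ids
    theta : per-document topic distributions, theta[d][k]
    phi   : per-topic word distributions, phi[k][w]
    """
    log_likelihood = 0.0
    n_words = 0
    for d, doc in enumerate(docs):
        for w in doc:
            # p(w | d) = sum_k theta[d][k] * phi[k][w]
            # -- the sum over topics happens BEFORE the log
            p_w = sum(theta[d][k] * phi[k][w] for k in range(len(phi)))
            # -- the log is taken per word, then summed over words
            log_likelihood += math.log(p_w)
            n_words += 1
    return math.exp(-log_likelihood / n_words)
```

So the "log of a sum" does appear, but only over topics within a single word's probability; summing the per-word logs across the document is exactly what the derivation prescribes.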