Performance problem combination of SSG + SPJ #10
It looks like your SPJ is not being trained with the random negative sampling in this instance. Also note that the table reported in the paper is an average over 5 different random initialisations of the model. That said, I trained with SEED=1 using the scripts from the repo last night and got these numbers when testing on the SSG predictions. You can download my model checkpoint from this Google Drive folder: https://drive.google.com/drive/folders/1Goyx1KST01nritrm4CoAOl1KCJYOc43P?usp=sharing
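The averaging procedure mentioned above can be sketched roughly as follows. The per-seed scores and the dictionary layout here are illustrative placeholders, not numbers from the paper or the repo:

```python
# Hypothetical sketch: the paper's table reports the mean over 5
# random initialisations, so a single SEED=1 run will differ from it.
f1_per_seed = {1: 0.89, 2: 0.88, 3: 0.90, 4: 0.87, 5: 0.89}  # illustrative scores

# Mean over the 5 seeds, as reported in the table.
mean_f1 = sum(f1_per_seed.values()) / len(f1_per_seed)
print(round(mean_f1, 3))  # 0.886 for these illustrative scores
```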
Hey,
My team and I are facing an issue with the combination of SSG and SPJ. We trained the SSG and the SPJ, and the performances are quite good taken separately: 0.55 and 0.87 precision and recall for the SSG, and a 0.89 F1 score for the SPJ. But as soon as we test the NRD end to end, performance drops to 0.131 for SSG + SPJ combined. Based on the table from the Neural Databases article, we expected better results. Do you have any idea why this is happening?
Also, why can't we predict the other types of questions with SSG + SPJ?
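As a back-of-envelope check (my own assumption, not a calculation from the paper): in a cascaded pipeline the SPJ only sees what the SSG retrieves, so stage errors compound multiplicatively rather than averaging. Even so, multiplying the two scores above gives a rough estimate far higher than 0.131, which suggests a train/test mismatch between the stages rather than simple error compounding:

```python
# Rough upper-bound estimate for a two-stage cascade, assuming the
# SPJ's errors are independent of the SSG's retrieval errors.
ssg_recall = 0.87   # fraction of relevant support sets the SSG retrieves
spj_f1 = 0.89       # SPJ quality measured on gold support sets

# Errors compound multiplicatively across the cascade.
pipeline_estimate = ssg_recall * spj_f1
print(round(pipeline_estimate, 3))  # 0.774, still far above the observed 0.131
```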
Thanks :)