Hi, in the paper they mention a paragraph about persistent memory and the results they got with it. I was wondering how to keep the external memory persistent, rather than wiping it at the start of each episode, when the label space differs across episodes. Is it even possible to work around this?
Hi! Here I am indeed resetting the memory at the beginning of each episode. To make the memory persistent across episodes, you can turn the initial memory M_0 into a tensor and return the final memory state from memory_augmented_neural_network (maybe l_ntm_var[0][-1]).
Let me know if you have any trouble making that change.
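The idea above can be sketched in plain NumPy, independent of any framework. `run_episode` below is a hypothetical stand-in for the model's forward pass (the real `memory_augmented_neural_network` in this repo is a Theano/Lasagne graph); the point is only the control flow: the final memory of one episode is fed back in as the next episode's M_0 instead of re-initializing it to zeros.

```python
import numpy as np

np.random.seed(0)

def run_episode(memory, episode_data):
    """Toy stand-in for one episode's forward pass.

    Reads from and writes to `memory`, then returns the final
    memory state (analogous to returning the last memory tensor
    from memory_augmented_neural_network).
    """
    for x in episode_data:
        # Toy write rule: blend the input into the slot with the
        # smallest norm (a crude "least used" heuristic).
        slot = np.argmin(np.linalg.norm(memory, axis=1))
        memory[slot] = 0.5 * memory[slot] + 0.5 * x
    return memory

memory_slots, memory_size = 128, 40
memory = np.zeros((memory_slots, memory_size))  # M_0, created once

episodes = [np.random.randn(16, memory_size) for _ in range(3)]
for ep in episodes:
    # Persistent memory: the previous episode's final state becomes
    # the next episode's initial state -- no reset between episodes.
    memory = run_episode(memory, ep)
```

Because the memory stores key/value associations rather than class logits, carrying it over stays well-defined even when the label space changes between episodes; only the read/write content, not the output layer, is persisted.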