Hi there, I am trying to quantize my input feature `sparse_feat` with the following code in my network.

However, I've observed that in the first batch the codes I got were uniformly distributed, while in the second and following batches the codes and features within `codes_sparse` and `quantized_sparse` were almost all the same. The following are the codes I got for the second batch.

Note that the input features within `sparse_feat` are not alike, so I am wondering what could be going wrong here. Am I not configuring the quantization procedure correctly?

Looking forward to your suggestions. Thanks in advance.