the Usage sample got an invalid size error #40
Comments
Did you use this code?

import torch
import torch.nn as nn
from conformer import Conformer

batch_size, sequence_length, dim = 3, 12345, 80

cuda = torch.cuda.is_available()
device = torch.device('cuda' if cuda else 'cpu')

inputs = torch.rand(batch_size, sequence_length, dim).to(device)
input_lengths = torch.IntTensor([12345, 12300, 12000])
targets = torch.LongTensor([[1, 3, 3, 3, 3, 3, 4, 5, 6, 2],
                            [1, 3, 3, 3, 3, 3, 4, 5, 2, 0],
                            [1, 3, 3, 3, 3, 3, 4, 2, 0, 0]]).to(device)
target_lengths = torch.LongTensor([9, 8, 7])

model = nn.DataParallel(Conformer(num_classes=10, input_dim=dim,
                                  encoder_dim=32, num_encoder_layers=3,
                                  decoder_dim=32)).to(device)

# Forward propagate
outputs = model(inputs, input_lengths, targets, target_lengths)

# Recognize input speech
outputs = model.module.recognize(inputs, input_lengths)
Yes.
Do you want to try it without nn.DataParallel?
Thx! It works now.

batch_size, sequence_length, dim = 3, 12345, 80

cuda = torch.cuda.is_available()
device = torch.device('cuda' if cuda else 'cpu')

inputs = torch.rand(batch_size, sequence_length, dim).to(device)

model = Conformer(num_classes=10, input_dim=dim,
                  encoder_dim=32, num_encoder_layers=3,
                  decoder_dim=32).to(device)

# Forward propagate
outputs = model(inputs, input_lengths, targets, target_lengths)

# Recognize input speech
outputs = model.recognize(inputs, input_lengths)
It runs normally for me. That's weird.
I have installed the package, and after running the usage sample I get the following error:
RuntimeError: Gather got an input of invalid size: got [1, 3085, 8, 10], but expected [1, 3085, 9, 10]
Could you tell me where my mistake is?
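For context on where a "Gather got an input of invalid size" error comes from: nn.DataParallel scatters the batch across GPUs, runs a model replica on each shard, and then gathers the per-replica outputs by concatenating them along the batch dimension. That concatenation requires every non-batch dimension to match across replicas; if the replicas produce outputs whose shape depends on their shard's data (here, presumably the per-shard target lengths), the gather fails. A toy sketch of that constraint, with no GPUs required (`gather` below is a hypothetical stand-in, not the torch internal):

```python
def gather(replica_shapes):
    """Toy stand-in for DataParallel's gather: concatenate along dim 0.

    Each replica output is represented only by its shape (a tuple).
    All trailing (non-batch) dimensions must agree across replicas,
    otherwise we raise the same kind of size error as in this issue.
    """
    first = replica_shapes[0]
    for shape in replica_shapes[1:]:
        if shape[1:] != first[1:]:
            raise RuntimeError(
                f"Gather got an input of invalid size: got {list(shape)}, "
                f"but expected trailing dimensions {list(first[1:])}"
            )
    # Batch dimensions add up; trailing dimensions are shared.
    batch = sum(shape[0] for shape in replica_shapes)
    return (batch,) + first[1:]

# Matching trailing dims: gather succeeds, batch dims are summed.
print(gather([(2, 3085, 9, 10), (1, 3085, 9, 10)]))  # -> (3, 3085, 9, 10)

# Mismatched dim 2 (as in the reported error): gather fails.
try:
    gather([(1, 3085, 9, 10), (1, 3085, 8, 10)])
except RuntimeError as e:
    print(e)
```

Running the model without nn.DataParallel sidesteps the gather entirely, which is why the suggestion below fixes the error.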