RuntimeError: weight tensor should be defined either for all 8 classes or no classes but got weight tensor of shape: [1, 8] #668

Open
makaay2077 opened this issue Jan 7, 2025 · 0 comments
Labels
question Further information is requested

makaay2077 commented Jan 7, 2025


My Question

I previously trained on my own data and got good results; everything worked fine. However, when I try to retrain the same data with the same parameters a few months later, I get the following error. What could be the cause, and how can I fix it?

RuntimeError Traceback (most recent call last)
Cell In[3], line 18
15 pipeline = SemanticSegmentation(model=model, dataset=dataset, **cfg.pipeline , device='cuda') #max_epoch=100,
17 # prints training progress in the console.
---> 18 pipeline.run_train()

File ~/Masaüstü/Open3D-ML-0.17.0/ml3d/torch/pipelines/semantic_segmentation.py:410, in SemanticSegmentation.run_train(self)
408 self.optimizer.zero_grad()
409 results = model(inputs['data'])
--> 410 loss, gt_labels, predict_scores = model.get_loss(
411 Loss, results, inputs, device)
413 if predict_scores.size()[-1] == 0:
414 continue

File ~/Masaüstü/Open3D-ML-0.17.0/ml3d/torch/models/randlanet.py:378, in RandLANet.get_loss(self, Loss, results, inputs, device)
373 labels = inputs['data']['labels']
375 scores, labels = filter_valid_label(results, labels, cfg.num_classes,
376 cfg.ignored_label_inds, device)
--> 378 loss = Loss.weighted_CrossEntropyLoss(scores, labels)
380 return loss, labels, scores

File ~/anaconda3/envs/bimtas/lib/python3.9/site-packages/torch/nn/modules/module.py:1194, in Module._call_impl(self, *input, **kwargs)
1190 # If we don't have any hooks, we want to skip the rest of the logic in
1191 # this function, and just call forward.
1192 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1193 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1194 return forward_call(*input, **kwargs)
1195 # Do not call functions when jit is used
1196 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/bimtas/lib/python3.9/site-packages/torch/nn/modules/loss.py:1174, in CrossEntropyLoss.forward(self, input, target)
1173 def forward(self, input: Tensor, target: Tensor) -> Tensor:
-> 1174 return F.cross_entropy(input, target, weight=self.weight,
1175 ignore_index=self.ignore_index, reduction=self.reduction,
1176 label_smoothing=self.label_smoothing)

File ~/anaconda3/envs/bimtas/lib/python3.9/site-packages/torch/nn/functional.py:3026, in cross_entropy(input, target, weight, size_average, ignore_index, reduce, reduction, label_smoothing)
3024 if size_average is not None or reduce is not None:
3025 reduction = _Reduction.legacy_get_string(size_average, reduce)
-> 3026 return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)

RuntimeError: weight tensor should be defined either for all 8 classes or no classes but got weight tensor of shape: [1, 8]
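The message indicates that the per-class weights passed into `CrossEntropyLoss` have an extra leading dimension: shape `[1, 8]` instead of the 1-D `[8]` that PyTorch requires. This can happen when the class weights are computed or loaded as a 2-D array (for example, after a NumPy or PyTorch version change that stops squeezing a singleton dimension). A minimal sketch reproducing the error and the flatten fix, assuming only PyTorch (the tensor names here are illustrative, not from Open3D-ML):

```python
import torch
import torch.nn as nn

num_classes = 8
scores = torch.randn(16, num_classes)          # logits for 16 points
labels = torch.randint(0, num_classes, (16,))  # ground-truth class indices

# A 2-D weight tensor of shape [1, 8] triggers exactly this RuntimeError:
bad_weights = torch.ones(1, num_classes)
try:
    nn.CrossEntropyLoss(weight=bad_weights)(scores, labels)
except RuntimeError as e:
    print(e)  # "weight tensor should be defined either for all 8 classes or no classes ..."

# CrossEntropyLoss expects a 1-D weight tensor of shape [C]:
good_weights = bad_weights.flatten()           # shape [8]
loss = nn.CrossEntropyLoss(weight=good_weights)(scores, labels)
print(loss)  # scalar loss, computed without error
```

If this matches your situation, flattening (or `.squeeze(0)`-ing) the class-weight tensor where the loss is constructed, e.g. in the pipeline's weight setup, should restore the old behavior.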
