[Bugfix] Fix FP8 torch._scaled_mm fallback for torch>2.5 with CUDA<12.4 #649

Triggered via pull request November 7, 2024 00:51 by @comaniac
Event: auto_merge_enabled (#10095)
Status: Success
Total duration: 12s
Artifacts: none

add_label_automerge.yml
on: pull_request_target

add-label-on-auto-merge (2s)
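For context on the underlying fix: torch._scaled_mm is PyTorch's fused FP8 GEMM entry point, and the PR title indicates that with torch > 2.5 it cannot be used as-is when the build's CUDA toolkit is older than 12.4, so a fallback path is needed. Below is a minimal sketch of that general technique, gating the fused call on torch/CUDA versions and otherwise dequantizing to a plain matmul. The helper names, exact version gates, and dtypes are assumptions for illustration, not the PR's actual diff.

```python
# Hypothetical sketch of a version-gated FP8 matmul fallback.
# NOT the PR's actual code; an illustration of the technique named in its title.
import torch
from packaging import version


def _cuda_at_least(major: int, minor: int) -> bool:
    # torch.version.cuda is a string like "12.1", or None on CPU-only builds.
    cuda = torch.version.cuda
    if cuda is None:
        return False
    cuda_major, cuda_minor = (int(x) for x in cuda.split(".")[:2])
    return (cuda_major, cuda_minor) >= (major, minor)


def scaled_mm_with_fallback(
    a_fp8: torch.Tensor,    # FP8 activations, e.g. torch.float8_e4m3fn, [M, K]
    b_fp8: torch.Tensor,    # FP8 weights, [K, N]; _scaled_mm expects column-major
    scale_a: torch.Tensor,  # per-tensor scales (float32 scalars)
    scale_b: torch.Tensor,
    out_dtype: torch.dtype = torch.bfloat16,
) -> torch.Tensor:
    # Assumed gate: take the fused path only on torch >= 2.5 with CUDA >= 12.4;
    # the real PR's version checks may differ.
    fused_ok = (
        version.parse(torch.__version__).release >= (2, 5)
        and _cuda_at_least(12, 4)
    )
    if fused_ok:
        # torch >= 2.5 signature: scale_a/scale_b are required and a single
        # tensor is returned (older releases returned a tuple).
        return torch._scaled_mm(
            a_fp8, b_fp8, scale_a=scale_a, scale_b=scale_b, out_dtype=out_dtype
        )
    # Fallback: dequantize to a wider dtype and run an ordinary matmul.
    a = a_fp8.to(out_dtype) * scale_a
    b = b_fp8.to(out_dtype) * scale_b
    return torch.mm(a, b).to(out_dtype)
```

The fallback trades the fused FP8 kernel for two casts and a standard GEMM: slower, but numerically equivalent up to dequantization rounding, which is what a compatibility shim for older CUDA toolkits needs.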