[Bugfix] Fix FP8 torch._scaled_mm fallback for torch>2.5 with CUDA<12.4 #649

CI check `add-label-on-auto-merge` succeeded Nov 7, 2024 in 2s