
robertgshaw2-neuralmagic triggered nightly on refs/heads/upstream-sync-2024-06-23 #30

Manually triggered: June 24, 2024 23:29
Status: Cancelled
Total duration: 3h 11m 51s
Artifacts: 16

nm-nightly.yml

on: workflow_dispatch
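
For reference, a minimal sketch of how a manually triggerable nightly workflow can be declared. The full contents of nm-nightly.yml are not visible on this page, so everything beyond the workflow_dispatch trigger is an illustrative assumption rather than the project's actual configuration.

    name: nm-nightly

    on:
      workflow_dispatch:       # manual trigger, matching how this run was started
      schedule:
        - cron: "0 2 * * *"    # assumed nightly cron schedule; not confirmed by this page
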
Jobs

Job                                           Duration
PYTHON-3-10 / BENCHMARK / BENCHMARK           19m 20s
PYTHON-3-10 / TEST-SOLO / TEST                57m 45s
PYTHON-3-10 / LM-EVAL-SOLO / LM-EVAL          16m 36s
PYTHON-3-11 / BENCHMARK / BENCHMARK           18m 58s
PYTHON-3-11 / TEST-SOLO / TEST                1h 0m
PYTHON-3-11 / LM-EVAL-SOLO / LM-EVAL          19m 34s
PYTHON-3-8 / BENCHMARK / BENCHMARK            18m 13s
PYTHON-3-8 / TEST-SOLO / TEST                 1h 0m
PYTHON-3-8 / LM-EVAL-SOLO / LM-EVAL           19m 20s
PYTHON-3-9 / BENCHMARK / BENCHMARK            21m 15s
PYTHON-3-9 / TEST-SOLO / TEST                 1h 1m
PYTHON-3-9 / LM-EVAL-SOLO / LM-EVAL           19m 56s
PYTHON-3-10 / BENCHMARK / BENCHMARK_REPORT    30s
PYTHON-3-11 / BENCHMARK / BENCHMARK_REPORT    29s
PYTHON-3-8 / BENCHMARK / BENCHMARK_REPORT     28s
PYTHON-3-9 / BENCHMARK / BENCHMARK_REPORT     39s
PYTHON-3-10 / UPLOAD / PUBLISH
PYTHON-3-11 / UPLOAD / PUBLISH
PYTHON-3-8 / UPLOAD / PUBLISH
PYTHON-3-9 / UPLOAD / PUBLISH
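
The job names above suggest the same pipeline fanned out over Python 3.8, 3.9, 3.10, and 3.11, with BENCHMARK, TEST, LM-EVAL, BENCHMARK_REPORT, and PUBLISH stages per version. A rough sketch of how such a fan-out could be expressed with a build matrix follows; it is hypothetical, since the nested job names indicate the real nm-nightly.yml composes reusable workflows rather than a single inline matrix, and the runner label and commands below are placeholders.

    jobs:
      TEST:
        strategy:
          matrix:
            python: ["3.8", "3.9", "3.10", "3.11"]   # versions taken from the job names above
        runs-on: ubuntu-latest                       # placeholder; artifact names below hint at GCP k8s L4 runners
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: ${{ matrix.python }}
          - run: |                                   # illustrative commands, not the project's actual test entry point
              pip install -e .
              pytest
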

Annotations

4 errors
PYTHON-3-9 / TEST-SOLO / TEST: The run was canceled by @robertgshaw2-neuralmagic.
PYTHON-3-11 / TEST-SOLO / TEST: The run was canceled by @robertgshaw2-neuralmagic.
PYTHON-3-8 / TEST-SOLO / TEST: The run was canceled by @robertgshaw2-neuralmagic.
PYTHON-3-10 / TEST-SOLO / TEST: The run was canceled by @robertgshaw2-neuralmagic.

Artifacts

Produced during runtime
Name                                                                    Size        Status
3.10.12-nm-vllm-nightly-0.5.1.20240625.tar.gz                           698 KB      Expired
3.11.4-nm-vllm-nightly-0.5.1.20240625.tar.gz                            698 KB      Expired
3.8.17-nm-vllm-nightly-0.5.1.20240625.tar.gz                            698 KB      Expired
3.9.17-nm-vllm-nightly-0.5.1.20240625.tar.gz                            698 KB      Expired
9653854902-gcp-k8s-l4-solo-3.10.12                                      1.11 KB     Expired
9653854902-gcp-k8s-l4-solo-3.11.4                                       1.11 KB     Expired
9653854902-gcp-k8s-l4-solo-3.8.17                                       1.12 KB     Expired
9653854902-gcp-k8s-l4-solo-3.9.17                                       1.11 KB     Expired
gh_action_benchmark_jsons-9653854902-gcp-k8s-l4-solo-3.10.12            1 KB        Expired
gh_action_benchmark_jsons-9653854902-gcp-k8s-l4-solo-3.11.4             1021 Bytes  Expired
gh_action_benchmark_jsons-9653854902-gcp-k8s-l4-solo-3.8.17             1 KB        Expired
gh_action_benchmark_jsons-9653854902-gcp-k8s-l4-solo-3.9.17             1 KB        Expired
nm_vllm_nightly-0.5.1.20240624-cp39-cp39-manylinux_2_17_x86_64.whl      179 MB      Expired
nm_vllm_nightly-0.5.1.20240625-cp310-cp310-manylinux_2_17_x86_64.whl    179 MB      Expired
nm_vllm_nightly-0.5.1.20240625-cp311-cp311-manylinux_2_17_x86_64.whl    179 MB      Expired
nm_vllm_nightly-0.5.1.20240625-cp38-cp38-manylinux_2_17_x86_64.whl      179 MB      Expired