Commit: [CI] - add pull request template and CI testing automation with github actions (#2)
Showing 5 changed files with 279 additions and 2 deletions.
@@ -0,0 +1,26 @@
## Motivation and Context

<!--- Why is this change required? What problem does it solve? -->
<!--- Link to existing issues or related PRs here if they exist. -->

## How Has This Been Tested

<!--- Please describe here how your modifications have been tested. -->

## Types of changes

<!--- What types of changes does your code introduce? Put an `x` in all the boxes that apply: -->
- [ ] Docs change / refactoring / dependency upgrade
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Checklist

<!--- Go over all the following points, and put an `x` in all the boxes that apply. -->

- [ ] My code follows the code style of this project.
- [ ] My change requires a change to the documentation.
- [ ] I have updated the documentation accordingly.
- [ ] I have added tests to cover my changes.
- [ ] All new and existing tests passed.
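For illustration only, a PR description filled in from this template might read as in the sketch below; the motivation text and checked boxes are invented, not taken from any real PR.

```markdown
## Motivation and Context

<!-- hypothetical example text, not from a real PR -->
Adds a pull request template and GitHub Actions CI so that every PR is linted and tested automatically.

## How Has This Been Tested

Ran the new workflow on a fork and confirmed that the pre-commit, lint, and install/test jobs all pass.

## Types of changes

- [x] New feature (non-breaking change which adds functionality)

## Checklist

- [x] My code follows the code style of this project.
- [x] I have added tests to cover my changes.
- [x] All new and existing tests passed.
```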
@@ -0,0 +1,62 @@
name: install_ubuntu_deps
runs:
  using: composite
  steps:
    - name: install cmake
      # OpenEXR requires CMake 3.12.
      run: |
        echo $(date +%F) > ./date
        echo $(git ls-remote https://github.com/facebookresearch/partnr-planner.git HEAD | awk '{ print $1}') > ./partnr_sha
        cat ./partnr_sha
        wget https://cmake.org/files/v3.12/cmake-3.12.4-Linux-x86_64.sh
        sudo mkdir /opt/cmake312
        sudo sh ./cmake-3.12.4-Linux-x86_64.sh --prefix=/opt/cmake312 --skip-license
        sudo ln -s /opt/cmake312/bin/cmake /usr/local/bin/cmake
        sudo ln -s /opt/cmake312/bin/ctest /usr/local/bin/ctest
      shell: bash
    - name: Install dependencies
      run: |-
        echo "Install dependencies"
        sudo apt-get update || true
        sudo apt-get install -y --no-install-recommends \
          build-essential \
          git \
          git-lfs \
          curl \
          vim \
          ca-certificates \
          libjpeg-dev \
          libglm-dev \
          libegl1-mesa-dev \
          ninja-build \
          xorg-dev \
          freeglut3-dev \
          pkg-config \
          wget \
          zip \
          lcov \
          libhdf5-dev \
          libomp-dev \
          unzip || true
        # install doc build deps
        sudo apt install --yes --allow-change-held-packages \
          texlive-base \
          texlive-latex-extra \
          texlive-fonts-extra \
          texlive-fonts-recommended
      shell: bash
    - name: Setup miniconda
      uses: conda-incubator/setup-miniconda@…
      with:
        miniconda-version: "latest"
        python-version: "3.9"
        activate-environment: "partnr"
    - name: Install conda dependencies
      run: |-
        echo "Install conda and dependencies"
        conda install -q -y mkl==2021.4.0
        conda install -q -y -c conda-forge ninja ccache
        conda install -q -y -c conda-forge libglib=2.76.1 glib=2.76.1 glib-tools=2.76.1
        conda install -y -c conda-forge numpy==1.26.4 pytest pytest-cov hypothesis pytest-mock
        pip install pytest-sugar pytest-xdist pytest-benchmark opencv-python cython mock
      shell: bash -el {0}
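For context, a composite action like this one is consumed by checking the repository out and pointing `uses:` at the action directory, as the workflow later in this commit does. A minimal sketch (the job name and runner label are illustrative, and the checkout action version is assumed, not taken from this commit):

```yaml
jobs:
  example_job:                # illustrative job name
    runs-on: ubuntu-latest    # illustrative runner label
    steps:
      - uses: actions/checkout@v4    # assumed version of the checkout action
        with:
          path: "./partnr-planner"
      # Run the composite action defined above from the checked-out repository
      - uses: "./partnr-planner/.github/actions/install_ubuntu_deps"
```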
@@ -0,0 +1,24 @@
# This action installs required Ubuntu NVIDIA dependencies.
# It assumes that `nvidia-smi` is available on the system.
name: "Install GPU Dependencies"
runs:
  using: composite
  steps:
    - name: Install GPU Dependencies
      shell: bash
      run: |
        NVIDIA_SMI_OUTPUT=$(nvidia-smi)
        # Extract CUDA version (e.g., 11.7)
        CUDA_VERSION=$(echo "$NVIDIA_SMI_OUTPUT" | grep -oP 'CUDA Version:\s+\K[\d.]+')
        FULL_DRIVER_VERSION=$(echo "$NVIDIA_SMI_OUTPUT" | grep -oP 'Driver Version:\s+\K[\d.]+')
        MAJOR_DRIVER_VERSION=$(echo "$NVIDIA_SMI_OUTPUT" | grep -oP 'Driver Version:\s+\K\d+')
        UBUNTU_PACKAGE_SUFFIX="${MAJOR_DRIVER_VERSION}=${FULL_DRIVER_VERSION}-0ubuntu1"
        sudo apt-get update
        sudo apt-get install -y cuda-toolkit-${CUDA_VERSION}
        sudo apt-get install -y nvidia-gds-${CUDA_VERSION}
        sudo apt-get install -y libnvidia-common-${UBUNTU_PACKAGE_SUFFIX}
        sudo apt-get install -y libnvidia-gl-${UBUNTU_PACKAGE_SUFFIX}
        sudo apt-get install -y libnvidia-compute-${UBUNTU_PACKAGE_SUFFIX} --allow-downgrades
        sudo apt-get install -y libnvidia-decode-${UBUNTU_PACKAGE_SUFFIX}
        sudo apt-get install -y libnvidia-encode-${UBUNTU_PACKAGE_SUFFIX}
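As a sanity check on the `grep -oP ... \K` extraction above, the sketch below runs the same patterns on an illustrative `nvidia-smi` header line; the driver and CUDA version numbers here are made up for demonstration.

```bash
#!/usr/bin/env bash
# Illustrative nvidia-smi header line; the version numbers below are invented.
SAMPLE='| NVIDIA-SMI 535.104.05    Driver Version: 535.104.05    CUDA Version: 12.2   |'

echo "$SAMPLE" | grep -oP 'CUDA Version:\s+\K[\d.]+'     # prints: 12.2
echo "$SAMPLE" | grep -oP 'Driver Version:\s+\K[\d.]+'   # prints: 535.104.05
echo "$SAMPLE" | grep -oP 'Driver Version:\s+\K\d+'      # prints: 535

# With these sample values the package suffix assembled above would be:
#   535=535.104.05-0ubuntu1
```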
@@ -0,0 +1,165 @@
name: Install and test
on:
  pull_request: {}
  push:
    branches:
      - main
    tags: [ "v*" ]
  schedule:
    - cron: "0 5 * * *"
    # 05:00 UTC, i.e. 9 PM PST

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@…
      - name: Setup python
        uses: actions/setup-python@…
        with:
          python-version: '3.9.16'
      - name: install dependencies
        run: |-
          pip install -U pip setuptools pre-commit
          # Install the hooks now so that they'll be cached
          pre-commit install-hooks
      - name: Check Code Style using pre-commit
        run: |-
          SKIP=eslint pre-commit run --show-diff-on-failure --all-files
  python_lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@…
      - name: Setup python
        uses: actions/setup-python@…
        with:
          python-version: '3.9.16'
      - name: setup
        run: |-
          pip install black==23.1.0 --progress-bar off
          pip install "isort[pyproject]" numpy --progress-bar off
          pip install mypy==0.991 types-mock types-Pillow types-tqdm types-PyYAML --progress-bar off
          pip install -r requirements.txt --progress-bar off
      - name: run black
        run: |-
          black --version
          ls -la
          black --exclude '/(\.eggs|\.git|\.hg|\.mypy_cache|\.nox|\.tox|\.venv|_build|buck-out|build|dist)' habitat_llm/ dataset_generation/ --diff
          black --exclude '/(\.eggs|\.git|\.hg|\.mypy_cache|\.nox|\.tox|\.venv|_build|buck-out|build|dist)' habitat_llm/ dataset_generation/ --check
      - name: run isort
        run: |-
          isort --version
          isort habitat_llm/. dataset_generation/. --diff
          isort habitat_llm/. dataset_generation/. --check-only
  install_and_test_ubuntu:
    runs-on: 4-core-ubuntu-gpu-t4
    defaults:
      run:
        shell: bash -el {0}
    steps:
      - uses: actions/checkout@…
        with:
          path: "./partnr-planner"
      - uses: "./partnr-planner/.github/actions/install_ubuntu_deps"
      - uses: "./partnr-planner/.github/actions/install_ubuntu_gpu_deps"
      - name: Install pytorch
        run: |-
          export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
          conda activate partnr
          conda install -y pytorch==2.4.1 torchvision==0.19.1 torchaudio==2.4.1 pytorch-cuda=12.4 -c pytorch -c nvidia
          echo "Validating Pytorch Installation"
          # Check that pytorch is installed with CUDA.
          python -c 'import torch; torch.cuda.set_device(0)'
      - name: Install habitat-sim version tag
        run: |-
          # give cmake ownership to the runner for installation
          sudo chown runner -R /opt/cmake312/
          # activate conda env
          export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
          conda activate partnr
          conda install habitat-sim=0.3.2 withbullet headless -c conda-forge -c aihabitat -y
      - name: Download test data
        run: |-
          # Disable clone protection for git lfs
          export GIT_CLONE_PROTECTION_ACTIVE=false
          git --version
          git-lfs --version
          export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
          conda init
          source ~/.bashrc
          conda activate partnr
          conda install -y gitpython git-lfs
          cd partnr-planner
          git lfs install
          # get the standard test assets from the downloader
          python -m habitat_sim.utils.datasets_download --uids ci_test_assets hab_spot_arm rearrange_task_assets hab3_bench_assets --data-path data/ --no-replace --no-prune
          ls -la data/scene_datasets/habitat-test-scenes/
          ln -s versioned_data/hab3_bench_assets/humanoids/ data/humanoids
          # TODO: replace these specific downloads with dataset downloader calls with next version update
          # Get HSSD mini dataset and OVMM_objects for testing
          git clone https://huggingface.co/datasets/ai-habitat/OVMM_objects data/objects_ovmm --recursive
          cd data/objects_ovmm
          git lfs pull
          cd ../..
          git clone https://huggingface.co/datasets/ai-habitat/hssd-partnr-ci data/versioned_data/hssd-partnr-ci
          cd data/versioned_data/hssd-partnr-ci
          git lfs pull
          cd ../../..
          ln -s versioned_data/hssd-partnr-ci data/hssd-partnr-ci
          # Get skills and the episode dataset for testing
          git clone --single-branch --branch ci https://huggingface.co/datasets/ai-habitat/partnr_episodes data/versioned_data/partnr_episodes
          cd data/versioned_data/partnr_episodes
          git lfs pull
          cd ../../..
          # post-process symlinking for convenience
          cd data
          # Create a folder for skills
          ln -s versioned_data/partnr_episodes/checkpoints models
          # Create a folder for RAG dataset to set RAG
          ln -s versioned_data/partnr_episodes/test_rag test_rag
          # Create a folder for episode datasets
          mkdir -p datasets
          ln -s ../versioned_data/partnr_episodes datasets/partnr_episodes
          cd ..
      - name: Install submodules and partnr
        run: |-
          # give cmake ownership to the runner for installation
          sudo chown runner -R /opt/cmake312/
          # activate conda env
          export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
          conda activate partnr
          cd partnr-planner
          # clone submodules
          git submodule sync
          git submodule update --init --recursive
          # Install submodules
          pip install -e third_party/habitat-lab/habitat-lab
          pip install -e third_party/habitat-lab/habitat-baselines
          pip install -e third_party/transformers-CFG
          # Install requirements
          pip install -r requirements.txt
          # install the library
          pip install -e .
      - name: run tests
        run: |-
          export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
          conda activate partnr
          cd partnr-planner
          python -m pytest habitat_llm/tests
          python -m pytest dataset_generation/tests
      # NOTE: use the step below to debug with ssh: simply move it just before the failing step to intercept the workflow
      #- name: Debugging with tmate
      #  uses: mxschmitt/action-tmate@…
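Assuming the `partnr` conda environment and the `data/` downloads from the steps above are already in place, the final test stage can be reproduced locally with roughly the following sketch:

```bash
# Rough local equivalent of the "run tests" step above; assumes the "partnr"
# conda environment and the test data have been set up as in the workflow.
conda activate partnr
cd partnr-planner
python -m pytest habitat_llm/tests
python -m pytest dataset_generation/tests
```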