Update RHOAI docs to reflect changes coming Jan 7
dystewart committed Dec 11, 2024
1 parent 4aba4fb commit 1b6c08c
Showing 2 changed files with 3 additions and 3 deletions.
@@ -64,7 +64,7 @@ On the Create workbench page, complete the following information.

- Notebook image (Image selection)

-- Deployment size (Container size and Number of GPUs)
+- Deployment size (Container size, Type and Number of GPUs)

- Environment variables

@@ -82,7 +82,7 @@ On the Create workbench page, complete the following information.
resources, including CPUs and memory. Each container size comes with pre-configured
CPU and memory resources.

-Optionally, you can specify the desired **Number of GPUs** depending on the
+Optionally, you can specify the desired **Accelerator** and **Number of Accelerators** (GPUs), depending on the
nature of your data analysis and machine learning code requirements. However,
this number should not exceed the GPU quota specified by the value of the
"**OpenShift Request on GPU Quota**" attribute that has been approved for
@@ -97,7 +97,7 @@ Once you have entered the information for your workbench, click **Create**.
![Fill Workbench Information](images/tensor-flow-workbench.png)

For our example project, let's name it "Tensorflow Workbench". We'll select the
-**TensorFlow** image, choose a **Deployment size** of **Small**, **Number of GPUs**
+**TensorFlow** image, choose a **Deployment size** of **Small**, **Accelerator** of **NVIDIA A100 GPU**, **Number of Accelerators**
as **1** and allocate a **Cluster storage** space of **1GB**.

!!! info "More About Cluster Storage"
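The hunk above configures the example workbench with a single NVIDIA A100 GPU. As a quick sanity check once such a workbench is running, a snippet along the lines of the sketch below can confirm that the allocated accelerator is actually visible to TensorFlow; it assumes the selected notebook image ships a GPU-enabled TensorFlow build.

```python
# Minimal sanity check to run in a notebook cell inside the new workbench.
# Assumes the TensorFlow notebook image includes a GPU-enabled TensorFlow
# build and that one GPU was allocated via the workbench settings.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow {tf.__version__} detects {len(gpus)} GPU(s): {gpus}")
```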
