Commit a903546

docs: Update GPU table column headers and clarify runpodctl GPU ID requirements (#459)
Co-authored-by: promptless[bot] <179508745+promptless[bot]@users.noreply.github.com>
1 parent e5ec87c commit a903546

File tree: 3 files changed (+5, -5 lines)


references/gpu-types.mdx

Lines changed: 1 addition & 1 deletion
````diff
@@ -10,7 +10,7 @@ For information on pricing, see [GPU pricing](https://www.runpod.io/gpu-instance
 This table lists all GPU types available on Runpod:
 {/* Table last generated: 2025-08-23 */}
 
-| GPU Name | GPU ID | Memory (GB) |
+| GPU ID | Display Name | Memory (GB) |
 |---------------------------------------------------|--------------------------|---------------|
 | AMD Instinct MI300X OAM | MI300X | 192 |
 | NVIDIA A100 80GB PCIe | A100 PCIe | 80 |
````

runpodctl/reference/runpodctl-create-pod.mdx

Lines changed: 2 additions & 2 deletions
````diff
@@ -16,7 +16,7 @@ Create a Pod with 2 RTX 4090 GPUs in the Secure Cloud with a custom container im
 ```sh
 runpodctl create pod \
   --name "my-training-pod" \
-  --gpuType "RTX 4090" \
+  --gpuType "NVIDIA GeForce RTX 3090" \
   --gpuCount 2 \
   --secureCloud \
   --imageName "runpod/pytorch:2.0.1-py3.10-cuda11.8.0-devel" \
@@ -31,7 +31,7 @@ A custom name for your Pod to make it easy to identify and reference.
 </ResponseField>
 
 <ResponseField name="--gpuType" type="string">
-The GPU type to use for the Pod (e.g., `RTX 4090`, `A100 80GB`, `H100 SXM`). Use the GPU ID from the [GPU types reference](/references/gpu-types) table to specify the GPU type.
+The GPU type to use for the Pod (e.g., `NVIDIA GeForce RTX 4090`, `NVIDIA B200`, `NVIDIA L40S`). Use the GPU ID (long form) from the [GPU types reference](/references/gpu-types) table to specify the GPU type.
 </ResponseField>
 
 <ResponseField name="--gpuCount" type="integer" default={1}>
````
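To illustrate the clarified requirement, here is a minimal sketch of a create-pod call that passes the long-form GPU ID rather than the short display name. The Pod name is an illustrative placeholder, not a value from this commit; the flags and image are the ones shown in the diff above.

```sh
# Minimal sketch: --gpuType takes the long-form GPU ID from the first column
# of the updated GPU types reference table (e.g. "NVIDIA GeForce RTX 4090"),
# not the short display name ("RTX 4090").
runpodctl create pod \
  --name "example-pod" \
  --gpuType "NVIDIA GeForce RTX 4090" \
  --gpuCount 1 \
  --secureCloud \
  --imageName "runpod/pytorch:2.0.1-py3.10-cuda11.8.0-devel"
```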

runpodctl/reference/runpodctl-create-pods.mdx

Lines changed: 2 additions & 2 deletions
````diff
@@ -17,7 +17,7 @@ Create 3 identical Pods with the name "training-worker" in the Secure Cloud:
 runpodctl create pods
   --name "training-worker" \
   --podCount 3 \
-  --gpuType "A100 80GB" \
+  --gpuType "NVIDIA GeForce RTX 3090" \
   --gpuCount 1 \
   --secureCloud \
   --imageName "runpod/pytorch:2.0.1-py3.10-cuda11.8.0-devel"
@@ -34,7 +34,7 @@ The number of Pods to create.
 </ResponseField>
 
 <ResponseField name="--gpuType" type="string">
-The GPU type to use for the Pod (e.g., `RTX 4090`, `A100 80GB`, `H100 SXM`). Use the GPU ID from the [GPU types reference](/references/gpu-types) table to specify the GPU type.
+The GPU type to use for the Pods (e.g., `NVIDIA GeForce RTX 4090`, `NVIDIA B200`, `NVIDIA L40S`). Use the GPU ID (long form) from the [GPU types reference](/references/gpu-types) table to specify the GPU type.
 </ResponseField>
 
 <ResponseField name="--gpuCount" type="integer" default={1}>
````
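The same long-form GPU ID requirement applies to the batch form: every Pod created by `runpodctl create pods` uses the one `--gpuType` value you pass. A minimal sketch follows; the name and Pod count are illustrative placeholders, and the remaining flags and image are taken from the diff above.

```sh
# Minimal sketch: all Pods in the batch share the same long-form GPU ID
# (e.g. "NVIDIA GeForce RTX 3090") from the GPU types reference table.
runpodctl create pods \
  --name "example-workers" \
  --podCount 2 \
  --gpuType "NVIDIA GeForce RTX 3090" \
  --gpuCount 1 \
  --secureCloud \
  --imageName "runpod/pytorch:2.0.1-py3.10-cuda11.8.0-devel"
```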
