[programming][machine_learning] Ollama environment variables #1206

@LukeShortCloud

Allow more than one model to be loaded at a time (the default is 1):

OLLAMA_MAX_LOADED_MODELS=3
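As a sketch of how this would be applied (assuming a systemd-managed install with the stock `ollama.service` unit name, which may differ on your system):

```shell
# Sketch: raise the loaded-model limit for the current session.
export OLLAMA_MAX_LOADED_MODELS=3

# To persist it for a systemd-managed server (assumed unit name),
# add the variable under [Service] via a drop-in override:
#   sudo systemctl edit ollama.service
#     [Service]
#     Environment="OLLAMA_MAX_LOADED_MODELS=3"
#   sudo systemctl restart ollama.service
```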

Force Vulkan support (OLLAMA_VULKAN defaults to 0):

OLLAMA_VULKAN=1
# Disable AMD ROCm.
HIP_VISIBLE_DEVICES=-1
ROCR_VISIBLE_DEVICES=-1
# Disable NVIDIA CUDA.
CUDA_VISIBLE_DEVICES=-1

https://github.com/ollama/ollama/issues/13589
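Putting the variables above together, a minimal sketch of starting the server with Vulkan forced on and the ROCm and CUDA backends masked off:

```shell
# Sketch: force the Vulkan backend and hide the GPUs from the
# ROCm and CUDA runtimes, per the variables listed above.
export OLLAMA_VULKAN=1
export HIP_VISIBLE_DEVICES=-1
export ROCR_VISIBLE_DEVICES=-1
export CUDA_VISIBLE_DEVICES=-1

# ollama serve   # then start the server in this environment
```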

Run Ollama with ROCm on unsupported AMD GPUs:
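This section has no variables in the original note. A commonly cited approach (an assumption here, not taken from this issue) is to override the GFX version that ROCm reports, so the runtime loads kernels built for a nearby supported architecture; the value must match your GPU family:

```shell
# Assumption: HSA_OVERRIDE_GFX_VERSION makes ROCm treat the GPU as a
# nearby supported target. 10.3.0 (gfx1030) is only an example for
# RDNA2-class cards; pick the value that matches your hardware.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# ollama serve
```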
