Load more than one model (default is 1):
OLLAMA_MAX_LOADED_MODELS=3
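A minimal sketch of applying this setting: the variable must be set in the environment of the Ollama server process, not the client.

```shell
# Keep up to three models resident in memory at once.
# Assumes ollama is on PATH and is being started manually here.
OLLAMA_MAX_LOADED_MODELS=3 ollama serve
```

If Ollama runs as a systemd service instead, the same variable can be set via `systemctl edit ollama.service` with an `Environment=OLLAMA_MAX_LOADED_MODELS=3` line in the `[Service]` section, followed by restarting the service.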
Force Vulkan support (OLLAMA_VULKAN defaults to 0):
OLLAMA_VULKAN=1
# Disable AMD ROCm.
HIP_VISIBLE_DEVICES=-1
ROCR_VISIBLE_DEVICES=-1
# Disable NVIDIA CUDA.
CUDA_VISIBLE_DEVICES=-1
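The device-visibility variables above can be combined to force CPU-only inference; a sketch (only the variables matching your hardware are actually needed):

```shell
# Hide AMD (HIP/ROCR) and NVIDIA (CUDA) devices from the server
# process so Ollama falls back to the CPU runner.
HIP_VISIBLE_DEVICES=-1 ROCR_VISIBLE_DEVICES=-1 CUDA_VISIBLE_DEVICES=-1 ollama serve
</imports>
```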
https://github.com/ollama/ollama/issues/13589
Run Ollama with ROCm on unsupported AMD GPUs:
- Although AMD users can set HSA_OVERRIDE_GFX_VERSION to report a supported GPU architecture, finding the correct version for a given card is complicated. The official Ollama recommendation is to use Vulkan instead.
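For those who still want to try the override, a hedged sketch: the `10.3.0` value below (the gfx1030 / RDNA2 target) is purely illustrative and must be replaced with a ROCm-supported target close to your actual card, which is exactly the part that is hard to get right.

```shell
# Example only: make ROCm treat the GPU as a gfx1030-class device.
# 10.3.0 is an illustration, not a recommendation; a wrong value
# typically crashes or silently falls back to CPU.
HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve
```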