38 commits
46fd341
Add Kubernetes backend support in segmentation algorithms
SimoneBendazzoli93 Oct 18, 2025
e9eece3
Update README to include Kubernetes support details for remote algori…
SimoneBendazzoli93 Oct 18, 2025
a538d42
Add Kubernetes support in BraTSAlgorithm for inference runs
SimoneBendazzoli93 Oct 18, 2025
cead76b
Add Kubernetes dependency in pyproject.toml
SimoneBendazzoli93 Oct 18, 2025
39ccaff
Update README to reflect parameter name change for PVC mount path in …
SimoneBendazzoli93 Oct 19, 2025
573842e
Add Kubernetes job management functionality in kubernetes.py
SimoneBendazzoli93 Oct 19, 2025
e99a837
Remove debug print statement for base folder name in _download_folder…
SimoneBendazzoli93 Oct 19, 2025
e63c443
Merge branch '114-singularity-integration' into 113-kubernetes-integr…
SimoneBendazzoli93 Oct 20, 2025
ad42a55
Merge branch 'main' into 113-kubernetes-integration
SimoneBendazzoli93 Nov 3, 2025
a59b90e
Add comprehensive unit tests for Kubernetes integration in `test_kube…
SimoneBendazzoli93 Nov 3, 2025
90a16c5
Update `_download_folder_from_pod` function to use absolute path for …
SimoneBendazzoli93 Nov 3, 2025
cf340a2
Add optional `kubernetes_kwargs` parameter to `BraTSAlgorithm` for en…
SimoneBendazzoli93 Nov 3, 2025
f399c7e
Autoformat with black
brainless-bot[bot] Nov 3, 2025
a78638b
Update dependencies in `pyproject.toml` and `poetry.lock` to include …
SimoneBendazzoli93 Nov 3, 2025
bef0219
Merge branch '113-kubernetes-integration' of https://github.com/Brain…
SimoneBendazzoli93 Nov 3, 2025
3527cf1
Refactor Kubernetes integration: remove unused imports, adjust output…
SimoneBendazzoli93 Nov 3, 2025
a7c9761
Update `pyproject.toml` to correct dependency format for `kubernetes`…
SimoneBendazzoli93 Nov 3, 2025
27950c4
Autoformat with black
brainless-bot[bot] Nov 3, 2025
2948475
Enhance `_download_folder_from_pod` function with detailed docstring,…
SimoneBendazzoli93 Nov 3, 2025
14d5083
Merge branch '113-kubernetes-integration' of https://github.com/Brain…
SimoneBendazzoli93 Nov 3, 2025
09fa7a2
Autoformat with black
brainless-bot[bot] Nov 3, 2025
b1d42e4
Update content hash in `poetry.lock` to reflect changes in dependencies.
SimoneBendazzoli93 Nov 3, 2025
eff0050
Merge branch 'main' into 113-kubernetes-integration
neuronflow Nov 5, 2025
fa1e555
Merge branch 'main' into 113-kubernetes-integration
neuronflow Nov 5, 2025
978ca74
Refactor Kubernetes job resource allocation and mount paths
SimoneBendazzoli93 Nov 10, 2025
0f5d10f
Refactor variable names for clarity in Kubernetes job volume mounts
SimoneBendazzoli93 Nov 10, 2025
4143fe4
Update Kubernetes backend validation in BraTSAlgorithm
SimoneBendazzoli93 Nov 10, 2025
0a26019
Refactor command argument return type and improve pod selection logic…
SimoneBendazzoli93 Nov 10, 2025
c1eea9b
Enhance Kubernetes execution instructions in README and update device…
SimoneBendazzoli93 Nov 10, 2025
7c1e5df
Autoformat with black
brainless-bot[bot] Nov 10, 2025
8f457e7
Update run_job function to accept Union types for data_path and outpu…
SimoneBendazzoli93 Nov 10, 2025
20eae49
Refactor Kubernetes backend validation and improve logging messages
SimoneBendazzoli93 Nov 10, 2025
915a556
Autoformat with black
brainless-bot[bot] Nov 10, 2025
379dace
Refactor backend validation and path handling in BraTSAlgorithm and r…
SimoneBendazzoli93 Nov 10, 2025
654bbff
Merge branch '113-kubernetes-integration' of https://github.com/Brain…
SimoneBendazzoli93 Nov 10, 2025
bdd5288
Refactor Kubernetes job functions to improve clarity and add new util…
SimoneBendazzoli93 Nov 10, 2025
23cc758
Enhance Kubernetes job functions with improved documentation and addi…
SimoneBendazzoli93 Nov 10, 2025
56215a3
Autoformat with black
brainless-bot[bot] Nov 10, 2025
33 changes: 33 additions & 0 deletions README.md
@@ -68,6 +68,39 @@ segmenter.infer_single(
)
```

## Kubernetes Support
The BraTS orchestrator also supports Kubernetes for running the algorithms remotely, as an alternative to local execution with Docker or Singularity.
The orchestrator automatically loads your kubeconfig file from the default location (`~/.kube/config`). If your kubeconfig file lives elsewhere, point the `KUBECONFIG` environment variable at it:
```bash
export KUBECONFIG=/path/to/kubeconfig
```
Then pass `kubernetes` (or the corresponding enum value `Backends.KUBERNETES`) as the backend when running inference:
```python
from brats.constants import Backends
segmenter.infer_single(
t1c="path/to/t1c.nii.gz",
output_file="path/to/segmentation.nii.gz",
backend=Backends.KUBERNETES
)
```
By default, as shown above, the algorithm runs in the default Kubernetes namespace, uses the default StorageClass, and automatically creates a 1Gi PersistentVolumeClaim (PVC) to manage input and output data. If needed, you can customize the namespace, PVC name, storage size, storage class, job name, and mount path by passing the corresponding keys via the `kubernetes_kwargs` argument of `infer_single`; the `data_mount_path` parameter determines where the PVC is mounted inside the Pod.
When using Kubernetes, the algorithm is executed inside a Kubernetes Job: the input data is first uploaded to a PersistentVolume, which is mounted into the Pod running the Job. Once the algorithm finishes, the output data is transferred back from the cluster to your local machine.
```python
segmenter.infer_single(
t1c="path/to/t1c.nii.gz",
output_file="path/to/segmentation.nii.gz",
backend=Backends.KUBERNETES,
kubernetes_kwargs={
"namespace": "brats",
"pvc_name": "brats-iwydw55ej7qm-pvc",
"pvc_storage_size": "2Gi",
"pvc_storage_class": "brats-pvc-storage-class",
"job_name": "brats-oxh24nu4dhk9-job",
"data_mount_path": "/data",
}
)
```
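The workflow described above (a PVC for data exchange, a Job running the algorithm container) corresponds to a Kubernetes `batch/v1` Job manifest roughly like the one sketched below. This is an illustrative stand-alone sketch, not the orchestrator's actual internals; the function name, image name, and default values are assumptions:

```python
def build_job_manifest(job_name, image, pvc_name, data_mount_path="/data"):
    """Illustrative sketch: a minimal Job manifest that mounts a PVC for input/output data."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": job_name},
        "spec": {
            "backoffLimit": 0,  # do not retry a failed inference run
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [
                        {
                            "name": "algorithm",
                            "image": image,  # hypothetical algorithm image
                            "volumeMounts": [
                                {"name": "data", "mountPath": data_mount_path}
                            ],
                        }
                    ],
                    "volumes": [
                        {
                            "name": "data",
                            # the PVC created (or supplied via kubernetes_kwargs)
                            "persistentVolumeClaim": {"claimName": pvc_name},
                        }
                    ],
                }
            },
        },
    }


manifest = build_job_manifest(
    job_name="brats-job",
    image="example/algorithm:latest",  # placeholder, not a real BraTS image
    pvc_name="brats-pvc",
)
print(manifest["metadata"]["name"])  # → brats-job
```

A manifest like this would then be submitted with the official `kubernetes` Python client (e.g. `BatchV1Api.create_namespaced_job`), which is the dependency the PR adds to `pyproject.toml`.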

## Available Algorithms and Usage

> [!IMPORTANT]
3 changes: 3 additions & 0 deletions brats/constants.py
@@ -13,6 +13,9 @@ class Backends(str, Enum):
SINGULARITY = "singularity"
"""Run the algorithms using Singularity containers."""

KUBERNETES = "kubernetes"
"""Run the algorithms using Kubernetes Jobs."""


class Task(str, Enum):
"""Available tasks."""
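Because `Backends` subclasses `str`, each member compares equal to its plain string value, which is why the README can speak of passing either `kubernetes` or `Backends.KUBERNETES`. A minimal self-contained mirror of the pattern (the `DOCKER` and `SINGULARITY` values are assumed from the dispatch table elsewhere in the diff):

```python
from enum import Enum


class Backends(str, Enum):
    # Mirrors brats/constants.py for illustration only
    DOCKER = "docker"
    SINGULARITY = "singularity"
    KUBERNETES = "kubernetes"


# str subclassing makes members interchangeable with their string values
assert Backends.KUBERNETES == "kubernetes"
# and the enum can be constructed back from the raw string
assert Backends("kubernetes") is Backends.KUBERNETES
```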
30 changes: 27 additions & 3 deletions brats/core/brats_algorithm.py
@@ -9,6 +9,7 @@

from brats.core.docker import run_container as run_docker_container
from brats.core.singularity import run_container as run_singularity_container
from brats.core.kubernetes import run_job as run_kubernetes_job
from brats.utils.algorithm_config import load_algorithms
from brats.constants import OUTPUT_NAME_SCHEMA, Algorithms, Task, Backends
from brats.utils.data_handling import InferenceSetup
@@ -163,6 +164,7 @@ def _get_backend_runner(self, backend: Backends) -> Optional[Callable]:
backend_dispatch = {
Backends.DOCKER: run_docker_container,
Backends.SINGULARITY: run_singularity_container,
Backends.KUBERNETES: run_kubernetes_job,
}
runner = backend_dispatch.get(backend, None)
return runner
@@ -173,6 +175,7 @@ def _infer_single(
output_file: Path | str,
log_file: Optional[Path | str] = None,
backend: Backends = Backends.DOCKER,
kubernetes_kwargs: Optional[Dict] = None,
) -> None:
"""
Perform a single inference run with the provided inputs and save the output in the specified file.
@@ -181,7 +184,8 @@
inputs (dict[str, Path | str]): Input Images for the task
output_file (Path | str): File to save the output
log_file (Optional[Path | str], optional): Log file with extra information. Defaults to None.
backend (Backends | str, optional): Backend to use for inference. Defaults to Backends.DOCKER.
backend (Backends, optional): Backend to use for inference. Defaults to Backends.DOCKER.
kubernetes_kwargs (Optional[Dict], optional): Optional keyword arguments for Kubernetes Backend. Defaults to None.
"""
with InferenceSetup(log_file=log_file) as (tmp_data_folder, tmp_output_folder):
logger.info(f"Performing single inference")
@@ -199,13 +203,22 @@
runner = self._get_backend_runner(backend)
if runner is None:
raise ValueError(f"Unsupported backend: {backend}")
runner(
runner_kwargs = dict(
algorithm=self.algorithm,
data_path=tmp_data_folder,
output_path=tmp_output_folder,
cuda_devices=self.cuda_devices,
force_cpu=self.force_cpu,
)
if kubernetes_kwargs is not None:
logger.debug(f"Adding Kubernetes kwargs: {kubernetes_kwargs}")
if backend != Backends.KUBERNETES:
raise ValueError(
"Kubernetes kwargs can only be used with the Kubernetes backend."
)
for key, value in kubernetes_kwargs.items():
runner_kwargs[key] = value
runner(**runner_kwargs)
self._process_single_output(
tmp_output_folder=tmp_output_folder,
subject_id=subject_id,
@@ -219,6 +232,7 @@ def _infer_batch(
output_folder: Path | str,
log_file: Optional[Path | str] = None,
backend: Backends = Backends.DOCKER,
kubernetes_kwargs: Optional[Dict] = None,
):
"""Perform a batch inference run with the provided inputs and save the outputs in the specified folder.

@@ -227,6 +241,7 @@
output_folder (Path | str): Folder to save the outputs
log_file (Optional[Path | str], optional): Log file with extra information. Defaults to None.
backend (Backends, optional): Backend to use for inference. Defaults to Backends.DOCKER.
kubernetes_kwargs (Optional[Dict], optional): Optional keyword arguments for Kubernetes Backend. Defaults to None.
"""
with InferenceSetup(log_file=log_file) as (tmp_data_folder, tmp_output_folder):

@@ -246,14 +261,23 @@
if runner is None:
raise ValueError(f"Unsupported backend: {backend}")
# run inference in container
runner(
runner_kwargs = dict(
algorithm=self.algorithm,
data_path=tmp_data_folder,
output_path=tmp_output_folder,
cuda_devices=self.cuda_devices,
force_cpu=self.force_cpu,
internal_external_name_map=internal_external_name_map,
)
if kubernetes_kwargs is not None:
logger.debug(f"Adding Kubernetes kwargs: {kubernetes_kwargs}")
if backend != Backends.KUBERNETES:
raise ValueError(
"Kubernetes kwargs can only be used with the Kubernetes backend."
)
for key, value in kubernetes_kwargs.items():
runner_kwargs[key] = value
runner(**runner_kwargs)

self._process_batch_output(
tmp_output_folder=tmp_output_folder,
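The guard added in both `_infer_single` and `_infer_batch` follows the same pattern: look up the runner for the backend, build a base kwargs dict, and merge in the Kubernetes-only options after validating the backend. A standalone sketch of that logic (names simplified, not the actual `BraTSAlgorithm` code):

```python
from typing import Callable, Dict, Optional


def dispatch(
    backend: str,
    runner_map: Dict[str, Callable],
    base_kwargs: Dict,
    kubernetes_kwargs: Optional[Dict] = None,
):
    """Illustrative re-creation of the backend dispatch/merge logic from the diff."""
    runner = runner_map.get(backend)
    if runner is None:
        raise ValueError(f"Unsupported backend: {backend}")
    kwargs = dict(base_kwargs)  # copy so the caller's dict is untouched
    if kubernetes_kwargs is not None:
        if backend != "kubernetes":
            raise ValueError(
                "Kubernetes kwargs can only be used with the Kubernetes backend."
            )
        kwargs.update(kubernetes_kwargs)  # equivalent to the per-key loop in the PR
    return runner(**kwargs)


result = dispatch(
    "kubernetes",
    {"kubernetes": lambda **kw: kw},  # stand-in runner that echoes its kwargs
    {"data_path": "/tmp/in"},
    {"namespace": "brats"},
)
print(result)  # → {'data_path': '/tmp/in', 'namespace': 'brats'}
```

Rejecting `kubernetes_kwargs` for non-Kubernetes backends before the merge means a misconfigured call fails loudly instead of silently passing unexpected arguments to the Docker or Singularity runner.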