█████╗ ██╗ ██╗ ██████╗ ███████╗██╗ ██████╗ ██╗ ██╗
██╔══██╗██║ ██╔╝██╔═══██╗██╔════╝██║ ██╔═══██╗██║ ██║
███████║█████╔╝ ██║ ██║█████╗ ██║ ██║ ██║██║ █╗ ██║
██╔══██║██╔═██╗ ██║ ██║██╔══╝ ██║ ██║ ██║██║███╗██║
██║ ██║██║ ██╗╚██████╔╝██║ ███████╗╚██████╔╝╚███╔███╔╝
╚═╝ ╚═╝╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚══════╝ ╚═════╝ ╚══╝╚══╝
AkôFlow is an open-source middleware for orchestrating and executing container-based scientific workflows across heterogeneous environments. It was originally developed within the e-Science Research Group at the Institute of Computing, Fluminense Federal University (UFF).
Although initially focused on Kubernetes-based workloads, AkôFlow has evolved to support general containerized execution across multiple infrastructures.
- Operating System: Linux, macOS or WSL2 (Windows Subsystem for Linux)
- Docker: installed and running
- kubectl: installed and available on your PATH
- Kubernetes Cluster: One of the following:
- Kind (local)
- Docker Desktop Kubernetes (enable Kubernetes in settings)
- Cloud providers (e.g., EKS, GKE, AKS)
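You can quickly confirm these prerequisites from a terminal (a minimal sketch using the standard version checks; any recent versions should work):

docker --version
kubectl version --client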
Run the following command to install AkôFlow:
curl -fsSL https://akoflow.com/run | bash
AkôFlow will be available at http://localhost:8080.
- Access the Web Interface
Open your browser and go to the AkôFlow web interface at http://localhost:8080.
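If the page does not load right away, you can check from the terminal that the server is answering on port 8080 (a minimal check; it only verifies that the endpoint responds):

curl -fsS http://localhost:8080 > /dev/null && echo "AkôFlow is reachable at http://localhost:8080"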
- Connect to a Kubernetes Cluster
AkôFlow requires a Kubernetes runtime. You can use one of the following options (for example, Kind, as shown below):
- Kind (local clusters)
- Docker Desktop (enable Kubernetes in settings)
- Cloud providers (e.g., EKS, GKE, AKS)
- Kubeadm (for on-premise clusters)
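For a purely local setup, a Kind cluster can be created like this (a sketch; the cluster name akoflow is an arbitrary choice, and Kind exposes it to kubectl as the context kind-akoflow):

kind create cluster --name akoflow
kubectl cluster-info --context kind-akoflow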
- Apply AkôFlow Resources
Deploy the required AkôFlow resources to your Kubernetes cluster by running the following command:
kubectl apply -f https://raw.githubusercontent.com/UFFeScience/akoflow/main/pkg/server/resource/akoflow-dev-dockerdesktop.yaml
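Once the manifest is applied, you can watch the AkôFlow components come up (assuming the resources are created in the akoflow namespace, the same namespace used in the next step):

kubectl get pods --namespace=akoflow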
- Generate a Service Account Token
Create a token for the AkôFlow service account with the following command:
kubectl create token akoflow-server-sa --duration=800h --namespace=akoflow
- Set Environment Variables
Configure the environment variables for AkôFlow to connect to your Kubernetes cluster:
export K8S_API_SERVER_HOST=https://<your-k8s-api-endpoint>
export K8S_API_SERVER_TOKEN=<your-generated-token>
Replace <your-k8s-api-endpoint> with your Kubernetes API server endpoint and <your-generated-token> with the token generated in the previous step.
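As a convenience, both variables can be derived directly from kubectl, assuming your current kubeconfig context already points at the target cluster (a sketch, not the only way to obtain these values):

export K8S_API_SERVER_HOST=$(kubectl config view --minify -o jsonpath='{.clusters[0].cluster.server}')
export K8S_API_SERVER_TOKEN=$(kubectl create token akoflow-server-sa --duration=800h --namespace=akoflow)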
AkôFlow Demonstration (In Portuguese)
Supported execution environments:
- Kubernetes (public cloud providers: AWS, GCP, Azure, etc.)
- Singularity (for local or HPC isolated execution)
- SDumont supercomputer (LNCC - Brazil)
- D.Sc. Daniel de Oliveira — Research Advisor
- Wesley Ferreira (@ovvesley) — Maintainer — IC/UFF
- Liliane Kunstmann — COPPE/UFRJ
- Debora Pina — COPPE/UFRJ
- Raphael Garcia — IC/UFF
- Yuri Frota — IC/UFF
- Marcos Bedo — IC/UFF
- Aline Paes — IC/UFF
- Luan Teylo — INRIA/Université de Bordeaux
- Ferreira, W., Kunstmann, L., Paes, A., Bedo, M., & de Oliveira, D. (2024, October). AkôFlow: um Middleware para execução de Workflows científicos em múltiplos ambientes conteinerizados. In 39th Simpósio Brasileiro de Banco de Dados (SBBD) (pp. 27-39). SBC. DOI: 10.5753/sbbd.2024.241126.
- Ferreira, W., Kunstmann, L., Garcia, R., Bedo, M., & de Oliveira, D. (2025, October). Plug and Flow: Execução de Workflows Científicos em Contêineres com o Middleware AkôFlow. In 40th Simpósio Brasileiro de Banco de Dados (SBBD). (Accepted for publication)
AkôFlow originated as a final undergraduate project and has since expanded with broader contributions and integrations. It continues to serve both academic and industrial workflow execution scenarios.