diff --git a/docs/access-control.md b/docs/access-control.md index 924cb82..2de58c6 100644 --- a/docs/access-control.md +++ b/docs/access-control.md @@ -8,13 +8,13 @@ Workspaces are organizational units that contain computations and users. Each wo ### Workspace Roles -| Role | Key Permissions | -| ------------------ | -------------------------------------------------------- | -| **Administrator** | Full workspace control, user management, computation creation | -| **Manager** | Role management, user assignment | -| **Member** | Read/update workspace resources | -| **Computation Creator** | Create and manage computations | -| **CVM Manager** | Create and remove CVMs (Compute Virtual Machines) | +| Role | Key Permissions | +| ------------------------| --------------------------------------------------------------| +| **Administrator** | Full workspace control, user management, computation creation | +| **Manager** | Role management, user assignment | +| **Member** | Read/update workspace resources | +| **Computation Creator** | Create and manage computations | +| **CVM Manager** | Create and remove CVMs (Compute Virtual Machines) | ### Workspace Permissions Explained @@ -33,15 +33,15 @@ Computations have the most granular access control system, designed to support c ### Computation Roles -| Role | Permissions | Use Case | -| ------------------- | ------------------------------------- | --------------------------- | -| **Administrator** | Owner, View, Edit, Run | Full computation control | -| **Editor** | View, Edit | Modify computation settings | -| **Runner** | View, Run | Execute computations | -| **Viewer** | View | Monitor computation status | -| **Dataset Provider** | View, Edit, Provide Data | Supply input datasets | -| **Algorithm Provider** | View, Edit, Provide Algorithm | Contribute algorithms | -| **Result Consumer** | View, Edit, Consume Results | Access computation outputs | +| Role | Permissions | Use Case | +| ---------------------- | 
------------------------------------- | --------------------------- | +| **Administrator** | Owner, View, Edit, Run | Full computation control | +| **Editor** | View, Edit | Modify computation settings | +| **Runner** | View, Run | Execute computations | +| **Viewer** | View | Monitor computation status | +| **Dataset Provider** | View, Edit, Provide Data | Supply input datasets | +| **Algorithm Provider** | View, Edit, Provide Algorithm | Contribute algorithms | +| **Result Consumer** | View, Edit, Consume Results | Access computation outputs | ### Computation Permissions Explained diff --git a/docs/computations.md b/docs/computations.md index 1637200..80ab8ca 100644 --- a/docs/computations.md +++ b/docs/computations.md @@ -9,7 +9,7 @@ The computations service is the core component of Prism AI that enables secure, A computation in Prism represents a secure collaborative AI workflow that includes: - **Algorithm**: The AI model or processing logic to be executed -- **Datasets**: Input data from one or more providers +- **Datasets**: Input data from one or more providers - **Participants**: Users with specific roles (data providers, algorithm providers, result consumers) - **Security Configuration**: TLS/mTLS settings and attestation policies - **Execution Environment**: Virtual machine configuration for secure processing @@ -28,13 +28,16 @@ Before creating a computation, ensure you have: ### Creating Your First Computation 1. **Navigate to Computations** + - Access the computations page from the sidebar - Click "Create New Computation" 2. **Configure Basic Details** + - Enter computation name and description 3. 
**Set Security Configuration** + - Configure agent security settings (TLS/mTLS/aTLS) - Upload necessary certificates if using TLS @@ -59,7 +62,7 @@ Access the computation creation interface through the main navigation: #### Import from File -Prism supports bulk computation creation through file imports: +You can import previously exported computations or create new computations from formatted files. **Supported Formats:** @@ -68,42 +71,244 @@ Prism supports bulk computation creation through file imports: **Import Process:** -1. Navigate to the computations page -2. Click the import button -3. Select your JSON or CSV file -4. Verify user IDs are valid and correspond to registered workspace users +1. **Navigate to the computations page** in the Prism UI +2. **Click the "Import" button** (usually found in the top toolbar) +3. **Select your computation export file** (.json format) +4. **Review the import preview** - Prism will show you: + - Which computations will be created + - Which assets (datasets and algorithms) are required + - Any missing dependencies +5. **Verify asset availability** - Ensure all referenced datasets and algorithms already exist in your workspace +6. **Confirm the import** to create the computation(s) ![Import computations](../static/img/ui/import_computation.png) -**Sample JSON Format:** +> **Important:** Before importing, ensure all referenced datasets and algorithms are in your workspace. If some are missing, Prism will issue warnings and create all possible resources from the available data. + +**Understanding Computation Data Formats** +Prism uses two distinct data formats for different purposes: + +#### 1. Export/Import Format (For Storage and Transfer) + +This is the format you'll work with when using the Import/Export features in the Prism UI. When you click "Export," Prism creates this comprehensive JSON file that you can save, share, or use to restore the computation later. 
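Because the export file is plain JSON, you can sanity-check it before re-importing. The sketch below is an illustration, not an official Prism tool; it assumes the field names documented in the table below (`assets`, `asset`, `Checksum`, `file_name`) and verifies each embedded asset against its recorded SHA-256 checksum:

```python
import base64
import hashlib
import json


def verify_export_assets(path):
    """Return (file_name, ok) pairs for every asset in a Prism export file.

    The top level of an export file is a list of computation entries, each
    with an "assets" array whose "asset" field holds base64-encoded content
    and whose "Checksum" field holds a SHA-256 hex digest.
    """
    with open(path) as f:
        entries = json.load(f)
    results = []
    for entry in entries:
        for asset in entry.get("assets", []):
            content = base64.b64decode(asset["asset"])
            digest = hashlib.sha256(content).hexdigest()
            results.append((asset["file_name"], digest == asset["Checksum"]))
    return results
```

Any `False` entry means an embedded asset no longer matches its recorded checksum, and the file should not be trusted for import.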
+ +The export/import format is a complete backup file for your work in Prism. It’s used when you want to save a copy, share your work, move it to another system, or create several similar projects. This file includes everything — your settings, data, permissions, and history — so Prism can fully recreate your work later. + +![Import computations](../static/img/ui/computation_import_export.png) + +This detailed table shows all the fields in an export file and what they mean: + +| Field | Type | Description | +|----------------------------------------|-------------------|----------------------------------------------------------------| +| computation | object | Core computation metadata | +| computation.name | string | The name of the computation | +| computation.description | string | A short explanation of what the computation does | +| computation.start_time | string (ISO 8601) | Computation start timestamp | +| computation.end_time | string (ISO 8601) | Computation end timestamp | +| computation.agent_config | object | Agent configuration settings | +| computation.agent_config.log_level | string | Logging verbosity level | +| computation.agent_config.cert_file | string | Path to certificate file | +| computation.agent_config.server_key | string | Server private key path | +| computation.agent_config.server_ca_file| string | Server CA certificate path | +| computation.agent_config.client_ca_file| string | Client CA certificate path | +| computation.agent_config.attested_tls | boolean | Whether attested TLS is enabled | +| computation.created_at | string (ISO 8601) | Computation creation timestamp | +| roles | array | List of role definitions for access control | +| roles[].role_name | string | Name of the role (e.g., "owner", "viewer") | +| roles[].actions | array of strings | Permissions granted to this role | +| roles[].members | array of strings | User IDs assigned to this role | +| assets | array | List of all assets (datasets and algorithms) | +| 
assets[].file_name | string | Original filename of the asset | +| assets[].id | string (UUID) | Unique identifier for the asset | +| assets[].UserID | string (UUID) | ID of the user who owns the asset | +| assets[].description | string | Description of the asset | +| assets[].asset_type | string | Type of asset ("algorithm" or "dataset") | +| assets[].asset | string (base64) | Base64-encoded content of the asset | +| assets[].created_at | string (ISO 8601) | Asset creation timestamp | +| assets[].Checksum | string (hex) | SHA-256 checksum of the asset | +| assets[].mime_type | string | MIME type of the asset | +| assets[].Computations | array | List of computations using this asset | +| assets[].UserKey | string | User's public key (if applicable) | +| asset_links | array | Links between assets and computations | +| asset_links[].asset_id | string (UUID) | ID of the asset | +| asset_links[].computation_id | string (UUID) | ID of the associated computation | + +**Sample Import/Export Computation JSON:** +This example shows what an imported/exported computation looks like. 
Notice how it only contains workspace data + +```json +[ + { + "computation": { + "name": "User1", + "description": "testing", + "start_time": "0001-01-01T00:00:00Z", + "end_time": "0001-01-01T00:00:00Z", + "agent_config": { + "log_level": "", + "cert_file": "", + "server_key": "", + "server_ca_file": "", + "client_ca_file": "", + "attested_tls": false + }, + "created_at": "2025-11-12T14:00:01.654562Z" + }, + "roles": [ + { + "role_name": "algo_provider", + "actions": [ + "view", + "algo_provider" + ], + "members": null + }, + { + "role_name": "dataset_provider", + "actions": [ + "view", + "dataset_provider" + ], + "members": null + }, + { + "role_name": "owner", + "actions": [ + "view", + "edit", + "run", + "administrator", + "dataset_provider", + "algo_provider", + "result_consumer" + ], + "members": [ + "10c209f4-6fba-4447-9194-3e61e4c2bb11" + ] + }, + { + "role_name": "viewer", + "actions": [ + "view" + ], + "members": null + }, + { + "role_name": "editor", + "actions": [ + "view", + "edit" + ], + "members": null + }, + { + "role_name": "runner", + "actions": [ + "view", + "run" + ], + "members": null + }, + { + "role_name": "result_consumer", + "actions": [ + "view", + "result_consumer" + ], + "members": null + } + ], + "assets": [ + { + "file_name": "addition", + "id": "a4b34c5f-f400-4288-9753-f44cb708fc1b", + "UserID": "10c209f4-6fba-4447-9194-3e61e4c2bb11", + "description": "Algo", + "asset_type": "algorithm", + "asset": "MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAx5NXEAwBVVxCUaWmo3Vv...", + "created_at": "2025-11-12T14:04:34.735495Z", + "Checksum": "a7e107aa899725cc81a137a0d0b61163cf46b4721d459f2b212dbb9f65e7d57c", + "mime_type": "text/plain", + "Computations": null, + "UserKey": "" + }, + { + "file_name": "iris.csv", + "id": "5e99ab96-d453-4e8d-b5c1-18488afc124c", + "UserID": "10c209f4-6fba-4447-9194-3e61e4c2bb11", + "description": "Dataset", + "asset_type": "dataset", + "asset": "MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAx5NXEAwBVVxCUaWmo3Vv...", + 
"created_at": "2025-11-12T14:03:50.719343Z", + "Checksum": "a9a96ff672cde7f6b2badcc4eb13b95afe59255650abfcbd9f73d34fc61480ad", + "mime_type": "text/csv", + "Computations": null, + "UserKey": "" + } + ], + "asset_links": [ + { + "asset_id": "a4b34c5f-f400-4288-9753-f44cb708fc1b", + "computation_id": "" + }, + { + "asset_id": "5e99ab96-d453-4e8d-b5c1-18488afc124c", + "computation_id": "" + } + ] + } +] +``` + +#### 2. Manifest Format (For Runtime Verification) + +The manifest is automatically generated by Prism when you run a computation. You typically don't need to create or edit manifests manually. The agent uses it behind the scenes to verify that the correct, unmodified datasets and algorithms are being used. + +![Import computations](../static/img/ui/computation_agent_manifest.png) + +The manifest is a lightweight reference file that the Prism agent downloads and uses during computation execution. It is used by PCR 16 for cryptographic verification to ensure data integrity and authenticity. 
This verification is performed with cocos-cli, as documented [here](https://docs.cocos.ultraviolet.rs/cli#subcommand-policy-extend).
+
+**Manifest Structure:**
+
+| Field | Type | Description |
+|------------------------|-------------------|--------------------------------------------------|
+| name | string | The name of the computation |
+| description | string | A short explanation of what the computation does |
+| datasets | array | List of datasets involved in the computation |
+| datasets[].hash | array of integers | SHA-256 hash of the dataset asset |
+| datasets[].user_key | string (PEM) | Public key of the dataset provider |
+| datasets[].filename | string | Original filename of the dataset |
+| algorithm | object | Algorithm definition used in the computation |
+| algorithm.hash | array of integers | SHA-256 hash of the algorithm asset |
+| algorithm.user_key | string (PEM) | Public key of the algorithm provider |
+
+**Sample Manifest JSON:**
+
+This example shows what a manifest looks like. 
Notice how it only contains hashes (cryptographic fingerprints) and references, not the actual data: ```json { - "id": "185e61f4-2fd1-47c3-b8e7-1bf6a8466b79", - "name": "sample_computation", - "description": "Sample collaborative AI computation", - "owner": "f07b7716-2737-4228-9d80-d9df4ab5ee53", - "start_time": "0001-01-01T00:00:00Z", + "name": "PRISM", + "description": "prism", "datasets": [ { - "provider": "f07b7716-2737-4228-9d80-d9df4ab5ee53", - "hash": "171ae99ff0449d52cd37f824eec20f56d4efbe322e022e1df02a89eabc16209c" + "hash": [ + 14, 88, 157, 119, 113, 69, 18, 77, 253, 116, 179, 236, 28, 193, 208, + 202, 172, 66, 70, 24, 92, 237, 236, 227, 219, 112, 231, 179, 140, 25, + 192, 117 + ], + "user_key": "MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAx5NXEAwBVVxCUaWmo3Vv...", + "filename": "creditcard.csv" } ], "algorithm": { - "provider": "f07b7716-2737-4228-9d80-d9df4ab5ee53", - "hash": "9567a45920974a3261f9e897b3da7e49a391728f607f36f0ad6e8f5ec8a2041b" - }, - "result_consumers": ["f07b7716-2737-4228-9d80-d9df4ab5ee53"], - "agent_config": { - "log_level": "info", - "cert_file": "", - "server_key": "", - "server_ca_file": "", - "client_ca_file": "", - "attested_tls": false - }, - "backend_id": "9a8d67b6-9298-4393-81c6-8b7958a8cebf" + "hash": [ + 30, 169, 221, 205, 136, 158, 196, 245, 237, 157, 191, 60, 169, 197, 167, + 94, 189, 231, 220, 145, 23, 247, 114, 128, 228, 62, 220, 146, 35, 162, + 248, 6 + ], + "user_key": "MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAx5NXEAwBVVxCUaWmo3Vv..." + } } ``` @@ -164,7 +369,7 @@ Export computations for backup, sharing, or migration: **Use Cases:** - Creating computation templates -- Backup and disaster recovery +- Backup and disaster recovery - Migrating between environments - Sharing computation configurations @@ -220,14 +425,17 @@ Prism supports multiple TLS configurations to meet different security requiremen #### Configuration Steps 1. 
**Access Configuration** + - Select appropriate TLS configuration mode on the computation creation/update page 2. **Upload Certificates** (if required) + - Ensure all files are in PEM format - Verify certificate validity and expiration dates - Confirm proper file permissions 3. **Set Logging Level** + - Choose appropriate verbosity for your use case - Consider performance impact of debug logging @@ -238,12 +446,12 @@ Prism supports multiple TLS configurations to meet different security requiremen #### File Formats and Purposes -| File Type | Required For | Format | Purpose | -|-----------|-------------|---------|----------| -| Key File | TLS, mTLS | PEM-encoded private key | Agent authentication | -| Certificate File | TLS, mTLS | PEM-encoded certificate | Agent identity verification | -| Server CA File | mTLS only | PEM-encoded CA certificate | Server certificate verification | -| Client CA File | mTLS, maTLS | PEM-encoded CA certificate | Client certificate verification | +| File Type | Required For | Format | Purpose | +| ---------------- | ------------ | -------------------------- | ------------------------------- | +| Key File | TLS, mTLS | PEM-encoded private key | Agent authentication | +| Certificate File | TLS, mTLS | PEM-encoded certificate | Agent identity verification | +| Server CA File | mTLS only | PEM-encoded CA certificate | Server certificate verification | +| Client CA File | mTLS, maTLS | PEM-encoded CA certificate | Client certificate verification | #### Generating Certificates @@ -317,31 +525,31 @@ IP.1 = - Info (Recommended) - - Standard operational information - - Important events and milestones - - Balanced detail without performance impact - - **Best for**: Production environments + - Standard operational information + - Important events and milestones + - Balanced detail without performance impact + - **Best for**: Production environments - Debug - - Detailed operation information - - Extensive diagnostic data - - May impact performance - 
- **Best for**: Development and troubleshooting + - Detailed operation information + - Extensive diagnostic data + - May impact performance + - **Best for**: Development and troubleshooting - Warn - - Warning-level messages only - - Potentially harmful situations - - Minimal operational information - - **Best for**: Monitoring potential issues + - Warning-level messages only + - Potentially harmful situations + - Minimal operational information + - **Best for**: Monitoring potential issues - Error - - Critical issues only - - Error conditions and failures - - Minimal logging overhead - - **Best for**: Production with minimal logging requirements + - Critical issues only + - Error conditions and failures + - Minimal logging overhead + - **Best for**: Production with minimal logging requirements #### Best Practices @@ -360,16 +568,19 @@ Attested TLS provides hardware-backed attestation for TEE environments: 1. **Set aTLS Configuration** ![aTLS config](../static/img/ui/setatlsconfig.png) + - Select "Attested TLS" from the TLS Configuration dropdown - No certificate files required - Click "Close" to save 2. **Verify Configuration** ![Confirm aTLS](../static/img/ui/confirmatls.png) + - Update/create the computation - Verify aTLS is properly configured 3. **Run Computation** + - Create a CVM - Wait for VM provisioning to complete - Start the computation @@ -379,7 +590,8 @@ Attested TLS provides hardware-backed attestation for TEE environments: 4. **Download Attestation Policy** ![Download policy](../static/img/ui/download-policy-download.png) - - Download the attestation policy file from the cvm page + + - Download the attestation policy file from the cvm page (You can also use [CLI commands](https://docs.cocos.ultraviolet.rs/cli#command-policy)) to download the attestation policy. - This file contains expected values for attestation verification 5. 
**Configure CLI Environment** @@ -394,7 +606,7 @@ Attested TLS provides hardware-backed attestation for TEE environments: **Manual Measurement Calculation** (Optional) -For additional security verification, you can manually calculate and verify measurements: +For additional security verification, you can manually calculate and verify measurements. ```bash # Set paths to kernel and rootfs files @@ -408,9 +620,12 @@ LINE='"quiet console=null rootfstype=ramfs"' --ovmf $OVMF_CODE --kernel $KERNEL --initrd $INITRD --append "$LINE" # Update attestation policy with calculated measurement -./cocos-cli backend measurement +./cocos-cli policy measurement + ``` +More docs on CLI commands used: [sevsnpmeasure](https://docs.cocos.ultraviolet.rs/cli#command-sevsnpmeasure) and [measurement](https://docs.cocos.ultraviolet.rs/cli#subcommand-policy-measurement). + ### TLS and mTLS Configuration #### TLS Setup (Server Authentication) @@ -432,7 +647,7 @@ export AGENT_GRPC_SERVER_CA_CERTS= **UI Configuration:** -1. Select "Mutual TLS" from TLS Configuration dropdown +1. Select "Mutual TLS" from TLS Configuration dropdown 2. Upload all required certificate files: - Server certificate and private key - Client CA certificate @@ -460,7 +675,7 @@ export AGENT_GRPC_CLIENT_CERT= export AGENT_GRPC_CLIENT_KEY= export AGENT_GRPC_SERVER_CA_CERTS= -# aTLS settings +# aTLS settings export AGENT_GRPC_ATTESTED_TLS=true export AGENT_GRPC_ATTESTATION_POLICY= ``` @@ -495,23 +710,27 @@ export AGENT_GRPC_ATTESTATION_POLICY= - Computation must be properly configured - Target CVM must be available and ready -- Proper authentication configured (keys, certificates) +- Proper authentication configured (keys, certificates). + +_Note:_ You can generate keys using [cocos-cli](https://docs.cocos.ultraviolet.rs/cli/#command-keys), our command-line tool for computation management. #### Execution Process 1. 
**Initiate Execution** + - Click "Run" button on computation page - Select target virtual machine - Confirm execution parameters 2. **Monitor Progress** + - Track VM provisioning status - Monitor computation logs for progress - Check for any error conditions 3. **Handle Results** - Results available to designated consumers - - Download through CLI + - Download through [cocos-cli](https://docs.cocos.ultraviolet.rs/cli#command-result) - Verify result integrity #### VM Requirements @@ -525,28 +744,28 @@ export AGENT_GRPC_ATTESTATION_POLICY= #### Computation States -| State | Description | Available Actions | -|-------|-------------|-------------------| -| Running | Currently executing | Monitor, View Logs | +| State | Description | Available Actions | +| --------- | --------------------- | ---------------------- | +| Running | Currently executing | Monitor, View Logs | | Completed | Successfully finished | View Results, Download | -| Failed | Execution failed | View Logs, Retry, Edit | -| Cancelled | Manually stopped | View Logs, Edit, Retry | +| Failed | Execution failed | View Logs, Retry, Edit | +| Cancelled | Manually stopped | View Logs, Edit, Retry | #### Monitoring and Troubleshooting - Viewing Logs - - Access computation logs through the computation detail page - - Monitor real-time execution progress - - Identify error conditions and debugging information + - Access computation logs through the computation detail page + - Monitor real-time execution progress + - Identify error conditions and debugging information - Common Issues and Solutions -| Issue | Symptom | Solution | -|-------|---------|----------| -| Certificate Errors | Authentication failures | Verify certificate validity and configuration | -| Network Issues | Connection timeouts | Check firewall rules and network connectivity | -| Resource Constraints | Performance issues | Monitor resource usage, scale VM if needed | +| Issue | Symptom | Solution | +| -------------------- | 
----------------------- | --------------------------------------------- | +| Certificate Errors | Authentication failures | Verify certificate validity and configuration | +| Network Issues | Connection timeouts | Check firewall rules and network connectivity | +| Resource Constraints | Performance issues | Monitor resource usage, scale VM if needed | ### Deleting Computations @@ -573,11 +792,13 @@ Prism implements a robust public-key cryptography system for user authentication #### Registration Phase 1. **Generate Key Pair** + - Use CLI tools to generate cryptographic keys - Follow [key generation guide](https://docs.cocos.ultraviolet.rs/cli/#generate-keys) 2. **Register Public Key** ![Upload user key](../static/img/ui/upload-key.png) + - Upload public key when assigned to computation - System associates key with all designated roles @@ -590,7 +811,7 @@ Prism implements a robust public-key cryptography system for user authentication Use the same private key for all CLI operations: - [Algorithm uploads](https://docs.cocos.ultraviolet.rs/cli/#upload-algorithm) -- [Dataset uploads](https://docs.cocos.ultraviolet.rs/cli/#upload-dataset) +- [Dataset uploads](https://docs.cocos.ultraviolet.rs/cli/#upload-dataset) - [Result retrieval](https://docs.cocos.ultraviolet.rs/cli/#retrieve-result) ### Security Best Practices diff --git a/docs/getting-started.md b/docs/getting-started.md index ecc8caa..3cf25e9 100644 --- a/docs/getting-started.md +++ b/docs/getting-started.md @@ -6,16 +6,38 @@ Welcome to Prism! This guide will walk you through setting up your first workspa Prism is a confidential computing platform that enables secure multi-party computations using Trusted Execution Environments (TEEs). It allows multiple parties to collaborate on computations without exposing their sensitive data or algorithms. +## What You'll Build + +In this guide, you'll run a **machine learning model to classify iris flowers**. 
This demonstrates how one party can provide an algorithm while another provides data, with results going to a third party—all without anyone seeing each other's sensitive information.
+
+**The Scenario:**
+
+- **Algorithm Provider**: Provides the Python classification script (`lin_reg.py`)
+- **Dataset Provider**: Provides the iris flower measurements (`iris.csv`)
+- **Result Consumer**: Receives the classification results
+
+## Prerequisites
+
+Before you begin, make sure you have:
+
+1. **Cocos CLI Tool**: Required for key generation, file uploads, and retrieving results
+   - Download and install from: [Cocos CLI Repository](https://github.com/ultravioletrs/cocos/releases)
+   - Verify installation: `./build/cocos-cli --help` (should print the available CLI commands)
+
+> **Important**: The Cocos CLI is essential for most operations in this guide, including generating keys, uploading algorithms/datasets, and downloading results. Install it now before proceeding.
+
## Quick Start Overview

1. **Account Setup** - Create your account and log in
2. **Workspace Creation** - Set up your collaborative environment
3. **CVM Setup** - Create Confidential Virtual Machines for secure computing
4. **Computation Management** - Define, configure, and run secure computations
+
+![Quick Setup](../static/img/ui/Setup.png)

---

-## Account Setup
+## 1. Account Setup

### Creating Your Account

@@ -38,7 +60,7 @@ After successful login, you'll be directed to the workspaces page where you can

---

-## Workspace Management
+## 2. Workspace Management

### Understanding Workspaces

@@ -78,7 +100,37 @@ If you've been invited to a workspace:

---

-## CVM (Confidential Virtual Machine) Setup
+## 3. Set Up Your Keys
+
+Keys serve as your **digital signature for identity verification** in Prism. They prove you authorized uploads and downloads—like a secure signature that can't be forged.
+
+> **Important**: These keys are for identity/authentication only. 
All data encryption happens automatically via aTLS (attested TLS) within the secure enclave. + +### Why Two Keys? + +- **Public key**: Upload to Prism to verify your identity when creating assets +- **Private key**: Keep secret on your machine to sign your uploads and downloads + +### Generate Your Keys + +Use the [Cocos CLI tool](https://docs.cocos.ultraviolet.rs/cli#command-keys): + +```bash +./build/cocos-cli keys -k rsa +``` + +![Generated keys](../static/img/getting_started_keys.png) + +**Supported types:** `rsa`, `ecdsa`, `ed25519` + +This creates two files: + +- `public_key.pem` → You'll upload this to Prism shortly +- `private_key.pem` → Keep this secure and never share it + +> **Security Note**: Your private key never leaves your machine. Prism never has access to it. + +## 4. CVM (Confidential Virtual Machine) Setup ### What are CVMs? @@ -115,96 +167,388 @@ After creation, your CVM will go through several states: --- -## Computation Management +## 5. Computation Management + +A computation is a secure collaborative task in Prism. It brings together an algorithm and data from different parties, runs them in an encrypted environment, and delivers results—all without any party seeing the other's sensitive information. + +### Iris Flower Classification in Prism - A Guided Example + +This section demonstrates the complete workflow using a practical example: training a machine learning model to classify iris flowers. You'll learn how to: + +- Create and configure a computation +- Assign roles and permissions +- Link algorithms and datasets +- Execute secure computations +- Interpret your results + +This example uses a logistic regression algorithm to identify iris flower species (Setosa, Versicolor, or Virginica) based on four measurements: sepal length, sepal width, petal length, and petal width. Prism handles the secure computation while you maintain control of your algorithm and data through cryptographic hashes and private keys. 
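The cryptographic hashes mentioned above are SHA-256 digests of the asset files (the export format documents its `Checksum` field as SHA-256; that `cocos-cli checksum` prints the same digest is an assumption you can cross-check). A minimal sketch of computing one locally:

```python
import hashlib


def file_checksum(path, chunk_size=65536):
    """SHA-256 digest of a file, streamed in chunks so large datasets
    don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()
```

Under that assumption, running this on `iris.csv` should agree with the output of `./build/cocos-cli checksum iris.csv`.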
+ +**Files you'll use:** + +- **Algorithm**: [Logistic Regression script](https://github.com/ultravioletrs/cocos/blob/main/test/manual/algo/lin_reg.py) `lin_reg.py` +- **Dataset**: [Iris flower measurements](https://github.com/ultravioletrs/cocos/blob/main/test/manual/data/iris.csv) (`iris.csv`) + +Download these files locally before proceeding. + +--- + +### The Problem Prism Solves + +**Problem**: One party has valuable data they can't share (due to privacy, regulations, or competitive reasons), and another party has an algorithm they want to keep proprietary. Normally, collaboration is difficult. + +**Prism's solution**: The algorithm runs on the data inside an encrypted environment. The data owner never sees the algorithm, the algorithm owner never sees the data, and the platform can't access either. Only the designated party receives the encrypted results. +This enables secure collaboration that would otherwise be legally or commercially impossible. + +Every computation involves these roles: + +- **Algorithm Provider** – Supplies the code or model to be executed (e.g., a data scientist with a predictive model) +- **Dataset Provider** – Supplies the input data for processing (e.g., a company with proprietary customer data) +- **Result Consumer** – Receives the encrypted computation output (e.g., the company receiving predictions) -### Understanding Computations +Note: The same person or organization can hold multiple roles, or each role can be filled by different parties—the structure adapts to your collaboration needs. 
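In the export format shown earlier, these roles reduce to named lists of actions and members. A small Python sketch (field names taken from that format; this is an illustration of the model, not Prism's implementation) of checking whether a user may perform an action:

```python
def can_perform(roles, user_id, action):
    """True if any role both grants `action` and lists `user_id` as a member.

    `roles` follows the export format's role entries: each has "role_name",
    "actions", and "members" (which may be None for unassigned roles).
    """
    return any(
        action in role["actions"] and user_id in (role["members"] or [])
        for role in roles
    )
```

With the sample export above, the owner's user ID satisfies `run`, while a user assigned to no role satisfies nothing.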
-
### What You'll Set Up

-A computation in Prism CoCoS involves multiple parties collaborating securely:

-| Component | Role | Required | Description |
-|-----------|------|----------|-------------|
-| **Algorithm** | Algorithm Provider | ✅ Required | The secure code to be executed |
-| **Dataset** | Dataset Provider | ⚪ Optional | Training or input data (if needed) |
-| **Result Consumer** | Result Consumer | ✅ Required | Party authorized to retrieve results |

+To create a computation:
+
+1. **Define the computation** – Name it and describe what it does
+2. **Assign roles** – Choose which workspace members fill each role
+3. **Link assets** – Connect the specific algorithm and dataset to use
+4. **Select a CVM** – Pick where the secure computation will run
+5. **Execute** – Run the computation and monitor progress
+
+The rest of this guide walks through these steps end to end, from creating your computation to interpreting the results.

### Creating a Computation

1. Navigate to **Computations** in your workspace
2. Click **New Computation**
-3. Fill in computation details:
-   - **Name**: Descriptive computation name
-   - **Description**: Purpose and expected outcomes
-   - **Agent Configuration**: In enclave agent TLS configurations
+3. Fill in details:
+   - **Name**: "Iris Classification Demo"
+   - **Description**: "Machine learning classification of iris flower species"
+   - **Agent Configuration**: Leave default TLS settings
+
+   ![New Computation](../static/img/ui/new_computation.png)
+
+### Understanding Roles
+
+Every computation has three required roles that represent different parties in the secure collaboration:

+| Role | What They Provide | Required? |
+|------------------------|-------------------|------------|
+| **Algorithm Provider** | The code to run | ✅ Yes 
|
+| **Dataset Provider** | Input data | ⚪ Optional |
+| **Result Consumer** | Gets the results | ✅ Yes |

-![New Computation](../static/img/ui/new_computation.png)
+> **Example**: Company A provides a fraud-detection algorithm, Company B provides transaction data, Company C receives the fraud report.

-### Setting Up User Roles
+Important constraints:
+
+- One role per user: Each user can only be assigned to one role per computation
+- Need multiple permissions? You can create custom roles or update existing ones to combine actions (e.g., a role that's both Algorithm Provider and Result Consumer)
+- Other built-in roles exist: Owner, Viewer, Editor, Runner—for different access levels beyond the core computation roles
+
+Each role has specific actions that determine what users can do—like `view`, `edit`, `run`, `algo_provider`, `dataset_provider`, or `result_consumer`. [Learn more about managing roles and permissions](./roles.md).

#### Step 1: Navigate to Roles

1. Go to your computation's details page
2. Click on **Roles** tab

   ![Roles Tab](../static/img/ui/roles.png)

#### Step 2: Assign User Roles

-1. Select the role you want to populate (Algorithm Provider, Dataset Provider, or Result Consumer)
-2. Click **Add Members**
-3. Search for workspace users
-4. Select users and confirm assignment
+3. **If you're the only user in your workspace**: You automatically have the Owner role. You still do not have the ability to act as Algorithm Provider, Dataset Provider, and Result Consumer. To add these permissions:
+
+   - Click on the **Owner** role
+     ![Roles Tab](../static/img/ui/edit_owner_role.png)
+   - Click **Add Actions** or edit the role
+   - Add the actions: `algo_provider`, `dataset_provider`, and `result_consumer` to your Owner role
+     ![Edit Owner Role](../static/img/ui/add_roles.png)
+   - This allows you to perform all three functions in this demo as the owner. Your role should now look like this:
+     ![Role added successfully](../static/img/ui/owner_role_success.jpg)
+
+4. 
**If you have multiple users**: For each role (Algorithm Provider, Dataset Provider, Result Consumer): -![Role Assignment](../static/img/ui/view_role.png) -![Add User to Role](../static/img/ui/add_user_to_role.png) + - Click on the role + ![Role Assignment](../static/img/ui/view_role.png) + - Click **Add Members** + ![Add Members](../static/img/ui/add_members.png) + - Search for workspace users + - Select users and confirm assignment + ![Add User to Role](../static/img/ui/add_user_to_role.png) > **📝 Note**: Users must already be invited to the workspace before they can be assigned computation roles. A user cannot belong to more than one role; to grant extra permissions, add the required actions to the user's [role](./roles.md). +### Upload a Public Key + +A public key is needed for every computation. + +- Click **Upload Public Key**; an indicator shows that no public key is attached yet +![Upload public key](../static/img/ui/upload_public_key.png) +- This opens a dialog for you to select a public key + ![Select public key](../static/img/ui/upload_key.png) +- From the [generated keys here](#generate-your-keys), upload your public key and the indicator will disappear, as shown below: + ![Upload public key success](../static/img/ui/upload_public_key_success.png) + ### Managing Computation Assets -#### Creating Assets +Each role owner needs to create their asset (algorithm, dataset, etc.) and link it to the computation. Assets are cryptographically verified using file hashes and secured using your public/private key pair. + +> Note: Prism UI only creates asset metadata. Actual files are not uploaded to Prism. +Algorithms and datasets must be uploaded directly to the CVM using the Cocos CLI (via AGENT_GRPC_URL). The CLI is required for uploading files and retrieving results. + +--- + +## Step-by-Step: Building Your Iris Classification Computation + +Now let's walk through creating and executing your computation.
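Before starting, it helps to see why Prism leans on file hashes. The fingerprint idea can be demonstrated with standard tools; the sketch below uses `sha256sum` purely as an illustration, while Prism itself expects the digest produced by the Cocos CLI `checksum` command, which may use a different algorithm:

```shell
# Illustration only: how a hash acts as a fingerprint for a file.
# Prism expects the digest from `cocos-cli checksum`, not sha256sum.
printf 'sepal_length,sepal_width\n5.1,3.5\n' > demo.csv
sha256sum demo.csv   # fingerprint of the original file

# A one-character change yields a completely different fingerprint:
printf 'sepal_length,sepal_width\n5.1,3.6\n' > demo.csv
sha256sum demo.csv
```

If a file is altered after its asset metadata is created, its hash no longer matches, so the computation will not accept it.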
+ +### Prepare Your Demo Files + +For this demo, you'll use: + +**Algorithm**: [Logistic Regression script](https://github.com/ultravioletrs/cocos/blob/main/test/manual/algo/lin_reg.py) (`lin_reg.py`) + +**Dataset**: [Iris flower measurements](https://github.com/ultravioletrs/cocos/blob/main/test/manual/data/iris.csv) (`iris.csv`) + +Save these files locally. + +### Generate File Hashes + +For each file, generate its cryptographic hash using the [Cocos CLI](https://docs.cocos.ultraviolet.rs/cli#command-checksum); you'll use these hashes in the next step: + +```bash +# Hash the algorithm +./build/cocos-cli checksum lin_reg.py + +# Hash the dataset +./build/cocos-cli checksum iris.csv +``` + +![Generate Hashes](../static/img/generate_hashes.png) + +> **Why hashes?** The hash is a unique digital fingerprint that ensures the exact file you specify is used in the computation and hasn't been modified. + +### Create Algorithm Asset + +1. Navigate to **Assets** → **Create New Asset** +2. Select **Algorithm** type +3. Fill in: + + - **Name**: "Iris Classification Model" + - **Description**: "Logistic Regression for iris species classification" + - **File Hash**: Paste the hash of `lin_reg.py` + + ![New Asset](../static/img/ui/new_asset.png) + +4. Click **Create Asset** + +### Link Algorithm to Computation + +1. From your **Assets** page, find "Iris Classification Model" + + ![Algo Assets](../static/img/ui/algo_assets.png) -Users assigned to roles must create their respective assets: +2. Click **Associate** -1. Navigate to **Assets** section -2. Click **Create New Asset** -3. Choose asset type (Algorithm, Dataset, etc.) -4. Upload an optional sample of the asset + ![Associate Algo](../static/img/ui/algo_associate.png) -![New Asset](../static/img/ui/new_asset.png) -![User Assets](../static/img/ui/user_assets.png) +3. Select your computation (e.g., "Iris Classification Demo") +4. Confirm -#### Linking Assets to Computations + ![Associate Asset](../static/img/ui/associate_algo_asset.png) -1.
From your assets page, find the relevant asset -2. Click **Link to Computation** -3. Search and select the target computation -4. Confirm the association +### Create Dataset Asset -![Associate Asset](../static/img/ui/associate_user_asset.png) +Repeat the same process for your dataset: -### Running Computations +1. Navigate to **Assets** → **Create New Asset** +2. Select **Dataset** type +3. Fill in: -#### Prerequisites Check + - **Name**: "Iris Dataset" + - **Description**: "150 iris flower measurements for classification" + - **File Hash**: Paste the hash of `iris.csv` -Before running, ensure: + ![New Asset](../static/img/ui/create_dataset.png) -- ✅ All required roles are assigned -- ✅ All necessary assets are linked -- ✅ At least one CVM is online -- ✅ Users have uploaded their public keys +4. Click **Create Asset** +5. Click **Associate** -The **Run Computation** button will be disabled until all requirements are met. + ![Associate Dataset](../static/img/ui/dataset_associate.png) + +6. Select your computation (e.g., "Iris Classification Demo") +7. Confirm + + ![Associate Asset](../static/img/ui/associate_dataset_asset.png) + +A tag shows the associated computation for each linked asset: + + ![Associate Asset Success](../static/img/ui/associate_user_asset_success.png) + +#### Ready to Run Your Computation + +You've now completed all the setup steps: + +- ✅ Created a computation + +- ✅ Assigned roles and permissions + +- ✅ Uploaded your public key + +- ✅ Created and linked algorithm and dataset assets + +> Note: The **Run Computation** button will be disabled until all requirements are met. ![Run Disabled](../static/img/ui/run_computation_disabled.png) +> Before running, ensure: + + - ✅ All required roles are assigned + + - ✅ All necessary assets are linked + + - ✅ At least one CVM is online + + - ✅ Users have uploaded their public keys + +With everything in place, the **Run Computation** button is now enabled. Click it to start your secure computation.
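Before running, you can also sanity-check from your own machine that a CVM's agent endpoint is reachable. This is a hypothetical helper using bash's `/dev/tcp` redirection, not a Prism or Cocos feature; the host and port shown are the example values from the agent URL and must be replaced with your own:

```shell
# Hypothetical helper: succeeds if a TCP connection to host:port opens.
# Substitute the host and port from your computation's AGENT_GRPC_URL.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open 199.92.195.153 61088; then
  echo "CVM agent reachable"
else
  echo "CVM agent not reachable - check that the CVM is online"
fi
```

A failed check usually means the CVM is offline or the address was copied incorrectly.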
#### Executing the Computation 1. Click **Run Computation** (when enabled) 2. Select an available online CVM + ![Select CVM](../static/img/ui/select_cvm.png) 3. Confirm execution + ![Confirm Execution](../static/img/ui/successfully_running_computation.png) + +In the computation timeline, the status shows "Receiving algorithm" as "In progress": the CVM is ready and waiting for your files. +Next, upload the algorithm and dataset to the computation using the [Cocos CLI tool](https://github.com/ultravioletrs/cocos/releases). + +### Upload Your Files to the CVM + +After creating the asset metadata in Prism, upload the actual files to the Confidential Virtual Machine (CVM). + +#### Set Up Connection + +First, export the agent's gRPC URL to connect to the CVM: + +```bash +export AGENT_GRPC_URL=<agent-host>:<agent-port> +# Example: export AGENT_GRPC_URL=199.92.195.153:61088 +``` + +You can find the AGENT_GRPC_URL on the computations page as shown: + +![Agent URL](../static/img/ui/agent_url.png) + +#### Upload Files + +Then upload your files: + +```bash +# Upload the algorithm +./build/cocos-cli algo lin_reg.py <private_key_file> -r <requirements_file> +``` + +![Algo Uploaded CLI](../static/img/upload_algo_cli.png) + +NB: The requirements file (`-r`) is optional. + +The state in the UI changes and the CVM is now awaiting the dataset: +![Algo Uploaded](../static/img/ui/upload_algo_prism.png) + +```bash +# Upload the dataset +./build/cocos-cli data iris.csv <private_key_file> +``` + +![Dataset Uploaded CLI](../static/img/upload_dataset_cli.png) + +The UI does not change after the dataset upload, but the computation is still in progress. You can monitor it through the logs: + +![Computation Logs](../static/img/ui/computation_logs.png) + +The CLI connects to the agent via gRPC and encrypts your files before uploading them to the CVM, where they remain protected throughout the computation. Your private key ensures only you can perform this upload and later decrypt the results.
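The upload commands assume you already hold the private key matching the public key you uploaded to Prism. If you still need a pair, the Cocos CLI [`keys` command](https://docs.cocos.ultraviolet.rs/cli#command-keys) generates one; the sketch below does the equivalent with plain `openssl`, on the assumption that a standard PEM-encoded RSA key pair is acceptable:

```shell
# Generate a 2048-bit RSA key pair (sketch; `cocos-cli keys` is the
# first-party way to do this and produces an equivalent pair).
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out private.pem
openssl pkey -in private.pem -pubout -out public.pem

ls -l private.pem public.pem
```

Upload `public.pem` in the computation's **Upload Public Key** step and keep `private.pem` offline; it is needed for uploads and for decrypting results.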
+ +More documentation on: [algo](https://docs.cocos.ultraviolet.rs/cli/#command-algo) and [data](https://docs.cocos.ultraviolet.rs/cli/#command-data) + +Once both files are uploaded, the CVM will: + +1. Split the data into training and testing sets automatically +2. Train the logistic regression model on the iris training data +3. Evaluate the model's performance +4. Generate results including accuracy metrics -![Run Computation](../static/img/ui/run_computation.png) -![Select CVM](../static/img/ui/select_cvm.png) +#### Retrieve Your Computation Results + +Once the computation completes, download and decrypt your results using the [Cocos CLI tool](https://docs.cocos.ultraviolet.rs/cli/#command-result) with your **private key** as a digital identity: + +```bash +./build/cocos-cli result <private_key_file> +``` + +**Example output:** +![Computation Results](../static/img/computation_Results.png) + +The decrypted results will be saved to your specified output file, ready for analysis. + +### Understanding Your Results + +After the computation completes, you'll receive a **`results.zip`** file containing: + +- **Trained model** — Ready to classify new iris flowers +- **Performance metrics** — Accuracy, precision, and recall scores +- **Confusion matrix** — Shows classification accuracy for each species + +#### What Good Results Look Like + +For the Iris dataset, you should typically see: + +- **Training accuracy:** 90-95% +- **Testing accuracy:** 95-100% + +**Example output:** + +```text +Training Accuracy: 0.93 +Testing Accuracy: 1.00 + + precision recall +Iris-setosa 1.00 1.00 +Iris-versicolor 1.00 1.00 +Iris-virginica 1.00 1.00 +``` + +High accuracy on both training and testing sets means your model learned clear patterns and can reliably classify new flowers.
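One practical check: training and testing accuracy should be close, since a large gap suggests over- or underfitting. Because the metrics arrive as plain text, the comparison can be scripted; this is a sketch that assumes the exact output format shown above (the `metrics.txt` file and the 0.10 threshold are illustrative choices):

```shell
# Compare training vs. testing accuracy from the printed metrics.
# Assumes the output format shown above; metrics.txt stands in for it.
cat > metrics.txt <<'EOF'
Training Accuracy: 0.93
Testing Accuracy: 1.00
EOF

train=$(awk -F': ' '/Training Accuracy/ {print $2}' metrics.txt)
testacc=$(awk -F': ' '/Testing Accuracy/ {print $2}' metrics.txt)

# Flag a gap larger than 0.10 as worth investigating.
awk -v a="$train" -v b="$testacc" 'BEGIN {
  d = a - b; if (d < 0) d = -d
  if (d <= 0.10) print "accuracies consistent"
  else print "large gap - investigate"
}'
```

With the sample numbers the gap is 0.07, so the script reports the accuracies as consistent.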
+ +### Making Predictions + +Use your trained model to classify new iris flowers: + +```bash +python ./test/manual/algo/lin_reg.py predict results.zip ./test/manual/data +``` + +The model will predict the species for each flower based on its measurements. + +### Tips for Success + +✅ **Verify file hashes** — Ensure hashes match before uploading to guarantee file integrity + +✅ **Keep your private key secure** — It's required for uploading files and decrypting results + +✅ **Check both accuracies** — Training and testing scores should be similar; large gaps may indicate overfitting + +✅ **Save your model** — Keep `results.zip` for future predictions without retraining + +### Next Steps + +- **Try different algorithms** — Compare performance across various classification methods +- **Upload your own datasets** — Apply machine learning to your specific classification problems +- **Explore advanced features** — Adjust hyperparameters for improved model performance ### Monitoring Execution @@ -213,7 +557,7 @@ The **Run Computation** button will be disabled until all requirements are met. Once started, you can monitor: - **Events**: High-level computation milestones -- **Logs**: Detailed execution information from the inenclave agent +- **Logs**: Detailed execution information from the in enclave agent ![Events and Logs](../static/img/ui/logsEvents.png) @@ -227,31 +571,6 @@ You can stop a computation at any time by: ![Stop Computation](../static/img/ui/stop_computation.png) ![Stop Computation Run](../static/img/ui/stop_computation_run.png) ---- - -## Security & Public Keys - -### Why Public Keys Matter - -Public keys are essential for: - -- User identification and authentication -- Secure asset uploads -- Encrypted result retrieval -- Maintaining computation integrity - -### Managing Your Keys - -1. Generate a public/private key pair using your preferred tool or using [Cocos CLI](https://docs.cocos.ultraviolet.rs/cli#command-keys) -2. 
Upload your public key to your Prism profile -3. Keep your private key secure - you'll need it for: - - [Uploading algorithms and datasets](https://docs.cocos.ultraviolet.rs/cli#command-algo) - - [Retrieving computation results](https://docs.cocos.ultraviolet.rs/cli#command-data) - -> **🔐 Security Best Practice**: Never share your private key. Prism only needs your public key for verification. - ---- - ## Troubleshooting ### Common Issues diff --git a/static/img/computation_Results.png b/static/img/computation_Results.png new file mode 100644 index 0000000..04f33b6 Binary files /dev/null and b/static/img/computation_Results.png differ diff --git a/static/img/generate_hashes.png b/static/img/generate_hashes.png new file mode 100644 index 0000000..a175580 Binary files /dev/null and b/static/img/generate_hashes.png differ diff --git a/static/img/getting_started_keys.png b/static/img/getting_started_keys.png new file mode 100644 index 0000000..b83e473 Binary files /dev/null and b/static/img/getting_started_keys.png differ diff --git a/static/img/sample_upload.png b/static/img/sample_upload.png new file mode 100644 index 0000000..be89479 Binary files /dev/null and b/static/img/sample_upload.png differ diff --git a/static/img/ui/Setup.png b/static/img/ui/Setup.png new file mode 100644 index 0000000..a16ff3d Binary files /dev/null and b/static/img/ui/Setup.png differ diff --git a/static/img/ui/add_members.png b/static/img/ui/add_members.png new file mode 100644 index 0000000..397af72 Binary files /dev/null and b/static/img/ui/add_members.png differ diff --git a/static/img/ui/add_roles.png b/static/img/ui/add_roles.png new file mode 100644 index 0000000..d8e0c22 Binary files /dev/null and b/static/img/ui/add_roles.png differ diff --git a/static/img/ui/add_user_to_role.png b/static/img/ui/add_user_to_role.png index 9b4dc6e..abc9612 100644 Binary files a/static/img/ui/add_user_to_role.png and b/static/img/ui/add_user_to_role.png differ diff --git 
a/static/img/ui/agent_url.png b/static/img/ui/agent_url.png new file mode 100644 index 0000000..b4abdc9 Binary files /dev/null and b/static/img/ui/agent_url.png differ diff --git a/static/img/ui/algo_assets.png b/static/img/ui/algo_assets.png new file mode 100644 index 0000000..5f5061f Binary files /dev/null and b/static/img/ui/algo_assets.png differ diff --git a/static/img/ui/algo_associate.png b/static/img/ui/algo_associate.png new file mode 100644 index 0000000..c226cc1 Binary files /dev/null and b/static/img/ui/algo_associate.png differ diff --git a/static/img/ui/associate_algo_asset.png b/static/img/ui/associate_algo_asset.png new file mode 100644 index 0000000..8c583c1 Binary files /dev/null and b/static/img/ui/associate_algo_asset.png differ diff --git a/static/img/ui/associate_dataset_asset.png b/static/img/ui/associate_dataset_asset.png new file mode 100644 index 0000000..d1cde6d Binary files /dev/null and b/static/img/ui/associate_dataset_asset.png differ diff --git a/static/img/ui/associate_user_asset.png b/static/img/ui/associate_user_asset.png deleted file mode 100644 index 16c6ba9..0000000 Binary files a/static/img/ui/associate_user_asset.png and /dev/null differ diff --git a/static/img/ui/associate_user_asset_success.png b/static/img/ui/associate_user_asset_success.png new file mode 100644 index 0000000..55c2181 Binary files /dev/null and b/static/img/ui/associate_user_asset_success.png differ diff --git a/static/img/ui/computation_agent_manifest.png b/static/img/ui/computation_agent_manifest.png new file mode 100644 index 0000000..0d2a204 Binary files /dev/null and b/static/img/ui/computation_agent_manifest.png differ diff --git a/static/img/ui/computation_import_export.png b/static/img/ui/computation_import_export.png new file mode 100644 index 0000000..16680a5 Binary files /dev/null and b/static/img/ui/computation_import_export.png differ diff --git a/static/img/ui/computation_logs.png b/static/img/ui/computation_logs.png new file mode 100644 
index 0000000..aa62b57 Binary files /dev/null and b/static/img/ui/computation_logs.png differ diff --git a/static/img/ui/create_dataset.png b/static/img/ui/create_dataset.png new file mode 100644 index 0000000..b9de886 Binary files /dev/null and b/static/img/ui/create_dataset.png differ diff --git a/static/img/ui/dataset_associate.png b/static/img/ui/dataset_associate.png new file mode 100644 index 0000000..d2b80c7 Binary files /dev/null and b/static/img/ui/dataset_associate.png differ diff --git a/static/img/ui/edit_owner_role.png b/static/img/ui/edit_owner_role.png new file mode 100644 index 0000000..c86d30a Binary files /dev/null and b/static/img/ui/edit_owner_role.png differ diff --git a/static/img/ui/login.png b/static/img/ui/login.png index 7d1962a..e937eba 100644 Binary files a/static/img/ui/login.png and b/static/img/ui/login.png differ diff --git a/static/img/ui/new_asset.png b/static/img/ui/new_asset.png index c5081cb..2a544df 100644 Binary files a/static/img/ui/new_asset.png and b/static/img/ui/new_asset.png differ diff --git a/static/img/ui/new_computation.png b/static/img/ui/new_computation.png index b3dcfea..e39b9cb 100644 Binary files a/static/img/ui/new_computation.png and b/static/img/ui/new_computation.png differ diff --git a/static/img/ui/owner_role_success.jpg b/static/img/ui/owner_role_success.jpg new file mode 100644 index 0000000..57591cc Binary files /dev/null and b/static/img/ui/owner_role_success.jpg differ diff --git a/static/img/ui/roles.png b/static/img/ui/roles.png index 73a7830..ef7c977 100644 Binary files a/static/img/ui/roles.png and b/static/img/ui/roles.png differ diff --git a/static/img/ui/successfully_running_computation.png b/static/img/ui/successfully_running_computation.png new file mode 100644 index 0000000..23783c5 Binary files /dev/null and b/static/img/ui/successfully_running_computation.png differ diff --git a/static/img/ui/update_dataset.png b/static/img/ui/update_dataset.png new file mode 100644 index 0000000..2def158 
Binary files /dev/null and b/static/img/ui/update_dataset.png differ diff --git a/static/img/ui/upload_algo_prism.png b/static/img/ui/upload_algo_prism.png new file mode 100644 index 0000000..0dd8399 Binary files /dev/null and b/static/img/ui/upload_algo_prism.png differ diff --git a/static/img/ui/upload_key.png b/static/img/ui/upload_key.png new file mode 100644 index 0000000..c5d37fa Binary files /dev/null and b/static/img/ui/upload_key.png differ diff --git a/static/img/ui/upload_public_key.png b/static/img/ui/upload_public_key.png new file mode 100644 index 0000000..9bbd540 Binary files /dev/null and b/static/img/ui/upload_public_key.png differ diff --git a/static/img/ui/upload_public_key_success.png b/static/img/ui/upload_public_key_success.png new file mode 100644 index 0000000..ad80b71 Binary files /dev/null and b/static/img/ui/upload_public_key_success.png differ diff --git a/static/img/ui/user_assets.png b/static/img/ui/user_assets.png index 3e4788b..8f72d41 100644 Binary files a/static/img/ui/user_assets.png and b/static/img/ui/user_assets.png differ diff --git a/static/img/ui/view_role.png b/static/img/ui/view_role.png index 6bb3aec..2e1ab54 100644 Binary files a/static/img/ui/view_role.png and b/static/img/ui/view_role.png differ diff --git a/static/img/upload_algo_cli.png b/static/img/upload_algo_cli.png new file mode 100644 index 0000000..1b4256b Binary files /dev/null and b/static/img/upload_algo_cli.png differ diff --git a/static/img/upload_dataset_cli.png b/static/img/upload_dataset_cli.png new file mode 100644 index 0000000..273bcc1 Binary files /dev/null and b/static/img/upload_dataset_cli.png differ