A complete hackathon-ready monorepo combining AI/ML, Cybersecurity, and Oracle Cloud Infrastructure (OCI). This project demonstrates a full-stack cloud-native application with infrastructure as code, machine learning inference, and a modern web interface.
**Quick Start (Docker):**

```bash
git clone <your-repo-url>
cd cloud-security-ai
cd backend && python3 train.py && cd ..   # Train ML model
make docker-up                            # Start everything
# Open http://localhost
```

**Quick Start (Local Development):**

```bash
git clone <your-repo-url>
cd cloud-security-ai
make setup-dev       # Set up Python & Node.js environments
make run-backend     # Terminal 1
make run-frontend    # Terminal 2
```

Detailed Documentation:
- Team Developer Guide - Complete guide for team members
- Docker Setup - Docker deployment guide
- Local Setup - Manual setup instructions
- Quick Reference - Command cheat sheet
- Architecture Diagrams - Visual flowcharts and diagrams
```mermaid
graph TB
    subgraph Browser["Web Browser"]
        UI[User Interface<br/>localhost:5173]
    end
    subgraph Frontend["Frontend"]
        React[React App<br/>Dashboard UI]
        API_Client[API Client]
    end
    subgraph Backend["Backend"]
        FastAPI[FastAPI Server<br/>Port 8000]
        Routes[API Routes]
        ModelMgr[Model Manager]
        ML[ML Model<br/>model.joblib]
    end
    subgraph Cloud["OCI Cloud"]
        VCN[VCN + Subnet]
        Compute[Compute Instance]
        Storage[Object Storage]
    end

    UI --> React
    React --> API_Client
    API_Client -->|HTTP| FastAPI
    FastAPI --> Routes
    Routes --> ModelMgr
    ModelMgr --> ML
    Compute -.optional.-> FastAPI
    Storage -.optional.-> ML

    style React fill:#61dafb
    style FastAPI fill:#009688
    style ML fill:#ff6b6b
    style Cloud fill:#f4f4f4
```
```
cloud-security-ai/
├── infra/                # Terraform infrastructure
├── backend/              # FastAPI backend
│   ├── main.py           # Entry point
│   ├── train.py          # Model training
│   └── app/routers/      # API endpoints
├── frontend/             # React frontend
│   └── src/components/   # UI components
├── docker-compose.yml    # Docker orchestration
└── Makefile              # Common commands
```
```mermaid
flowchart LR
    Start([Start]) --> Choose{Setup?}
    Choose -->|Docker| D[make docker-up]
    Choose -->|Local| L[make setup-dev]
    D --> Run[Running]
    L --> Run
    Run --> Code[Edit Code]
    Code --> Test[Test]
    Test --> Works{Works?}
    Works -->|Yes| Deploy[Deploy]
    Works -->|No| Debug[Debug]
    Debug --> Code
    Deploy --> Done([Done])

    style Start fill:#4CAF50
    style Run fill:#2196F3
    style Done fill:#FF9800
```
A closer look at the frontend and remaining root files:

```
frontend/
└── src/
    ├── config.js          # API configuration
    ├── components/        # React components
    │   └── Dashboard.jsx  # Main dashboard
    └── services/          # Service layer
        └── api.js         # API client
Makefile                   # Convenient commands
README.md                  # This file
.gitignore                 # Git ignore rules
```
---
## How It Works
### Prediction Flow
```mermaid
sequenceDiagram
    participant User
    participant Frontend
    participant Backend
    participant ML Model
    User->>Frontend: Enter security data
    Frontend->>Backend: POST /api/predict
    Backend->>ML Model: Run inference
    ML Model-->>Backend: Prediction result
    Backend-->>Frontend: JSON response
    Frontend-->>User: Display result
```
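The flow above can be exercised from a small script. This is a minimal, standard-library-only sketch: the `/api/predict` path and the `{"features": [...]}` payload come from the examples elsewhere in this README, while the helper names are illustrative, not part of the project's code.

```python
import json
import urllib.request

def build_predict_request(features, base_url="http://localhost:8000"):
    """Build the POST /api/predict request the frontend's API client sends."""
    body = json.dumps({"features": features}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/predict",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def predict(features, base_url="http://localhost:8000"):
    """Send the request and decode the backend's JSON response."""
    with urllib.request.urlopen(build_predict_request(features, base_url)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With the backend running, `predict([1.5, 2.3, 4.1, 0.8])` returns the same JSON the dashboard displays.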
Before starting, ensure you have:
- OCI Account - Oracle Cloud Infrastructure account (Sign up for free)
- OCI CLI - Installed and configured (`oci setup config`)
- Terraform - Version 1.0+ (Download)
- Python 3.13+ - For backend development
- Node.js 18+ - For frontend development
- SSH Key Pair - For accessing the compute instance
- Run OCI setup (if not already done):

```bash
oci setup config
```

Follow the prompts to configure your profile (the DEFAULT profile will be used).

- Generate an SSH key pair:

```bash
ssh-keygen -t rsa -b 4096 -f ~/.ssh/hackathon_key
```

This creates:
- `~/.ssh/hackathon_key` (private key)
- `~/.ssh/hackathon_key.pub` (public key)
- Copy the example file:

```bash
cd infra
cp terraform.tfvars.example terraform.tfvars
```

- Edit `terraform.tfvars` and fill in the required values:

```hcl
# REQUIRED - Get from OCI Console
tenancy_ocid = "ocid1.tenancy.oc1..aaaaaaxxxxxx"

# REQUIRED - Your SSH public key
ssh_public_key = "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDxxxxx..."

# OPTIONAL - Adjust as needed
region           = "us-ashburn-1"
compartment_name = "hackathon-cloud-security-ai"
```

How to get your Tenancy OCID:
- Log in to OCI Console → Click your profile icon → Tenancy: <name> → Copy OCID
How to get your SSH public key:

```bash
cat ~/.ssh/hackathon_key.pub
```

Then provision the infrastructure:

```bash
# Initialize Terraform
make init

# Review the execution plan
make plan

# Create resources in OCI
make apply
```

Note: Type `yes` when prompted to confirm.
This will create:
- ✅ Compartment for organizing resources
- ✅ VCN with public subnet and Internet Gateway
- ✅ Object Storage bucket for ML models/data
- ✅ Compute instance (Always Free tier)

Expected time: 2-5 minutes

```bash
make output
```

Save the `instance_public_ip` - you'll need it for deployment!
- Install Python dependencies:

```bash
cd backend
pip install -r requirements.txt
```

- Configure environment:

```bash
cp .env.example .env
# Edit .env if needed
```

- Train the ML model:

```bash
python train.py
```

- Run the backend locally:

```bash
make run-backend
```

The API will be available at http://localhost:8000
- Docs: http://localhost:8000/docs
- Health: http://localhost:8000/health
- Install Node dependencies:

```bash
cd frontend
npm install
```

- Configure environment:

```bash
cp .env.example .env
```

Edit `.env`:

```
VITE_API_URL=http://localhost:8000
```

- Run the frontend:

```bash
npm run dev
```

The frontend will be available at http://localhost:5173
Deploy the backend to your OCI compute instance:
```bash
make deploy-backend
```

This will:
- Copy backend files to the instance
- Build a Docker container
- Run the backend on port 8000

Update the frontend `.env` to use the OCI instance:

```
VITE_API_URL=http://<instance_public_ip>:8000
```

Infrastructure commands:

```bash
make init            # Initialize Terraform
make plan            # Show execution plan
make apply           # Create/update resources
make destroy         # Destroy all resources
make output          # Show outputs (IPs, IDs, etc.)
```

Backend commands:

```bash
make run-backend     # Run backend locally
make train-model     # Train ML model
make test-backend    # Test backend API
```

Frontend commands:

```bash
make run-frontend    # Run frontend locally
make build-frontend  # Build for production
```

Deployment commands:

```bash
make ssh-vm          # SSH into OCI instance
make deploy-backend  # Deploy backend to OCI
```

Development commands:

```bash
make setup-dev       # Set up development environment
make clean           # Clean build artifacts
```

Test the API:

- Health check:
```bash
curl http://localhost:8000/health
```

- Make a prediction:

```bash
curl -X POST http://localhost:8000/api/predict \
  -H "Content-Type: application/json" \
  -d '{"features": [1.5, 2.3, 4.1, 0.8]}'
```

- Get model info:

```bash
curl http://localhost:8000/api/model/info
```

To use the dashboard:
- Open http://localhost:5173 in your browser
- Enter feature values in the dashboard
- Click "Predict" to get results
- View the prediction and confidence score
- ✅ Basic security for quick deployment
- ⚠️ Firewall allows all IPs (0.0.0.0/0)
- ⚠️ CORS allows all origins
Update the following:
1. Restrict SSH Access (`infra/main.tf`):

```hcl
ingress_security_rules {
  source = "YOUR_IP_ADDRESS/32"  # Only your IP
  # ... rest of SSH rule
}
```

2. Restrict Backend Access:
- Use a reverse proxy (Nginx)
- Enable HTTPS with SSL certificates
- Restrict CORS origins
3. API Security:
- Add authentication (JWT tokens)
- Implement rate limiting
- Use API keys for external access
4. Environment Variables:
- Never commit `.env` or `.tfvars` files
- Use OCI Secrets for sensitive data
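The rate-limiting step above can be implemented in many ways; a token bucket is one common pattern. The class below is an illustrative, dependency-free sketch, not code from this repository:

```python
import time

class TokenBucket:
    """Allow bursts of up to `capacity` requests, refilling `rate` tokens per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if a request may proceed, consuming one token."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

In the backend, a dependency could keep one bucket per client IP and respond with HTTP 429 whenever `allow()` returns False.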
The project includes a dummy Random Forest classifier for demonstration. To use your own model:
- Prepare your training data (CSV format)
- Train the model:

```bash
cd backend
python train.py --data your_data.csv --output ./app/ml_models/model.joblib
```

- Restart the backend
The model manager (`backend/app/ml_models/model_manager.py`) can be extended to support different model types (XGBoost, Neural Networks, etc.).
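As a sketch of that extension point, a model manager could dispatch on file extension to a registered loader. The class below is illustrative only and does not reproduce the project's actual `model_manager.py`:

```python
from pathlib import Path

class ModelManager:
    """Load a serialized model and serve predictions, dispatching on file extension."""

    def __init__(self):
        self._loaders = {}  # file extension -> loader callable
        self.model = None

    def register_loader(self, extension, loader):
        """Register a loader, e.g. register_loader(".joblib", joblib.load)."""
        self._loaders[extension] = loader

    def load(self, path):
        ext = Path(path).suffix
        if ext not in self._loaders:
            raise ValueError(f"no loader registered for {ext!r}")
        self.model = self._loaders[ext](path)
        return self.model

    def predict(self, features):
        if self.model is None:
            raise RuntimeError("no model loaded")
        return self.model.predict([features])[0]
```

Registering `joblib.load` for `.joblib` would cover the bundled Random Forest; an XGBoost or neural-network loader can be added the same way without touching the API routes.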
| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Basic health check |
| `/health/detailed` | GET | Detailed health with metrics |
| `/api/predict` | POST | Make a prediction |
| `/api/predict/batch` | POST | Batch predictions |
| `/api/model/info` | GET | Model information |
| `/api/ingest` | POST | Ingest JSON data |
| `/api/ingest/file` | POST | Upload CSV file |
| `/api/ingest/stats` | GET | Ingestion statistics |
Problem: "Error: 401-NotAuthenticated"
Solution: Run `oci setup config` to configure credentials

Problem: "Error: compartment not found"
Solution: Ensure `tenancy_ocid` is correct in `terraform.tfvars`

Problem: "Import errors" when running backend
Solution: Install dependencies: `pip install -r requirements.txt`

Problem: "Port 8000 already in use"
Solution: Kill the process: `lsof -ti:8000 | xargs kill -9`
Problem: "Cannot connect to backend"
Solution:
- Check backend is running
- Verify `VITE_API_URL` in `.env`
- Check CORS settings in backend
Problem: "npm install fails"
Solution: Delete `node_modules` and try again
Problem: "Permission denied" when SSH to instance
Solution:
- Check key permissions: `chmod 400 ~/.ssh/hackathon_key`
- Verify public key in `terraform.tfvars`
Problem: "Docker command not found" on instance Solution: Wait 2-3 minutes for cloud-init to complete
This project is created for the Oracle Hackathon 2025. Feel free to use and modify for your hackathon submissions.
- Application: Review the individual README files in `backend/` and `frontend/`
Before your demo:
- Infrastructure provisioned successfully
- Backend running on OCI instance
- Frontend can connect to backend
- ML model trained and loaded
- Test prediction works end-to-end
- Screenshots/demo video prepared
- Code committed to Git (without secrets!)
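The end-to-end items in the checklist above can be verified with a small script before the demo. A stdlib-only sketch (function names are illustrative; the payload mirrors this README's prediction example):

```python
import json
import urllib.request

def endpoints_to_check(base="http://localhost:8000"):
    """(method, url, payload) triples for the pre-demo checks."""
    return [
        ("GET", f"{base}/health", None),
        ("POST", f"{base}/api/predict", {"features": [1.5, 2.3, 4.1, 0.8]}),
    ]

def smoke_check(base="http://localhost:8000"):
    """Return True if every endpoint answers with HTTP 200."""
    for method, url, payload in endpoints_to_check(base):
        data = json.dumps(payload).encode("utf-8") if payload is not None else None
        req = urllib.request.Request(
            url, data=data, method=method,
            headers={"Content-Type": "application/json"} if data else {},
        )
        with urllib.request.urlopen(req) as resp:
            if resp.status != 200:
                return False
    return True
```

Run `smoke_check()` against localhost first, then against `http://<instance_public_ip>:8000` once the backend is deployed to OCI.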