
Insurance Cross-Selling Agentic System

A multi-agent system for intelligent insurance cross-selling built with Google's Agent Development Kit (ADK), Model Context Protocol (MCP) servers, and agent-to-agent communication. This system orchestrates insurance cross-selling opportunities by analyzing customer data, identifying suitable products, and coordinating customer communications through specialized AI agents.

This showcase demonstrates the capabilities of the Agentic Layer platform for building complex multi-agent AI systems. Further information about the Agentic Layer can be found in our documentation.




Introduction

In this showcase, a host agent uses tools and calls other agents to fulfill user requests. The agents interact in German; the example conversations below are shown in translation:

Conversation 1:

User: Please prepare a customer meeting with our customer Anna Müller.

Agent: I'm happy to help you prepare the meeting. Here are options for different sales strategies: …

User: Please send the customer an email with an agenda.

Agent: Done.

Conversation 2:

User: Please prepare a customer meeting with our customer Thomas Schmidt. Also send the customer a reminder email that the meeting is taking place.

Agent: Here is the preparation for your meeting. I have sent the reminder email: …


Prerequisites

The following tools and dependencies are required to run this project:

  • Python 3.13+: Required for all agent components and MCP servers
  • Google Cloud SDK: For ADK and Vertex AI integration
  • uv 0.5.0+: Python package manager for dependency management
  • Tilt: Kubernetes development environment orchestration
  • Docker: For containerization and local Kubernetes
  • Google Cloud Account: With access to Vertex AI or Google AI APIs
  • Slack Bot Token (optional): For communications agent integration
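Before continuing, it can help to confirm the required CLIs are installed. This is a minimal sketch, assuming the tool names above match the executable names on PATH (versions are not verified):

```python
# Quick sanity check that the required CLIs are discoverable on PATH.
# Tool names are assumed to match the executables; versions are not checked.
import shutil

TOOLS = ["python3", "gcloud", "uv", "tilt", "docker"]
report = {tool: "ok" if shutil.which(tool) else "missing" for tool in TOOLS}
for tool, status in report.items():
    print(f"{tool}: {status}")
```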

Getting Started

1. Install Dependencies

# Install system dependencies via Homebrew
brew bundle
# Install Python dependencies
uv sync --directory mcp-servers

2. Authentication Setup

# Authenticate with Google Cloud for AI model access
gcloud auth application-default login

3. Environment Configuration

Create a .env file in the root directory with the following content:

# Google Cloud Configuration
GOOGLE_GENAI_USE_VERTEXAI=FALSE
GOOGLE_CLOUD_PROJECT=qaware-paal
GOOGLE_CLOUD_LOCATION=europe-west3
GOOGLE_API_KEY=your-google-api-key

# LiteLLM API key; defaults to the master key (optional)
LITELLM_PROXY_API_KEY=sk-your-api-key
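A small checker can catch typos in the file before the stack is started. `parse_env` and `REQUIRED` below are illustrative helpers, not part of the repo:

```python
# Hypothetical .env sanity check: confirms the keys above are defined.
import pathlib

REQUIRED = ["GOOGLE_GENAI_USE_VERTEXAI", "GOOGLE_CLOUD_PROJECT",
            "GOOGLE_CLOUD_LOCATION", "GOOGLE_API_KEY"]

def parse_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env: dict[str, str] = {}
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

if pathlib.Path(".env").exists():
    missing = [key for key in REQUIRED if key not in parse_env()]
    print("missing keys:", missing or "none")
```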

4. Start the Application

Launch all services using Tilt:

# Start core agents and MCP servers
tilt up

Tilt Profiles

Optional components can be enabled using profiles. Specify one or more profiles with --profile:

# Start with LibreChat UI
tilt up -- --profile librechat

# Start with Testbench for RAGAS evaluations
tilt up -- --profile testbench

# Combine multiple profiles
tilt up -- --profile librechat --profile testbench
Profiles:

  • librechat: Deploys a LibreChat instance as a chat UI for interacting with the insurance host agent. Available at http://localhost:11003.
  • testbench: Deploys the Testbench with TestKube for running RAGAS evaluation TestWorkflows against the agents.


Helm Chart

This project provides a Helm chart for deploying the showcase to Kubernetes clusters.

Installing from OCI Registry

The Helm chart is published to GitHub Container Registry for each release tag. The Agentic Layer components must be installed first; see https://docs.agentic-layer.ai.

# Install the latest release
helm install showcase-cross-selling \
  oci://ghcr.io/agentic-layer/charts/showcase-cross-selling \
  --version 0.6.0 \
  --namespace showcase-cross-selling \
  --create-namespace

Development

Developer Setup

For detailed contributing guidelines, refer to the global contributing guide.

Mandatory first step for contributors:

# Activate pre-commit hooks
pre-commit install

Code Quality Standards

Code Style:

  • Linting: Ruff with 120 character line limit
  • Type Checking: mypy for static type analysis
  • Security: Bandit for security vulnerability detection
  • Import Organization: import-linter for dependency management

Development Commands:

# Run all quality checks
uv run --directory mcp-servers poe check

# Individual checks
uv run --directory mcp-servers poe mypy          # Type checking
uv run --directory mcp-servers poe ruff          # Linting and formatting
uv run --directory mcp-servers poe bandit        # Security analysis
uv run --directory mcp-servers poe lint-imports  # Import dependency validation
uv run --directory mcp-servers poe test          # Execute test suite

# Auto-formatting
uv run --directory mcp-servers poe format        # Code formatting
uv run --directory mcp-servers poe lint          # Auto-fix linting issues

End-to-End (E2E) Testing

Running E2E Tests

Execute the end-to-end test suite to validate the complete agent workflow:

# Run the cross-selling conversation test
./test/e2e/a2a-message.sh

Prerequisites for E2E Tests:

  • All services must be running (tilt up)

Test Coverage:

  • Cross-selling strategy generation for customer Anna Müller
  • Agent-to-agent communication validation
  • OpenAI-compatible API endpoint functionality
  • Response content validation for German language interactions

Running RAGAS Evaluation TestWorkflows

The project includes TestKube TestWorkflows for automated RAGAS evaluation of agents. To run them manually:

# Run the cross-selling agent evaluation
testkube run tw cross-selling-ragas-evaluation

# Run the insurance host agent evaluation
testkube run tw insurance-host-ragas-evaluation

Results can be viewed in the Workflow Evaluations Dashboard in Grafana (http://localhost:11000).

Prerequisites:

  • All agent services must be running (tilt up)
  • TestKube CLI must be installed

Testing Tools and Their Configuration

Testing Framework

Primary Tool: Bash/cURL Integration Tests

  • Location: test/e2e/a2a-message.sh
  • Configuration: Tests use OpenAI-compatible API endpoints
  • Validation: Response content matching using grep with German keywords

Example Test Configuration:

# API endpoint configuration
API_ENDPOINT="http://localhost:11002/api/v1/chat/completions"
MODEL_NAME="insurance_host_agent"
TIMEOUT="90"  # seconds
# Content validation patterns
EXPECTED_PATTERNS="cust001\|cross.sell\|strategie\|kunde"
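The grep-based content validation can be mirrored in Python to see how the patterns behave. `response_text` below is a made-up sample payload, not real agent output; note that the `.` in `cross.sell` also matches the hyphen in "Cross-Selling":

```python
# Mirrors the grep keyword check from test/e2e/a2a-message.sh.
# response_text is a fabricated sample; the real script checks the live API response.
import re

EXPECTED_PATTERNS = r"cust001|cross.sell|strategie|kunde"
response_text = "Hier ist eine Cross-Selling-Strategie für Kundin Anna Müller (cust001)."

# Case-insensitive search, like grep -i with BRE alternation (\|).
matched = re.search(EXPECTED_PATTERNS, response_text, re.IGNORECASE) is not None
print("PASS" if matched else "FAIL")  # → PASS
```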

Sample Data

Customer CRM Data

The system includes mock customer data accessible through the Customer CRM MCP server:

Sample Customer Record (Anna Müller):

{
  "customer_id": "cust001",
  "name": "Anna Müller",
  "current_policies": [
    "auto_insurance",
    "home_insurance"
  ],
  "demographics": {
    "age": 35,
    "location": "Munich",
    "income_level": "middle"
  }
}
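For illustration, the record above maps onto a typed model. This dataclass sketch is hypothetical and may differ from the repo's actual schema:

```python
# Hypothetical typed view of the CRM record above (not the repo's real model).
from dataclasses import dataclass

@dataclass
class Demographics:
    age: int
    location: str
    income_level: str

@dataclass
class Customer:
    customer_id: str
    name: str
    current_policies: list[str]
    demographics: Demographics

anna = Customer(
    customer_id="cust001",
    name="Anna Müller",
    current_policies=["auto_insurance", "home_insurance"],
    demographics=Demographics(age=35, location="Munich", income_level="middle"),
)
```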

Sample API Request:

# Test cross-selling recommendation
curl -X POST http://localhost:11002/insurance-host-agent \
    -H "Content-Type: application/json" \
    -d '{
      "jsonrpc": "2.0",
      "id": 1,
      "method": "message/send",
      "params": {
        "message": {
          "role": "user",
          "parts": [
            {
              "kind": "text",
              "text": "Welche Cross-Selling-Möglichkeiten gibt es für unsere Kundin Anna Müller?"
            }
          ],
          "messageId": "9229e770-767c-417b-a0b0-f0741243c589",
          "contextId": "abcd1234-5678-90ab-cdef-1234567890ab"
        },
        "metadata": {"conversationId": "9229e770-767c-417b-a0b0-f0741243c589"}
      }
    }'
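The same JSON-RPC request can be issued from Python using only the standard library. The endpoint, payload fields, and 90-second timeout are taken from the curl example above; the actual call is commented out because it needs the stack running:

```python
# Builds the same JSON-RPC message/send request as the curl example.
import json
import urllib.request

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "parts": [
                {"kind": "text",
                 "text": "Welche Cross-Selling-Möglichkeiten gibt es für unsere Kundin Anna Müller?"}
            ],
            "messageId": "9229e770-767c-417b-a0b0-f0741243c589",
            "contextId": "abcd1234-5678-90ab-cdef-1234567890ab",
        },
        "metadata": {"conversationId": "9229e770-767c-417b-a0b0-f0741243c589"},
    },
}

request = urllib.request.Request(
    "http://localhost:11002/insurance-host-agent",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Requires the stack to be running (tilt up):
# with urllib.request.urlopen(request, timeout=90) as response:
#     print(json.load(response))
```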

Database Seeding: Customer and product data is automatically initialized when MCP servers start. No manual seeding required.

Project Architecture

mcp-servers/
├── Dockerfile              # Single image, server selected via CMD
├── pyproject.toml          # Unified Python project
└── src/
    ├── shared/             # Shared utilities (otel, middleware, response helpers)
    ├── customer_crm/       # Customer relationship management data
    └── insurance_products/ # Insurance product catalog server
