# HITL #488 (Closed)
@@ -0,0 +1,146 @@
**IMPORTANT!** All samples and other resources made available in this GitHub repository ("samples") are designed to assist in accelerating development of agents, solutions, and agent workflows for various scenarios. Review all provided resources and carefully test output behavior in the context of your use case. AI responses may be inaccurate and AI actions should be monitored with human oversight. Learn more in the transparency documents for [Agent Service](https://learn.microsoft.com/en-us/azure/ai-foundry/responsible-ai/agents/transparency-note) and [Agent Framework](https://github.com/microsoft/agent-framework/blob/main/TRANSPARENCY_FAQ.md).

Agents, solutions, or other output you create may be subject to legal and regulatory requirements, may require licenses, or may not be suitable for all industries, scenarios, or use cases. By using any sample, you acknowledge that any output created using those samples is solely your responsibility, and that you will comply with all applicable laws, regulations, and relevant safety standards, terms of service, and codes of conduct.

Third-party samples contained in this folder are subject to their own designated terms, and they have not been tested or verified by Microsoft or its affiliates.

Microsoft has no responsibility to you or others with respect to any of these samples or any resulting output.

# What this sample demonstrates

This sample demonstrates how to build a Microsoft Agent Framework chat agent that can use **Foundry tools**
(for example, web search and MCP tools), host it using the
[Azure AI AgentServer SDK](https://pypi.org/project/azure-ai-agentserver-agentframework/),
and deploy it to Microsoft Foundry using the Azure Developer CLI [ai agent](https://learn.microsoft.com/en-us/azure/ai-foundry/agents/concepts/hosted-agents?view=foundry&tabs=cli#create-a-hosted-agent) extension.

## How It Works

### Foundry tools integration

In [main.py](main.py), the agent is created using `AzureOpenAIChatClient` and is configured with
`FoundryToolsChatMiddleware`. The middleware enables tool usage via Foundry-supported tool types:

- `web_search_preview` (a Foundry-configured tool)
- `mcp` (a connected MCP tool, configured with a Foundry project connection id)
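The tool list is assembled from environment variables: web search is always enabled, and the connected MCP tool is added only when `AZURE_AI_PROJECT_TOOL_CONNECTION_ID` is set. A minimal sketch of that logic (the `build_foundry_tools` helper name is illustrative, not part of the sample):

```python
import os

def build_foundry_tools() -> list[dict]:
    # Web search is always on; the MCP tool is appended only when a
    # Foundry project connection id is provided via the environment.
    tools = [{"type": "web_search_preview"}]
    if connection_id := os.environ.get("AZURE_AI_PROJECT_TOOL_CONNECTION_ID"):
        tools.append({"type": "mcp", "project_connection_id": connection_id})
    return tools
```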

### Agent Hosting

The agent is hosted using the [Azure AI AgentServer SDK](https://pypi.org/project/azure-ai-agentserver-agentframework/),
which provisions a REST API endpoint compatible with the OpenAI Responses protocol. This allows interaction with the agent using OpenAI Responses compatible clients.

### Agent Deployment

The hosted agent can be seamlessly deployed to Microsoft Foundry using the Azure Developer CLI [ai agent](https://learn.microsoft.com/en-us/azure/ai-foundry/agents/concepts/hosted-agents?view=foundry&tabs=cli#create-a-hosted-agent) extension.
The extension builds a container image, pushes it to Azure Container Registry (ACR), and creates a hosted agent version and deployment on Microsoft Foundry.

## Running the Agent Locally

### Prerequisites

Before running this sample, ensure you have:

1. **Azure OpenAI Service**
   - Endpoint configured
   - Chat model deployed (e.g., `gpt-4o-mini` or `gpt-4`)
   - Note your endpoint URL and deployment name

2. **Azure AI Foundry Project**
   - Project created in [Azure AI Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/what-is-foundry?view=foundry#microsoft-foundry-portals)
   - Add the 'Microsoft Learn' MCP tool from the Foundry tool catalog.
     ![microsoft_learn](microsoft_learn.png)

3. **Azure CLI**
   - Installed and authenticated
   - Run `az login` and verify with `az account show`

4. **Python 3.10 or higher**
   - Verify your version: `python --version`
   - If you have Python 3.9 or older, install a newer version:
     - Windows: `winget install Python.Python.3.12`
     - macOS: `brew install python@3.12`
     - Linux: Use your package manager

### Environment Variables

Set the following environment variables:

- `AZURE_OPENAI_ENDPOINT` - Your Azure OpenAI endpoint URL (required)
- `AZURE_OPENAI_CHAT_DEPLOYMENT_NAME` - The deployment name for your chat model (required)
- `AZURE_AI_PROJECT_ENDPOINT` - Your Azure AI Foundry project endpoint (required)
- `AZURE_AI_PROJECT_TOOL_CONNECTION_ID` - Foundry project connection id used to configure the `mcp` tool (optional)

This sample loads environment variables from a local `.env` file if present.
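For example, a local `.env` file with placeholder values might look like:

```
AZURE_OPENAI_ENDPOINT=https://your-openai-resource.openai.azure.com/
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=gpt-4o-mini
AZURE_AI_PROJECT_ENDPOINT=https://{resource}.services.ai.azure.com/api/projects/{project-name}
AZURE_AI_PROJECT_TOOL_CONNECTION_ID=<your-tool-connection-id>
```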

**Finding your tool connection id** (portal names may vary):
1. Go to [Azure AI Foundry portal](https://ai.azure.com)
2. Navigate to your project -> Build -> Tools
3. Find your connected MCP tool (e.g., "Microsoft Learn")
4. Copy the tool's connection id (shown as the tool's name in some portal views) and set it as `AZURE_AI_PROJECT_TOOL_CONNECTION_ID`

```powershell
# Replace with your actual values
$env:AZURE_OPENAI_ENDPOINT="https://your-openai-resource.openai.azure.com/"
$env:AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="gpt-4o-mini"
$env:AZURE_AI_PROJECT_ENDPOINT="https://{resource}.services.ai.azure.com/api/projects/{project-name}"
$env:AZURE_AI_PROJECT_TOOL_CONNECTION_ID="<your-tool-connection-id>"
```
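On Linux/macOS, the equivalent `bash` exports are:

```bash
# Replace with your actual values
export AZURE_OPENAI_ENDPOINT="https://your-openai-resource.openai.azure.com/"
export AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="gpt-4o-mini"
export AZURE_AI_PROJECT_ENDPOINT="https://{resource}.services.ai.azure.com/api/projects/{project-name}"
export AZURE_AI_PROJECT_TOOL_CONNECTION_ID="<your-tool-connection-id>"
```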

### Installing Dependencies

Install the required Python dependencies using pip:

```powershell
pip install -r requirements.txt
```

### Running the Sample

To run the agent, execute the following command in your terminal:

```powershell
python main.py
```

This will start the hosted agent locally on `http://localhost:8088/`.

### Interacting with the Agent

**PowerShell (Windows):**
```powershell
$body = @{
    input = "How to deploy foundry hosted agents?"
    stream = $false
} | ConvertTo-Json

Invoke-RestMethod -Uri http://localhost:8088/responses -Method Post -Body $body -ContentType "application/json"
```

**Bash/curl (Linux/macOS):**
```bash
curl -sS -H "Content-Type: application/json" -X POST http://localhost:8088/responses \
  -d '{"input": "How to deploy foundry hosted agents?", "stream": false}'
```

The agent may use Foundry tools (for example `web_search_preview` and/or `mcp`) as needed to answer.
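Because the endpoint speaks the Responses protocol over plain HTTP, you can also call it from Python using only the standard library. A sketch (the helper names are illustrative, and it assumes the agent from `python main.py` is running locally):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8088"

def build_responses_request(question: str) -> urllib.request.Request:
    # Same JSON payload as the PowerShell and curl examples above.
    payload = json.dumps({"input": question, "stream": False}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/responses",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask_agent(question: str) -> dict:
    # Requires the locally hosted agent to be running.
    with urllib.request.urlopen(build_responses_request(question)) as resp:
        return json.load(resp)
```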

### Deploying the Agent to Microsoft Foundry

To deploy your agent to Microsoft Foundry, follow the comprehensive deployment guide at https://learn.microsoft.com/en-us/azure/ai-foundry/agents/concepts/hosted-agents?view=foundry&tabs=cli

## Troubleshooting

### Images built on Apple Silicon or other ARM64 machines do not work on our service

We **recommend using `azd` cloud build**, which always builds images with the correct architecture.

If you choose to **build locally**, and your machine is **not `linux/amd64`** (for example, an Apple Silicon Mac), the image will **not be compatible with our service**, causing runtime failures.

**Fix for local builds**

Use this command to build the image locally:

```shell
docker build --platform=linux/amd64 -t image .
```

This forces the image to be built for the required `amd64` architecture.
@@ -1,21 +1,20 @@
 # Unique identifier/name for this agent
-name: agent-with-hosted-mcp
+name: af-agent-with-foundry-tools
 # Brief description of what this agent does
 description: >
   An AI agent that uses Azure OpenAI with a Hosted Model Context Protocol (MCP) server.
   The agent answers questions by searching Microsoft Learn documentation using MCP tools.
 metadata:
   # Categorization tags for organizing and discovering agents
   authors:
-    - Microsoft Agent Framework Team
+    - Microsoft
   tags:
     - Azure AI AgentServer
     - Microsoft Agent Framework
-    - Model Context Protocol
     - MCP
 template:
-  name: agent-with-hosted-mcp
   # The type of agent - "hosted" for HOBO, "container" for COBO
+  name: af-agent-with-foundry-tools
   kind: hosted
   protocols:
     - protocol: responses
@@ -24,6 +23,8 @@ template:
       value: ${AZURE_OPENAI_ENDPOINT}
     - name: AZURE_OPENAI_CHAT_DEPLOYMENT_NAME
       value: "{{chat}}"
+    - name: AZURE_AI_PROJECT_TOOL_CONNECTION_ID
+      value: ""
 resources:
   - kind: model
     id: gpt-4o-mini
@@ -0,0 +1,36 @@
```python
import os

from dotenv import load_dotenv
from agent_framework.azure import AzureOpenAIChatClient

from azure.ai.agentserver.agentframework import from_agent_framework, FoundryToolsChatMiddleware
from azure.identity import DefaultAzureCredential

# Load environment variables from .env file for local development
load_dotenv()


def main():
    required_env_vars = [
        "AZURE_OPENAI_ENDPOINT",
        "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME",
        "AZURE_AI_PROJECT_ENDPOINT",
    ]
    for env_var in required_env_vars:
        assert env_var in os.environ and os.environ[env_var], (
            f"{env_var} environment variable must be set."
        )

    tools = [{"type": "web_search_preview"}]
    if project_tool_connection_id := os.environ.get("AZURE_AI_PROJECT_TOOL_CONNECTION_ID"):
        tools.append({"type": "mcp", "project_connection_id": project_tool_connection_id})

    chat_client = AzureOpenAIChatClient(
        credential=DefaultAzureCredential(),
        middleware=FoundryToolsChatMiddleware(tools),
    )
    agent = chat_client.create_agent(
        name="FoundryToolAgent",
        instructions="You are a helpful assistant with access to various tools.",
    )

    from_agent_framework(agent).run()


if __name__ == "__main__":
    main()
```
@@ -0,0 +1 @@
azure-ai-agentserver-agentframework==1.0.0b9
@@ -106,7 +106,7 @@ def create_agent():

 def main():
     # Run the agent as a hosted agent
-    from_agent_framework(lambda _: create_agent()).run()
+    from_agent_framework(create_agent()).run()
 
 
 if __name__ == "__main__":
@@ -0,0 +1 @@
azure-ai-agentserver-agentframework==1.0.0b9
@@ -40,7 +40,7 @@ def create_agent():

 def main():
     # Run the agent as a hosted agent
-    from_agent_framework(lambda _: create_agent()).run()
+    from_agent_framework(create_agent()).run()
 
 
 if __name__ == "__main__":
@@ -0,0 +1 @@
azure-ai-agentserver-agentframework==1.0.0b9
@@ -152,4 +152,4 @@ def create_agent() -> EchoAgent:
     return agent
 
 if __name__ == "__main__":
-    from_agent_framework(lambda _: create_agent()).run()
+    from_agent_framework(create_agent()).run()
@@ -1,4 +1,4 @@
-azure-ai-agentserver-agentframework==1.0.0b8
+azure-ai-agentserver-agentframework==1.0.0b9
 pytest==8.4.2
 python-dotenv==1.1.1
 azure-monitor-opentelemetry==1.8.1
@@ -44,4 +44,4 @@ def create_agent() -> ChatAgent:
     return agent
 
 if __name__ == "__main__":
-    from_agent_framework(lambda _: create_agent()).run()
+    from_agent_framework(create_agent()).run()
@@ -1,4 +1,4 @@
-azure-ai-agentserver-agentframework==1.0.0b8
+azure-ai-agentserver-agentframework==1.0.0b9
 pytest==8.4.2
 python-dotenv==1.1.1
 azure-monitor-opentelemetry==1.8.1

This file was deleted.
