This repository contains a comprehensive collection of Python-based tutorials for getting started with AI development. While primarily focused on Azure OpenAI, it also includes tutorials for other AI platforms like Ollama and Model Context Protocol (MCP) implementations.
Each tutorial builds upon the previous ones, demonstrating progressively advanced concepts and techniques.
Before diving into the tutorials, we strongly recommend reading these two foundational documents in order:
- `LLMs_explained.md`: A comprehensive guide that explains how Large Language Models work, covering essential concepts like tokens, embeddings, and the transformer architecture. This will give you a solid understanding of the technology you'll be working with.
- `ai_solution_building_guide.md`: A practical guide that covers key architectural patterns and best practices for building AI solutions, including:
  - Workflows vs. Agents: Understanding when to use each approach
  - Function calling and tool integration
  - Context engineering fundamentals
  - Design patterns for AI solutions
  - Model Context Protocol (MCP) overview
These documents will provide you with the theoretical foundation and architectural understanding needed to make the most of the tutorials.
- Python 3: Make sure Python 3 is installed on your system
- Virtual Environment: Create and activate a virtual environment:

  ```bash
  python3 -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```
- Dependencies: Install required packages:

  ```bash
  pip install -r requirements.txt
  ```
- Environment Configuration: Copy `.env.dummy` to `.env` and fill in your Azure OpenAI credentials:

  ```bash
  cp .env.dummy .env
  ```
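If you want to sanity-check your configuration before running the tutorials, the `.env` format is simple enough to parse with the standard library alone. This sketch is illustrative: the key names in `REQUIRED_KEYS` are assumptions, so check `.env.dummy` for the names the tutorials actually use.

```python
from pathlib import Path

def parse_env_file(path):
    """Parse simple KEY=VALUE lines from a .env-style file into a dict.

    Blank lines and '#' comments are skipped; surrounding quotes on
    values are removed.
    """
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

# Hypothetical key names -- check .env.dummy for the actual ones.
REQUIRED_KEYS = ["AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_API_KEY"]

def missing_keys(env):
    """Return the required keys that are absent or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```

In practice a library such as python-dotenv handles this more robustly; the sketch only shows what the file format contains.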
| # | Tutorial Files | Purpose | Description |
|---|---|---|---|
| 1 | `01_ask_question_get_ans_azure_api.py`<br>`01_ask_question_get_ans_azure_api.ipynb` | Ask a question and get an answer via the Azure OpenAI library | Uses the Azure OpenAI Python library to demonstrate basic question-answer interactions. |
| 2 | `02_message_roles.py`<br>`02_message_roles.ipynb` | Understanding Message Roles and Input Formats | Explains the different ways to send input to Azure OpenAI models, and the various message roles (developer, user, assistant). |
| 3 | `03_conversational_chat.py`<br>`03_coversational_chat.ipynb` | Conversational Chat with Azure OpenAI | Builds upon single question-answer interactions to create a conversational chatbot. The AI maintains context throughout the conversation using message history. |
| 4 | `04_conversational_chat_with_token_limit_handling.py`<br>`04_coversational_chat_with_token_limit_handling.ipynb` | Conversational Chat with Token Limit Handling | Addresses the challenge of growing conversation history consuming more tokens. Implements smart token limit handling and conversation pruning mechanisms. |
| 5 | `05_server_side_conversation_management.py`<br>`05_server_side_conversation_management.ipynb` | Server-Side Conversation Management | Demonstrates how to manage conversations on the server side. |
| 6 | `06_few_shot_prompting.py`<br>`06_few_shot_prompting.ipynb` | Few-Shot Prompting | Demonstrates the few-shot prompting technique. |
| 7 | `07_streaming_responses.py`<br>`07_streaming_responses.ipynb` | Streaming Responses | Learn how to provide immediate feedback to users as the AI generates responses. |
| 8 | `08_chatbot_for_document.py`<br>`08_chatbot_for_document.ipynb` | Chatbot for Document | Implements a chatbot that can answer questions by referencing a specific document. |
| 9 | `09_structured_outputs.py`<br>`09_structured_outputs.ipynb` | Structured Outputs | Shows how to generate structured JSON outputs from AI models. |
| 10 | `10_function_calling.py`<br>`10_function_calling.ipynb` | Function Calling | Demonstrates how to extend AI capabilities by allowing models to call external functions. |
| 11 | `11_code_interpreter.py`<br>`11_code_interpreter.ipynb` | Code Interpreter | Directs LLMs to write Python scripts to solve mathematical and statistical problems and accurately perform data analysis. |
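The core idea behind tutorial 04's token limit handling can be sketched without any API: keep the system message, estimate each message's token cost, and drop the oldest turns until the history fits a budget. The 4-characters-per-token estimate and the function names below are illustrative assumptions, not the tutorial's actual code:

```python
def estimate_tokens(text):
    """Very rough token estimate (~4 characters per token for English)."""
    return max(1, len(text) // 4)

def prune_history(messages, max_tokens):
    """Drop the oldest non-system messages until the estimated total fits.

    The system message (if present and first) is always kept so the
    model retains its instructions.
    """
    system = messages[:1] if messages and messages[0]["role"] == "system" else []
    rest = messages[len(system):]

    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while rest and total(system + rest) > max_tokens:
        rest.pop(0)  # oldest turn is dropped first
    return system + rest
```

Real implementations typically count tokens with the model's own tokenizer (e.g. tiktoken) instead of estimating, but the pruning loop has the same shape.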
Complete tutorial series for running AI models locally using Ollama:
| # | Tutorial Files | Purpose | Description |
|---|---|---|---|
| 1 | `01_ask_question_get_ans_ollama.*` | Basic Ollama Usage | Getting started with Ollama for local AI model execution |
| 2 | `02_conversational_chat_ollama.*` | Conversational AI | Building chatbots with locally running Ollama models |
| 3 | `03_few_shot_prompting_ollama.*` | Advanced Prompting | Few-shot prompting techniques with Ollama |
| 4 | `04_thinking_model_ollama.*` | Advanced Reasoning | Using thinking/reasoning models with Ollama |
| 5 | `05_streaming_ollama.*` | Real-time Responses | Implementing streaming responses with Ollama |
| 6 | `06_thinking_levels_ollama.*` | Advanced Reasoning | Different levels of thinking and reasoning with Ollama models |
| 7 | `07_structured_outputs_ollama.*` | Structured Output | Generating structured JSON outputs with Ollama |
| 8 | `08_function_calling_ollama.*` | Function Calling | Implementing function calling capabilities with Ollama |
| 9 | `09_remote_ollama.*` | Remote Access | Connecting to and using remote Ollama instances |
Tutorials for implementing MCP servers and integrating with AI systems:
| # | Tutorial Files | Purpose | Description |
|---|---|---|---|
| 1 | `01_local-mcp-server-fastmcp.py` | Local MCP Server | Creating a local MCP server using the FastMCP library |
| 2 | `02_http-mcp-server-fastmcp.py` | HTTP MCP Server | Implementing an HTTP-based MCP server for remote access |
| 3 | `03_run_with_docker.md` | Docker Deployment | Guide for running MCP servers in Docker containers |
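For orientation, a minimal FastMCP server mostly amounts to registering plain Python functions as tools. The sketch below is an assumption-laden illustration (the server name and tool are made up, not taken from the tutorial files) and degrades gracefully when `fastmcp` is not installed so the tool logic itself stays runnable:

```python
def add(a: int, b: int) -> int:
    """Add two integers (a tool's docstring becomes its description)."""
    return a + b

try:
    from fastmcp import FastMCP
    mcp = FastMCP("demo-server")  # illustrative server name
    mcp.tool(add)                 # register the plain function as an MCP tool
except ImportError:
    mcp = None  # fastmcp not installed; the plain function still works

# To serve the tool to an MCP client (stdio transport by default):
# if mcp is not None:
#     mcp.run()
```

See `01_local-mcp-server-fastmcp.py` for the repository's actual server.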
- `LLMs_explained.md`: Essential reading that provides a deep dive into how Large Language Models work, including:
  - Tokens and embeddings
  - Transformer architecture
  - Training and fine-tuning processes
  - Model behavior and limitations
- `ai_solution_building_guide.md`: Must-read guide for designing AI solutions, covering:
  - Workflows vs. Agents architecture patterns
  - Function calling and tool integration
  - Context engineering best practices
  - Model Context Protocol (MCP) implementation
  - DevOps considerations and challenges
- `requirements.txt`: Lists all Python dependencies needed for the tutorials
- `.env.dummy`: Template for environment variable configuration (rename to `.env` and fill in your credentials)
- `.gitignore`: Prevents sensitive files and local development artifacts from being committed
- `test_document.txt`: Sample document used in the document-based chatbot tutorials
- `dummy_build_data.json`: Sample data file used in the code interpreter examples
- `images/`: Folder containing diagrams and screenshots used in documentation
- `13_ollama/`: Complete tutorial series for using Ollama to run AI models locally
- `12_mcp/`: Tutorials and examples for implementing Model Context Protocol servers
- Understanding LLMs (`LLMs_explained.md`): Master the fundamentals of how Large Language Models work, from tokenization to transformer architecture. This knowledge is crucial for making informed decisions about model selection and implementation.
- Building AI Solutions (`ai_solution_building_guide.md`): Learn proven patterns and practices for creating reliable AI systems:
  - Workflows vs. Agents: Choose the right architecture for your use case
  - Design Patterns: Industry-tested patterns for building reliable AI systems
  - Function Calling: Extend AI capabilities through tool integration
  - Context Engineering: Master the art of providing the right context
  - MCP Integration: Implement standardized tool communication
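To make the function-calling pattern above concrete: the model never executes anything itself. Your code advertises a JSON schema for each tool and dispatches when the model requests a call. The schema shape below follows the widely used OpenAI-style tools format; the `get_weather` tool and its canned data are invented for illustration:

```python
import json

# OpenAI-style tool declaration sent to the model alongside the prompt.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def get_weather(city):
    """Stand-in implementation returning canned data."""
    return {"city": city, "temp_c": 21}

# Map tool names the model may request to local implementations.
REGISTRY = {"get_weather": get_weather}

def dispatch(tool_call):
    """Run the function the model asked for; return its result as JSON."""
    fn = REGISTRY[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return json.dumps(fn(**args))
```

The JSON result is then appended to the conversation as a tool message so the model can compose its final answer; tutorial 10 shows the full round trip.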
- Azure OpenAI: All core tutorials (01-11) demonstrate Azure OpenAI integration with proper authentication and configuration
- Ollama: Local AI development with privacy and control (13_ollama/ series)
- Model Context Protocol: Building extensible AI systems that can integrate with various tools and services (12_mcp/ series)
Each tutorial file contains detailed comments explaining the concepts and implementation. The Jupyter notebook versions provide an interactive learning experience with explanations and outputs.
```
azure-open-ai/
├── Core Azure OpenAI Tutorials (01-11)
│   ├── 01_ask_question_get_ans_azure_api.*
│   ├── 02_message_roles.*
│   ├── 03_conversational_chat.*
│   ├── 04_conversational_chat_with_token_limit_handling.*
│   ├── 05_server_side_conversation_management.*
│   ├── 06_few_shot_prompting.*
│   ├── 07_streaming_responses.*
│   ├── 08_chatbot_for_document.*
│   ├── 09_structured_outputs.*
│   ├── 10_function_calling.*
│   └── 11_code_interpreter.*
├── 12_mcp/                          # Model Context Protocol
│   ├── 01_local-mcp-server-fastmcp.py
│   ├── 02_http-mcp-server-fastmcp.py
│   ├── 03_run_with_docker.md
│   ├── Dockerfile
│   ├── README.md
│   ├── requirements.txt
│   └── screenshots/
├── 13_ollama/                       # Local AI development with Ollama
│   ├── 01_ask_question_get_ans_ollama.*
│   ├── 02_conversational_chat_ollama.*
│   ├── 03_few_shot_prompting_ollama.*
│   ├── 04_thinking_model_ollama.*
│   ├── 05_streaming_ollama.*
│   ├── 06_thinking_levels_ollama.*
│   ├── 07_structured_outputs_ollama.*
│   ├── 08_function_calling_ollama.*
│   ├── 09_remote_ollama.*
│   ├── README.md
│   └── requirements.txt
├── images/                          # Documentation assets and diagrams
├── ai_solution_building_guide.md    # Architecture and design guide
├── requirements.txt                 # Main dependencies
├── test_document.txt                # Sample document for tutorials
├── dummy_build_data.json            # Sample data for examples
└── .env.dummy                       # Configuration template
```