diff --git a/.github/workflows/docker.yaml b/.github/workflows/docker.yaml
index 7231e08..1c5a2ee 100644
--- a/.github/workflows/docker.yaml
+++ b/.github/workflows/docker.yaml
@@ -8,8 +8,9 @@ on:
       - "v*"
     paths-ignore:
       - "**/*.md"
-      - ".github/workflows/ruff.yaml"
-      - ".github/workflows/track.yml"
+      - ".github/**"
+      - "LICENSE"
+      - ".gitignore"
 
 env:
   REGISTRY: ghcr.io
@@ -26,6 +27,9 @@ jobs:
       - name: Checkout repository
         uses: actions/checkout@v6
 
+      - name: Set up QEMU
+        uses: docker/setup-qemu-action@v3
+
       - name: Set up Docker Buildx
         uses: docker/setup-buildx-action@v3
 
@@ -46,6 +50,7 @@ jobs:
             type=semver,pattern={{version}}
             type=semver,pattern={{major}}.{{minor}}
             type=semver,pattern={{major}}
+            type=raw,value={{date 'YYYYMMDD'}}-{{sha}}
             type=raw,value=latest,enable={{is_default_branch}}
 
       - name: Build and push Docker image
diff --git a/.github/workflows/ruff.yaml b/.github/workflows/ruff.yaml
index 6b9e536..5e13127 100644
--- a/.github/workflows/ruff.yaml
+++ b/.github/workflows/ruff.yaml
@@ -19,12 +19,12 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v6
         with:
-          python-version: "3.12"
+          python-version: "3.13"
 
       - name: Install Ruff
         run: |
           python -m pip install --upgrade pip
-          pip install "ruff>=0.11.7"
+          pip install ruff
 
       - name: Run Ruff
         run: ruff check .
diff --git a/Dockerfile b/Dockerfile
index 938bc2f..62ce9d1 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,16 +1,27 @@
-FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim
+FROM ghcr.io/astral-sh/uv:python3.13-trixie-slim
 
-LABEL org.opencontainers.image.description="Web-based Gemini models wrapped into an OpenAI-compatible API."
+LABEL org.opencontainers.image.title="Gemini-FastAPI" \
+      org.opencontainers.image.description="Web-based Gemini models wrapped into an OpenAI-compatible API."
 
 WORKDIR /app
 
-# Install dependencies
+RUN apt-get update && apt-get install -y --no-install-recommends \
+    tini \
+    && rm -rf /var/lib/apt/lists/*
+
+ENV UV_COMPILE_BYTECODE=1 \
+    PYTHONUNBUFFERED=1 \
+    PYTHONDONTWRITEBYTECODE=1
+
 COPY pyproject.toml uv.lock ./
-RUN uv sync --no-cache --no-dev
+RUN uv sync --no-cache --frozen --no-install-project --no-dev
 
 COPY app/ app/
 COPY config/ config/
 COPY run.py .
 
-# Command to run the application
+EXPOSE 8000
+
+ENTRYPOINT ["/usr/bin/tini", "--"]
+
 CMD ["uv", "run", "--no-dev", "run.py"]
diff --git a/README.md b/README.md
index 330e9c8..6b6f485 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Gemini-FastAPI
 
-[![Python 3.12](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
+[![Python 3.13](https://img.shields.io/badge/python-3.13+-blue.svg)](https://www.python.org/downloads/)
 [![FastAPI](https://img.shields.io/badge/FastAPI-0.115+-green.svg)](https://fastapi.tiangolo.com/)
 [![License](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
 
@@ -24,7 +24,7 @@ Web-based Gemini models wrapped into an OpenAI-compatible API. Powered by [Hanao
 
 ### Prerequisites
 
-- Python 3.12
+- Python 3.13
 - Google account with Gemini access on web
 - `secure_1psid` and `secure_1psidts` cookies from Gemini web interface
 
@@ -74,6 +74,30 @@ python run.py
 
 The server will start on `http://localhost:8000` by default.
 
+## API Endpoints
+
+The server provides several endpoints, including OpenAI-compatible ones.
+
+### OpenAI-Compatible Endpoints
+
+These endpoints are designed to be compatible with OpenAI's API structure, allowing you to use Gemini as a drop-in replacement.
+
+- **`GET /v1/models`**: Lists all supported Gemini models.
+- **`POST /v1/chat/completions`**: Unified chat interface.
+  - **Streaming**: Set `stream: true` to receive real-time delta chunks.
+  - **Multi-modal**: Supports text, images, and file uploads.
+  - **Tool Calling**: Supports function calling via the `tools` parameter.
+  - **Structured Output**: Supports `response_format` for JSON schema enforcement.
+
+### Advanced Endpoints
+
+- **`POST /v1/responses`**: An alternative endpoint for complex interaction patterns, supporting rich output items including generated images and tool calls.
+
+### Utility Endpoints
+
+- **`GET /health`**: Health check endpoint. Returns the status of the server, configured Gemini clients, and conversation storage.
+- **`GET /images/(unknown)`**: Internal endpoint to serve generated images. Requires a valid token (automatically included in image URLs returned by the API).
+
 ## Docker Deployment
 
 ### Run with Options
diff --git a/README.zh.md b/README.zh.md
index 2f9e1b5..d012d32 100644
--- a/README.zh.md
+++ b/README.zh.md
@@ -1,6 +1,6 @@
 # Gemini-FastAPI
 
-[![Python 3.12](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
+[![Python 3.13](https://img.shields.io/badge/python-3.13+-blue.svg)](https://www.python.org/downloads/)
 [![FastAPI](https://img.shields.io/badge/FastAPI-0.115+-green.svg)](https://fastapi.tiangolo.com/)
 [![License](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
 
@@ -24,7 +24,7 @@
 
 ### 前置条件
 
-- Python 3.12
+- Python 3.13
 - 拥有网页版 Gemini 访问权限的 Google 账号
 - 从 Gemini 网页获取的 `secure_1psid` 和 `secure_1psidts` Cookie
 
@@ -74,6 +74,30 @@ python run.py
 
 服务默认启动在 `http://localhost:8000`。
 
+## API 接口
+
+本服务器提供了一系列接口,重点支持 OpenAI 兼容协议。
+
+### OpenAI 兼容接口
+
+这些接口遵循 OpenAI 的 API 规范,允许你将 Gemini 作为 **Drop-in 替代方案** 直接接入现有的 AI 应用。
+
+- **`GET /v1/models`**: 列出所有可用的 Gemini 模型。
+- **`POST /v1/chat/completions`**: 统一聊天对话接口。
+  - **流式传输**: 设置 `stream: true` 即可实时接收增量响应 (Stream Delta)。
+  - **多模态支持**: 支持在消息中包含文本、图片以及文件上传。
+  - **工具调用**: 支持通过 `tools` 参数进行函数调用 (Function Calling)。
+  - **结构化输出**: 支持 `response_format`,可严格遵循 JSON Schema。
+
+### 高级接口
+
+- **`POST /v1/responses`**: 用于复杂交互模式的专用接口,支持分步输出、生成图片及工具调用等更丰富的响应项。
+
+### 辅助与系统接口
+
+- **`GET /health`**: 健康检查接口。返回服务器运行状态、已配置的 Gemini 客户端健康度以及对话存储统计信息。
+- **`GET /images/(unknown)`**: 用于访问生成的图片的内部接口。需携带有效 Token(API 返回的图片 URL 中已自动包含该 Token)。
+
 ## Docker 部署
 
 ### 直接运行
diff --git a/app/main.py b/app/main.py
index f4e6711..20d15b0 100644
--- a/app/main.py
+++ b/app/main.py
@@ -2,7 +2,6 @@
 from contextlib import asynccontextmanager
 
 from fastapi import FastAPI
-from fastapi.responses import ORJSONResponse
 from loguru import logger
 
 from .server.chat import router as chat_router
@@ -43,7 +42,7 @@ async def _run_retention_cleanup(stop_event: asyncio.Event) -> None:
                     stop_event.wait(),
                     timeout=RETENTION_CLEANUP_INTERVAL_SECONDS,
                 )
-            except asyncio.TimeoutError:
+            except TimeoutError:
                 continue
 
     logger.info("LMDB retention cleanup task stopped.")
@@ -93,7 +92,6 @@ def create_app() -> FastAPI:
         description="OpenAI-compatible API for Gemini Web",
         version="1.0.0",
         lifespan=lifespan,
-        default_response_class=ORJSONResponse,
     )
 
     add_cors_middleware(app)
diff --git a/app/models/__init__.py b/app/models/__init__.py
index c6a3640..1378f1f 100644
--- a/app/models/__init__.py
+++ b/app/models/__init__.py
@@ -1 +1,65 @@
-from .models import *  # noqa: F403
+from .models import (
+    ChatCompletionRequest,
+    ChatCompletionResponse,
+    Choice,
+    ContentItem,
+    ConversationInStore,
+    FunctionCall,
+    HealthCheckResponse,
+    Message,
+    ModelData,
+    ModelListResponse,
+    ResponseCreateRequest,
+    ResponseCreateResponse,
+    ResponseImageGenerationCall,
+    ResponseImageTool,
+    ResponseInputContent,
+    ResponseInputItem,
+    ResponseOutputContent,
+    ResponseOutputMessage,
+    ResponseReasoning,
+    ResponseReasoningContentPart,
+    ResponseSummaryPart,
+    ResponseToolCall,
+    ResponseToolChoice,
+    ResponseUsage,
+    Tool,
+    ToolCall,
+    ToolChoiceFunction,
+    ToolChoiceFunctionDetail,
+    ToolFunctionDefinition,
+    Usage,
+)
+
+__all__ = [
+    "ChatCompletionRequest",
+    "ChatCompletionResponse",
+    "Choice",
+    "ContentItem",
+    "ConversationInStore",
+    "FunctionCall",
+    "HealthCheckResponse",
+    "Message",
+    "ModelData",
+    "ModelListResponse",
+    "ResponseCreateRequest",
+    "ResponseCreateResponse",
+    "ResponseImageGenerationCall",
+    "ResponseImageTool",
+    "ResponseInputContent",
+    "ResponseInputItem",
+    "ResponseOutputContent",
+    "ResponseOutputMessage",
+    "ResponseReasoning",
+    "ResponseReasoningContentPart",
+    "ResponseSummaryPart",
+    "ResponseToolCall",
+    "ResponseToolChoice",
+    "ResponseUsage",
+    "Tool",
+    "ToolCall",
+    "ToolChoiceFunction",
+    "ToolChoiceFunctionDetail",
+    "ToolFunctionDefinition",
+    "Usage",
+]
diff --git a/app/models/models.py b/app/models/models.py
index 64ceaa9..8dbe86c 100644
--- a/app/models/models.py
+++ b/app/models/models.py
@@ -1,7 +1,7 @@
 from __future__ import annotations
 
 from datetime import datetime
-from typing import Any, Dict, List, Literal, Optional, Union
+from typing import Any, Literal
 
 from pydantic import BaseModel, Field, model_validator
 
@@ -10,28 +10,28 @@ class ContentItem(BaseModel):
     """Individual content item (text, image, or file) within a message."""
 
     type: Literal["text", "image_url", "file", "input_audio"]
-    text: Optional[str] = None
-    image_url: Optional[Dict[str, str]] = None
-    input_audio: Optional[Dict[str, Any]] = None
-    file: Optional[Dict[str, str]] = None
-    annotations: List[Dict[str, Any]] = Field(default_factory=list)
+    text: str | None = Field(default=None)
+    image_url: dict[str, Any] | None = Field(default=None)
+    input_audio: dict[str, Any] | None = Field(default=None)
+    file: dict[str, Any] | None = Field(default=None)
+    annotations: list[dict[str, Any]] = Field(default_factory=list)
 
 
 class Message(BaseModel):
     """Message model"""
 
     role: str
-    content: Union[str, List[ContentItem], None] = None
-    name: Optional[str] = None
-    tool_calls: Optional[List["ToolCall"]] = None
-    tool_call_id: Optional[str] = None
-    refusal: Optional[str] = None
-    reasoning_content: Optional[str] = None
-    audio: Optional[Dict[str, Any]] = None
-    annotations: List[Dict[str, Any]] = Field(default_factory=list)
+    content: str | list[ContentItem] | None = Field(default=None)
+    name: str | None = Field(default=None)
+    tool_calls: list[ToolCall] | None = Field(default=None)
+    tool_call_id: str | None = Field(default=None)
+    refusal: str | None = Field(default=None)
+    reasoning_content: str | None = Field(default=None)
+    audio: dict[str, Any] | None = Field(default=None)
+    annotations: list[dict[str, Any]] = Field(default_factory=list)
 
     @model_validator(mode="after")
-    def normalize_role(self) -> "Message":
+    def normalize_role(self) -> Message:
         """Normalize 'developer' role to 'system' for Gemini compatibility."""
         if self.role == "developer":
             self.role = "system"
@@ -44,7 +44,7 @@ class Choice(BaseModel):
     index: int
     message: Message
     finish_reason: str
-    logprobs: Optional[Dict[str, Any]] = None
+    logprobs: dict[str, Any] | None = Field(default=None)
 
 
 class FunctionCall(BaseModel):
@@ -66,8 +66,8 @@ class ToolFunctionDefinition(BaseModel):
     """Function definition for tool."""
 
     name: str
-    description: Optional[str] = None
-    parameters: Optional[Dict[str, Any]] = None
+    description: str | None = Field(default=None)
+    parameters: dict[str, Any] | None = Field(default=None)
 
 
 class Tool(BaseModel):
@@ -96,8 +96,8 @@ class Usage(BaseModel):
     prompt_tokens: int
     completion_tokens: int
     total_tokens: int
-    prompt_tokens_details: Optional[Dict[str, int]] = None
-    completion_tokens_details: Optional[Dict[str, int]] = None
+    prompt_tokens_details: dict[str, int] | None = Field(default=None)
+    completion_tokens_details: dict[str, int] | None = Field(default=None)
 
 
 class ModelData(BaseModel):
@@ -113,17 +113,17 @@ class ChatCompletionRequest(BaseModel):
     """Chat completion request model"""
 
     model: str
-    messages: List[Message]
-    stream: Optional[bool] = False
-    user: Optional[str] = None
-    temperature: Optional[float] = 0.7
-    top_p: Optional[float] = 1.0
-    max_tokens: Optional[int] = None
-    tools: Optional[List["Tool"]] = None
-    tool_choice: Optional[
-        Union[Literal["none"], Literal["auto"], Literal["required"], "ToolChoiceFunction"]
-    ] = None
-    response_format: Optional[Dict[str, Any]] = None
+    messages: list[Message]
+    stream: bool | None = Field(default=False)
+    user: str | None = Field(default=None)
+    temperature: float | None = Field(default=0.7)
+    top_p: float | None = Field(default=1.0)
+    max_tokens: int | None = Field(default=None)
+    tools: list[Tool] | None = Field(default=None)
+    tool_choice: (
+        Literal["none"] | Literal["auto"] | Literal["required"] | ToolChoiceFunction | None
+    ) = Field(default=None)
+    response_format: dict[str, Any] | None = Field(default=None)
 
 
 class ChatCompletionResponse(BaseModel):
@@ -133,7 +133,7 @@ class ChatCompletionResponse(BaseModel):
     object: str = "chat.completion"
     created: int
     model: str
-    choices: List[Choice]
+    choices: list[Choice]
     usage: Usage
 
 
@@ -141,23 +141,23 @@ class ModelListResponse(BaseModel):
     """Model list model"""
 
     object: str = "list"
-    data: List[ModelData]
+    data: list[ModelData]
 
 
 class HealthCheckResponse(BaseModel):
     """Health check response model"""
 
     ok: bool
-    storage: Optional[Dict[str, str | int]] = None
-    clients: Optional[Dict[str, bool]] = None
-    error: Optional[str] = None
+    storage: dict[str, Any] | None = Field(default=None)
+    clients: dict[str, bool] | None = Field(default=None)
+    error: str | None = Field(default=None)
 
 
 class ConversationInStore(BaseModel):
     """Conversation model for storing in the database."""
 
-    created_at: Optional[datetime] = Field(default=None)
-    updated_at: Optional[datetime] = Field(default=None)
+    created_at: datetime | None = Field(default=None)
+    updated_at: datetime | None = Field(default=None)
 
     # Gemini Web API does not support changing models once a conversation is created.
     model: str = Field(..., description="Model used for the conversation")
@@ -171,63 +171,55 @@ class ConversationInStore(BaseModel):
 
 class ResponseInputContent(BaseModel):
     """Content item for Responses API input."""
 
-    type: Literal["input_text", "input_image", "input_file"]
-    text: Optional[str] = None
-    image_url: Optional[str] = None
-    detail: Optional[Literal["auto", "low", "high"]] = None
-    file_url: Optional[str] = None
-    file_data: Optional[str] = None
-    filename: Optional[str] = None
-    annotations: List[Dict[str, Any]] = Field(default_factory=list)
-
-    @model_validator(mode="before")
-    @classmethod
-    def normalize_output_text(cls, data: Any) -> Any:
-        """Allow output_text (from previous turns) to be treated as input_text."""
-        if isinstance(data, dict) and data.get("type") == "output_text":
-            data["type"] = "input_text"
-        return data
+    type: Literal["input_text", "output_text", "reasoning_text", "input_image", "input_file"]
+    text: str | None = Field(default=None)
+    image_url: str | None = Field(default=None)
+    detail: Literal["auto", "low", "high"] | None = Field(default=None)
+    file_url: str | None = Field(default=None)
+    file_data: str | None = Field(default=None)
+    filename: str | None = Field(default=None)
+    annotations: list[dict[str, Any]] = Field(default_factory=list)
 
 
 class ResponseInputItem(BaseModel):
     """Single input item for Responses API."""
 
-    type: Optional[Literal["message"]] = "message"
+    type: Literal["message"] | None = Field(default="message")
     role: Literal["user", "assistant", "system", "developer"]
-    content: Union[str, List[ResponseInputContent]]
+    content: str | list[ResponseInputContent]
 
 
 class ResponseToolChoice(BaseModel):
     """Tool choice enforcing a specific tool in Responses API."""
 
     type: Literal["function", "image_generation"]
-    function: Optional[ToolChoiceFunctionDetail] = None
+    function: ToolChoiceFunctionDetail | None = Field(default=None)
 
 
 class ResponseImageTool(BaseModel):
     """Image generation tool specification for Responses API."""
 
     type: Literal["image_generation"]
-    model: Optional[str] = None
-    output_format: Optional[str] = None
+    model: str | None = Field(default=None)
+    output_format: str | None = Field(default=None)
 
 
 class ResponseCreateRequest(BaseModel):
     """Responses API request payload."""
 
     model: str
-    input: Union[str, List[ResponseInputItem]]
-    instructions: Optional[Union[str, List[ResponseInputItem]]] = None
-    temperature: Optional[float] = 0.7
-    top_p: Optional[float] = 1.0
-    max_output_tokens: Optional[int] = None
-    stream: Optional[bool] = False
-    tool_choice: Optional[Union[str, ResponseToolChoice]] = None
-    tools: Optional[List[Union[Tool, ResponseImageTool]]] = None
-    store: Optional[bool] = None
-    user: Optional[str] = None
-    response_format: Optional[Dict[str, Any]] = None
-    metadata: Optional[Dict[str, Any]] = None
+    input: str | list[ResponseInputItem]
+    instructions: str | list[ResponseInputItem] | None = Field(default=None)
+    temperature: float | None = Field(default=0.7)
+    top_p: float | None = Field(default=1.0)
+    max_output_tokens: int | None = Field(default=None)
+    stream: bool | None = Field(default=False)
+    tool_choice: str | ResponseToolChoice | None = Field(default=None)
+    tools: list[Tool | ResponseImageTool] | None = Field(default=None)
+    store: bool | None = Field(default=None)
+    user: str | None = Field(default=None)
+    response_format: dict[str, Any] | None = Field(default=None)
+    metadata: dict[str, Any] | None = Field(default=None)
 
 
 class ResponseUsage(BaseModel):
@@ -236,14 +228,16 @@ class ResponseUsage(BaseModel):
 
     input_tokens: int
     output_tokens: int
     total_tokens: int
+    input_tokens_details: dict[str, int] | None = Field(default=None)
+    output_tokens_details: dict[str, int] | None = Field(default=None)
 
 
 class ResponseOutputContent(BaseModel):
     """Content item for Responses API output."""
 
     type: Literal["output_text"]
-    text: Optional[str] = ""
-    annotations: List[Dict[str, Any]] = Field(default_factory=list)
+    text: str | None = Field(default="")
+    annotations: list[dict[str, Any]] = Field(default_factory=list)
 
 
 class ResponseOutputMessage(BaseModel):
@@ -252,27 +246,53 @@ class ResponseOutputMessage(BaseModel):
 
     id: str
     type: Literal["message"]
     role: Literal["assistant"]
-    content: List[ResponseOutputContent]
+    content: list[ResponseOutputContent]
+
+
+class ResponseSummaryPart(BaseModel):
+    """Summary part for reasoning."""
+
+    type: Literal["summary_text"] = Field(default="summary_text")
+    text: str
+
+
+class ResponseReasoningContentPart(BaseModel):
+    """Content part for reasoning."""
+
+    type: Literal["reasoning_text"] = Field(default="reasoning_text")
+    text: str
+
+
+class ResponseReasoning(BaseModel):
+    """Reasoning item returned by Responses API."""
+
+    id: str
+    type: Literal["reasoning"] = Field(default="reasoning")
+    status: Literal["in_progress", "completed", "incomplete"] = Field(default="completed")
+    summary: list[ResponseSummaryPart] | None = Field(default=None)
+    content: list[ResponseReasoningContentPart] | None = Field(default=None)
 
 
 class ResponseImageGenerationCall(BaseModel):
     """Image generation call record emitted in Responses API."""
 
     id: str
-    type: Literal["image_generation_call"] = "image_generation_call"
-    status: Literal["completed", "in_progress", "generating", "failed"] = "completed"
-    result: Optional[str] = None
-    output_format: Optional[str] = None
-    size: Optional[str] = None
-    revised_prompt: Optional[str] = None
+    type: Literal["image_generation_call"] = Field(default="image_generation_call")
+    status: Literal["completed", "in_progress", "generating", "failed"] = Field(default="completed")
+    result: str | None = Field(default=None)
+    output_format: str | None = Field(default=None)
+    size: str | None = Field(default=None)
+    revised_prompt: str | None = Field(default=None)
 
 
 class ResponseToolCall(BaseModel):
     """Tool call record emitted in Responses API."""
 
     id: str
-    type: Literal["tool_call"] = "tool_call"
-    status: Literal["in_progress", "completed", "failed", "requires_action"] = "completed"
+    type: Literal["tool_call"] = Field(default="tool_call")
+    status: Literal["in_progress", "completed", "failed", "requires_action"] = Field(
+        default="completed"
+    )
     function: FunctionCall
 
 
@@ -280,10 +300,12 @@ class ResponseCreateResponse(BaseModel):
     """Responses API response payload."""
 
     id: str
-    object: Literal["response"] = "response"
+    object: Literal["response"] = Field(default="response")
     created_at: int
     model: str
-    output: List[Union[ResponseOutputMessage, ResponseImageGenerationCall, ResponseToolCall]]
+    output: list[
+        ResponseReasoning | ResponseOutputMessage | ResponseImageGenerationCall | ResponseToolCall
+    ]
     status: Literal[
         "in_progress",
         "completed",
@@ -291,13 +313,13 @@ class ResponseCreateResponse(BaseModel):
         "incomplete",
         "cancelled",
         "requires_action",
-    ] = "completed"
-    tool_choice: Optional[Union[str, ResponseToolChoice]] = None
-    tools: Optional[List[Union[Tool, ResponseImageTool]]] = None
+    ] = Field(default="completed")
+    tool_choice: str | ResponseToolChoice | None = Field(default=None)
+    tools: list[Tool | ResponseImageTool] | None = Field(default=None)
     usage: ResponseUsage
-    error: Optional[Dict[str, Any]] = None
-    metadata: Optional[Dict[str, Any]] = None
-    input: Optional[Union[str, List[ResponseInputItem]]] = None
+    error: dict[str, Any] | None = Field(default=None)
+    metadata: dict[str, Any] | None = Field(default=None)
+    input: str | list[ResponseInputItem] | None = Field(default=None)
 
 
 # Rebuild models with forward references
diff --git a/app/server/chat.py b/app/server/chat.py
index 934091b..089db7c 100644
--- a/app/server/chat.py
+++ b/app/server/chat.py
@@ -3,10 +3,11 @@
 import io
 import reprlib
 import uuid
+from collections.abc import AsyncGenerator
 from dataclasses import dataclass
-from datetime import datetime, timezone
+from datetime import UTC, datetime
 from pathlib import Path
-from typing import Any, AsyncGenerator
+from typing import Any
 
 import orjson
 from fastapi import APIRouter, Depends, HTTPException, Request, status
@@ -17,7 +18,7 @@
 from gemini_webapi.types.image import GeneratedImage, Image
 from loguru import logger
 
-from ..models import (
+from app.models import (
     ChatCompletionRequest,
     ContentItem,
     ConversationInStore,
@@ -32,17 +33,25 @@
     ResponseInputItem,
     ResponseOutputContent,
     ResponseOutputMessage,
+    ResponseReasoning,
+    ResponseReasoningContentPart,
     ResponseToolCall,
     ResponseToolChoice,
     ResponseUsage,
     Tool,
     ToolChoiceFunction,
 )
-from ..services import GeminiClientPool, GeminiClientWrapper, LMDBConversationStore
-from ..utils import g_config
-from ..utils.helper import (
-    TOOL_HINT_LINE_END,
-    TOOL_HINT_LINE_START,
+from app.server.middleware import (
+    get_image_store_dir,
+    get_image_token,
+    get_temp_dir,
+    verify_api_key,
+)
+from app.services import GeminiClientPool, GeminiClientWrapper, LMDBConversationStore
+from app.utils import g_config
+from app.utils.helper import (
+    STREAM_MASTER_RE,
+    STREAM_TAIL_RE,
     TOOL_HINT_STRIPPED,
     TOOL_WRAP_HINT,
     detect_image_extension,
@@ -53,7 +62,6 @@
     strip_system_hints,
     text_from_message,
 )
-from .middleware import get_image_store_dir, get_image_token, get_temp_dir, verify_api_key
 
 MAX_CHARS_PER_REQUEST = int(g_config.gemini.max_chars_per_request * 0.9)
 METADATA_TTL_MINUTES = 15
@@ -98,11 +106,7 @@ async def _image_to_base64(
 
     if not suffix:
         detected_ext = detect_image_extension(data)
-        if detected_ext:
-            suffix = detected_ext
-        else:
-            # Fallback if detection fails
-            suffix = ".png" if isinstance(image, GeneratedImage) else ".jpg"
+        suffix = detected_ext or (".png" if isinstance(image, GeneratedImage) else ".jpg")
 
     random_name = f"img_{uuid.uuid4().hex}{suffix}"
     new_path = temp_dir / random_name
@@ -118,8 +122,9 @@ def _calculate_usage(
     messages: list[Message],
     assistant_text: str | None,
     tool_calls: list[Any] | None,
-) -> tuple[int, int, int]:
-    """Calculate prompt, completion and total tokens consistently."""
+    thoughts: str | None = None,
+) -> tuple[int, int, int, int]:
+    """Calculate prompt, completion, total and reasoning tokens consistently."""
     prompt_tokens = sum(estimate_tokens(text_from_message(msg)) for msg in messages)
     tool_args_text = ""
     if tool_calls:
@@ -136,7 +141,15 @@ def _calculate_usage(
     )
     completion_tokens = estimate_tokens(completion_basis)
 
-    return prompt_tokens, completion_tokens, prompt_tokens + completion_tokens
+    reasoning_tokens = estimate_tokens(thoughts) if thoughts else 0
+    total_completion_tokens = completion_tokens + reasoning_tokens
+
+    return (
+        prompt_tokens,
+        total_completion_tokens,
+        prompt_tokens + total_completion_tokens,
+        reasoning_tokens,
+    )
 
 
 def _create_responses_standard_payload(
@@ -149,34 +162,50 @@ def _create_responses_standard_payload(
     usage: ResponseUsage,
     request: ResponseCreateRequest,
     normalized_input: Any,
+    full_thoughts: str | None = None,
 ) -> ResponseCreateResponse:
     """Unified factory for building ResponseCreateResponse objects."""
     message_id = f"msg_{uuid.uuid4().hex}"
-    tool_call_items: list[ResponseToolCall] = []
-    if detected_tool_calls:
-        tool_call_items = [
-            ResponseToolCall(
-                id=call.id if hasattr(call, "id") else call["id"],
+    reason_id = f"reason_{uuid.uuid4().hex}"
+
+    output_items: list[Any] = []
+    if full_thoughts:
+        output_items.append(
+            ResponseReasoning(
+                id=reason_id,
                 status="completed",
-                function=call.function if hasattr(call, "function") else call["function"],
+                content=[ResponseReasoningContentPart(text=full_thoughts)],
             )
-            for call in detected_tool_calls
-        ]
+        )
+
+    output_items.append(
+        ResponseOutputMessage(
+            id=message_id,
+            type="message",
+            role="assistant",
+            content=response_contents,
+        )
+    )
+
+    if detected_tool_calls:
+        output_items.extend(
+            [
+                ResponseToolCall(
+                    id=call.id if hasattr(call, "id") else call["id"],
+                    status="completed",
+                    function=call.function if hasattr(call, "function") else call["function"],
+                )
+                for call in detected_tool_calls
+            ]
+        )
+
+    output_items.extend(image_call_items)
 
     return ResponseCreateResponse(
         id=response_id,
         created_at=created_time,
         model=model_name,
-        output=[
-            ResponseOutputMessage(
-                id=message_id,
-                type="message",
-                role="assistant",
-                content=response_contents,
-            ),
-            *tool_call_items,
-            *image_call_items,
-        ],
+        output=output_items,
         status="completed",
         usage=usage,
         input=normalized_input or None,
@@ -194,6 +223,7 @@ def _create_chat_completion_standard_payload(
     tool_calls_payload: list[dict] | None,
     finish_reason: str,
     usage: dict,
+    reasoning_content: str | None = None,
 ) -> dict:
     """Unified factory for building Chat Completion response dictionaries."""
     return {
@@ -208,6 +238,7 @@ def _create_chat_completion_standard_payload(
                     "role": "assistant",
                     "content": visible_output or None,
                     "tool_calls": tool_calls_payload or None,
+                    "reasoning_content": reasoning_content or None,
                 },
                 "finish_reason": finish_reason,
             }
@@ -217,40 +248,41 @@ def _create_chat_completion_standard_payload(
 
 
 def _process_llm_output(
-    raw_output_with_think: str,
-    raw_output_clean: str,
+    thoughts: str | None,
+    raw_text: str,
     structured_requirement: StructuredOutputRequirement | None,
-) -> tuple[str, str, list[Any]]:
+) -> tuple[str | None, str, str, list[Any]]:
     """
     Post-process Gemini output to extract tool calls and prepare clean text
    for display and storage.
 
-    Returns: (visible_text, storage_output, tool_calls)
+    Returns: (thoughts, visible_text, storage_output, tool_calls)
     """
-    visible_with_think, tool_calls = extract_tool_calls(raw_output_with_think)
+    if thoughts:
+        thoughts = thoughts.strip()
+
+    visible_output, tool_calls = extract_tool_calls(raw_text)
     if tool_calls:
         logger.debug(f"Detected {len(tool_calls)} tool call(s) in model output.")
 
-    visible_output = visible_with_think.strip()
+    visible_output = visible_output.strip()
 
-    storage_output = remove_tool_call_blocks(raw_output_clean)
+    storage_output = remove_tool_call_blocks(raw_text)
     storage_output = storage_output.strip()
 
-    if structured_requirement:
-        cleaned_for_json = LMDBConversationStore.remove_think_tags(visible_output)
-        if cleaned_for_json:
-            try:
-                structured_payload = orjson.loads(cleaned_for_json)
-                canonical_output = orjson.dumps(structured_payload).decode("utf-8")
-                visible_output = canonical_output
-                storage_output = canonical_output
-                logger.debug(
-                    f"Structured response fulfilled (schema={structured_requirement.schema_name})."
-                )
-            except orjson.JSONDecodeError:
-                logger.warning(
-                    f"Failed to decode JSON for structured response (schema={structured_requirement.schema_name})."
-                )
+    if structured_requirement and visible_output:
+        try:
+            structured_payload = orjson.loads(visible_output)
+            canonical_output = orjson.dumps(structured_payload).decode("utf-8")
+            visible_output = canonical_output
+            storage_output = canonical_output
+            logger.debug(
+                f"Structured response fulfilled (schema={structured_requirement.schema_name})."
+            )
+        except orjson.JSONDecodeError:
+            logger.warning(
+                f"Failed to decode JSON for structured response (schema={structured_requirement.schema_name})."
+            )
 
-    return visible_output, storage_output, tool_calls
+    return thoughts, visible_output, storage_output, tool_calls
 
 
 def _persist_conversation(
@@ -261,6 +293,7 @@ def _persist_conversation(
     messages: list[Message],
     storage_output: str | None,
     tool_calls: list[Any] | None,
+    thoughts: str | None = None,
 ) -> str | None:
     """Unified logic to save conversation history to LMDB."""
     try:
@@ -268,6 +301,7 @@ def _persist_conversation(
             role="assistant",
             content=storage_output or None,
             tool_calls=tool_calls or None,
+            reasoning_content=thoughts or None,
         )
         full_history = [*messages, current_assistant_message]
         cleaned_history = db.sanitize_messages(full_history)
@@ -513,14 +547,22 @@ def _response_items_to_messages(
             messages.append(Message(role=role, content=content))
         else:
             converted: list[ContentItem] = []
+            reasoning_parts: list[str] = []
             for part in content:
-                if part.type == "input_text":
+                if part.type in ("input_text", "output_text"):
                     text_value = part.text or ""
                     normalized_contents.append(
-                        ResponseInputContent(type="input_text", text=text_value)
+                        ResponseInputContent(type=part.type, text=text_value)
                     )
                     if text_value:
                         converted.append(ContentItem(type="text", text=text_value))
+                elif part.type == "reasoning_text":
+                    text_value = part.text or ""
+                    normalized_contents.append(
+                        ResponseInputContent(type="reasoning_text", text=text_value)
+                    )
+                    if text_value:
+                        reasoning_parts.append(text_value)
                 elif part.type == "input_image":
                     image_url = part.image_url
                     if image_url:
@@ -581,11 +623,16 @@ def _instructions_to_messages(
             instruction_messages.append(Message(role=role, content=content))
         else:
             converted: list[ContentItem] = []
+            reasoning_parts: list[str] = []
             for part in content:
-                if part.type == "input_text":
+                if part.type in ("input_text", "output_text"):
                     text_value = part.text or ""
                     if text_value:
                         converted.append(ContentItem(type="text", text=text_value))
+                elif part.type == "reasoning_text":
+                    text_value = part.text or ""
+                    if text_value:
+                        reasoning_parts.append(text_value)
                 elif part.type == "input_image":
                     image_url = part.image_url
                     if image_url:
@@ -607,7 +654,13 @@ def _instructions_to_messages(
                     file_info["url"] = part.file_url
                 if file_info:
                     converted.append(ContentItem(type="file", file=file_info))
-            instruction_messages.append(Message(role=role, content=converted or None))
+            instruction_messages.append(
+                Message(
+                    role=role,
+                    content=converted or None,
+                    reasoning_content="\n".join(reasoning_parts) if reasoning_parts else None,
+                )
+            )
 
     return instruction_messages
 
@@ -628,7 +681,7 @@ def _get_model_by_name(name: str) -> Model:
 
 def _get_available_models() -> list[ModelData]:
     """Return a list of available models based on configuration strategy."""
-    now = int(datetime.now(tz=timezone.utc).timestamp())
+    now = int(datetime.now(tz=UTC).timestamp())
     strategy = g_config.gemini.model_strategy
     models_data = []
 
@@ -712,7 +765,7 @@ async def _send_with_split(
     text: str,
     files: list[Path | str | io.BytesIO] | None = None,
     stream: bool = False,
-) -> AsyncGenerator[ModelOutput, None] | ModelOutput:
+) -> AsyncGenerator[ModelOutput] | ModelOutput:
     """Send text to Gemini, splitting or converting to attachment if too long."""
     if len(text) <= MAX_CHARS_PER_REQUEST:
         try:
@@ -749,277 +802,86 @@ async def _send_with_split(
 
 class StreamingOutputFilter:
     """
     Filter to suppress technical protocol markers, tool calls, and system hints from the stream.
-    Uses a state machine to handle fragmentation where markers are split across multiple chunks.
+    Uses a stack-based state machine to handle nested fragmented markers.
     """
 
     def __init__(self):
         self.buffer = ""
-        self.state = "NORMAL"
+        self.stack = ["NORMAL"]
         self.current_role = ""
-        self.block_buffer = ""
-
-        self.STATE_MARKERS = {
-            "TOOL": {
-                "starts": ["[ToolCalls]", "\\[ToolCalls\\]"],
-                "ends": ["[/ToolCalls]", "\\[\\/ToolCalls\\]"],
-            },
-            "ORPHAN": {
-                "starts": ["[Call:", "\\[Call\\:"],
-                "ends": ["[/Call]", "\\[\\/Call\\]"],
-            },
-            "RESP": {
-                "starts": ["[ToolResults]", "\\[ToolResults\\]"],
-                "ends": ["[/ToolResults]", "\\[\\/ToolResults\\]"],
-            },
-            "ARG": {
-                "starts": ["[CallParameter:", "\\[CallParameter\\:"],
-                "ends": ["[/CallParameter]", "\\[\\/CallParameter\\]"],
-            },
-            "RESULT": {
-                "starts": ["[ToolResult]", "\\[ToolResult\\]"],
-                "ends": ["[/ToolResult]", "\\[\\/ToolResult\\]"],
-            },
-            "ITEM": {
-                "starts": ["[Result:", "\\[Result\\:"],
-                "ends": ["[/Result]", "\\[\\/Result\\]"],
-            },
-            "TAG": {
-                "starts": ["<|im_start|>", "\\<\\|im\\_start\\|\\>"],
-                "ends": ["<|im_end|>", "\\<\\|im\\_end\\|\\>"],
-            },
-        }
-
-        hint_start = f"\n{TOOL_HINT_LINE_START}" if TOOL_HINT_LINE_START else ""
-        if hint_start:
-            self.STATE_MARKERS["HINT"] = {
-                "starts": [hint_start],
-                "ends": [TOOL_HINT_LINE_END],
-            }
+    @property
+    def state(self):
+        return self.stack[-1]
 
-        self.ORPHAN_ENDS = [
-            "<|im_end|>",
-            "\\<\\|im\\_end\\|\\>",
-            "[/Call]",
-            "\\[\\/Call\\]",
-            "[/ToolCalls]",
-            "\\[\\/ToolCalls\\]",
-            "[/CallParameter]",
-            "\\[\\/CallParameter\\]",
-            "[/ToolResult]",
-            "\\[\\/ToolResult\\]",
-            "[/ToolResults]",
-            "\\[\\/ToolResults\\]",
-            "[/Result]",
-            "\\[\\/Result\\]",
-        ]
-
-        self.WATCH_MARKERS = []
-        for cfg in self.STATE_MARKERS.values():
-            self.WATCH_MARKERS.extend(cfg["starts"])
-            self.WATCH_MARKERS.extend(cfg.get("ends", []))
-        self.WATCH_MARKERS.extend(self.ORPHAN_ENDS)
+    def _is_outputting(self) -> bool:
+        """Determines if the current state allows yielding text to the stream."""
+        return self.state == "NORMAL" or (self.state == "IN_BLOCK" and self.current_role != "tool")
 
     def process(self, chunk: str) -> str:
         self.buffer += chunk
         output = []
 
         while self.buffer:
-            buf_low = self.buffer.lower()
-
-            if self.state == "NORMAL":
-                indices = []
-                for m_type, cfg in self.STATE_MARKERS.items():
-                    for p in cfg["starts"]:
-                        idx = buf_low.find(p.lower())
-                        if idx != -1:
-                            indices.append((idx, m_type, len(p)))
-
-                for p in self.ORPHAN_ENDS:
-                    idx = buf_low.find(p.lower())
-                    if idx != -1:
-                        indices.append((idx, "SKIP", len(p)))
-
-                if not indices:
-                    keep_len = 0
-                    for marker in self.WATCH_MARKERS:
-                        m_low = marker.lower()
-                        for i in range(len(m_low) - 1, 0, -1):
-                            if buf_low.endswith(m_low[:i]):
-                                keep_len = max(keep_len, i)
-                                break
-                    yield_len = len(self.buffer) - keep_len
-                    if yield_len > 0:
-                        output.append(self.buffer[:yield_len])
-                        self.buffer = self.buffer[yield_len:]
-                    break
-
-                indices.sort()
-                idx, m_type, m_len = indices[0]
-                output.append(self.buffer[:idx])
-                self.buffer = self.buffer[idx:]
-
-                if m_type == "SKIP":
-                    self.buffer = self.buffer[m_len:]
+            if self.state == "IN_TAG_HEADER":
+                nl_idx = self.buffer.find("\n")
+                if nl_idx != -1:
+                    self.current_role = self.buffer[:nl_idx].strip().lower()
+                    self.buffer = self.buffer[nl_idx + 1 :]
+                    self.stack[-1] = "IN_BLOCK"
                     continue
-
-                self.state = f"IN_{m_type}"
-                if m_type in ("TOOL", "ORPHAN"):
-                    self.block_buffer = ""
-
-                self.buffer = self.buffer[m_len:]
-
-            elif self.state == "IN_HINT":
-                cfg = self.STATE_MARKERS["HINT"]
-                found_idx, found_len = -1, 0
-                for p in cfg["ends"]:
-                    idx = buf_low.find(p.lower())
-                    if idx != -1 and (found_idx == -1 or idx < found_idx):
-                        found_idx, found_len = idx, len(p)
-
-                if found_idx != -1:
-                    self.buffer = self.buffer[found_idx + found_len :]
-                    self.state = "NORMAL"
+                else:
-                    max_end_len = max(len(p) for p in cfg["ends"])
-                    if len(self.buffer) > max_end_len:
-                        self.buffer = self.buffer[-max_end_len:]
                     break
 
-            elif self.state == "IN_ARG":
-                cfg = self.STATE_MARKERS["ARG"]
-                found_idx, found_len = -1, 0
-                for p in cfg["ends"]:
-                    idx = buf_low.find(p.lower())
-                    if idx != -1 and (found_idx == -1 or idx
< found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - self.buffer = self.buffer[found_idx + found_len :] - self.state = "NORMAL" - else: - max_end_len = max(len(p) for p in cfg["ends"]) - if len(self.buffer) > max_end_len: - self.buffer = self.buffer[-max_end_len:] - break + match = STREAM_MASTER_RE.search(self.buffer) + if not match: + tail_match = STREAM_TAIL_RE.search(self.buffer) + keep_len = len(tail_match.group(0)) if tail_match else 0 + yield_len = len(self.buffer) - keep_len + if yield_len > 0: + if self._is_outputting(): + output.append(self.buffer[:yield_len]) + self.buffer = self.buffer[yield_len:] + break - elif self.state == "IN_RESULT": - cfg = self.STATE_MARKERS["RESULT"] - found_idx, found_len = -1, 0 - for p in cfg["ends"]: - idx = buf_low.find(p.lower()) - if idx != -1 and (found_idx == -1 or idx < found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - self.buffer = self.buffer[found_idx + found_len :] - self.state = "NORMAL" - else: - max_end_len = max(len(p) for p in cfg["ends"]) - if len(self.buffer) > max_end_len: - self.buffer = self.buffer[-max_end_len:] - break + start, end = match.span() + matched_group = match.lastgroup + pre_text = self.buffer[:start] - elif self.state == "IN_RESP": - cfg = self.STATE_MARKERS["RESP"] - found_idx, found_len = -1, 0 - for p in cfg["ends"]: - idx = buf_low.find(p.lower()) - if idx != -1 and (found_idx == -1 or idx < found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - self.buffer = self.buffer[found_idx + found_len :] - self.state = "NORMAL" - else: - break + if self._is_outputting(): + output.append(pre_text) - elif self.state == "IN_TOOL": - cfg = self.STATE_MARKERS["TOOL"] - found_idx, found_len = -1, 0 - for p in cfg["ends"]: - idx = buf_low.find(p.lower()) - if idx != -1 and (found_idx == -1 or idx < found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - self.block_buffer += self.buffer[:found_idx] - self.buffer = 
self.buffer[found_idx + found_len :] - self.state = "NORMAL" + if matched_group.endswith("_START"): + m_type = matched_group.split("_")[0] + if m_type == "TAG": + self.stack.append("IN_TAG_HEADER") else: - max_end_len = max(len(p) for p in cfg["ends"]) - if len(self.buffer) > max_end_len: - self.block_buffer += self.buffer[:-max_end_len] - self.buffer = self.buffer[-max_end_len:] - break - - elif self.state == "IN_ORPHAN": - cfg = self.STATE_MARKERS["ORPHAN"] - found_idx, found_len = -1, 0 - for p in cfg["ends"]: - idx = buf_low.find(p.lower()) - if idx != -1 and (found_idx == -1 or idx < found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - self.block_buffer += self.buffer[:found_idx] - self.buffer = self.buffer[found_idx + found_len :] - self.state = "NORMAL" + self.stack.append(f"IN_{m_type}") + elif matched_group in ("PROTOCOL_EXIT", "TAG_EXIT", "HINT_EXIT"): + if len(self.stack) > 1: + self.stack.pop() else: - max_end_len = max(len(p) for p in cfg["ends"]) - if len(self.buffer) > max_end_len: - self.block_buffer += self.buffer[:-max_end_len] - self.buffer = self.buffer[-max_end_len:] - break + self.stack = ["NORMAL"] - elif self.state == "IN_TAG": - nl_idx = self.buffer.find("\n") - if nl_idx != -1: - self.current_role = self.buffer[:nl_idx].strip().lower() - self.buffer = self.buffer[nl_idx + 1 :] - self.state = "IN_BLOCK" - else: - break - - elif self.state == "IN_BLOCK": - cfg = self.STATE_MARKERS["TAG"] - found_idx, found_len = -1, 0 - for p in cfg["ends"]: - idx = buf_low.find(p.lower()) - if idx != -1 and (found_idx == -1 or idx < found_idx): - found_idx, found_len = idx, len(p) - - if found_idx != -1: - content = self.buffer[:found_idx] - if self.current_role != "tool": - output.append(content) - self.buffer = self.buffer[found_idx + found_len :] - self.state = "NORMAL" + if self.state == "NORMAL": self.current_role = "" - else: - max_end_len = max(len(p) for p in cfg["ends"]) - if self.current_role != "tool": - if len(self.buffer) 
> max_end_len: - output.append(self.buffer[:-max_end_len]) - self.buffer = self.buffer[-max_end_len:] - break - else: - if len(self.buffer) > max_end_len: - self.buffer = self.buffer[-max_end_len:] - break + + self.buffer = self.buffer[end:] return "".join(output) def flush(self) -> str: """Release remaining buffer content and perform final cleanup at stream end.""" res = "" - if self.state in ("IN_TOOL", "IN_ORPHAN", "IN_RESP", "IN_HINT", "IN_ARG", "IN_RESULT"): - res = "" - elif self.state == "IN_BLOCK" and self.current_role != "tool": - res = self.buffer - elif self.state == "NORMAL": + if self._is_outputting(): res = self.buffer + tail_match = STREAM_TAIL_RE.search(res) + if tail_match: + res = res[: -len(tail_match.group(0))] self.buffer = "" - self.state = "NORMAL" + self.stack = ["NORMAL"] + self.current_role = "" return strip_system_hints(res) @@ -1027,7 +889,7 @@ def flush(self) -> str: def _create_real_streaming_response( - generator: AsyncGenerator[ModelOutput, None], + generator: AsyncGenerator[ModelOutput], completion_id: str, created_time: int, model_name: str, @@ -1047,7 +909,6 @@ def _create_real_streaming_response( async def generate_stream(): full_thoughts, full_text = "", "" has_started = False - last_chunk_was_thought = False all_outputs: list[ModelOutput] = [] suppressor = StreamingOutputFilter() try: @@ -1067,8 +928,6 @@ async def generate_stream(): has_started = True if t_delta := chunk.thoughts_delta: - if not last_chunk_was_thought and not full_thoughts: - yield f"data: {orjson.dumps({'id': completion_id, 'object': 'chat.completion.chunk', 'created': created_time, 'model': model_name, 'choices': [{'index': 0, 'delta': {'content': ''}, 'finish_reason': None}]}).decode('utf-8')}\n\n" full_thoughts += t_delta data = { "id": completion_id, @@ -1076,16 +935,16 @@ async def generate_stream(): "created": created_time, "model": model_name, "choices": [ - {"index": 0, "delta": {"content": t_delta}, "finish_reason": None} + { + "index": 0, + "delta": 
{"reasoning_content": t_delta}, + "finish_reason": None, + } ], } yield f"data: {orjson.dumps(data).decode('utf-8')}\n\n" - last_chunk_was_thought = True if text_delta := chunk.text_delta: - if last_chunk_was_thought: - yield f"data: {orjson.dumps({'id': completion_id, 'object': 'chat.completion.chunk', 'created': created_time, 'model': model_name, 'choices': [{'index': 0, 'delta': {'content': '\n'}, 'finish_reason': None}]}).decode('utf-8')}\n\n" - last_chunk_was_thought = False full_text += text_delta if visible_delta := suppressor.process(text_delta): data = { @@ -1114,9 +973,6 @@ async def generate_stream(): if final_chunk.thoughts: full_thoughts = final_chunk.thoughts - if last_chunk_was_thought: - yield f"data: {orjson.dumps({'id': completion_id, 'object': 'chat.completion.chunk', 'created': created_time, 'model': model_name, 'choices': [{'index': 0, 'delta': {'content': '\n'}, 'finish_reason': None}]}).decode('utf-8')}\n\n" - if remaining_text := suppressor.flush(): data = { "id": completion_id, @@ -1129,10 +985,8 @@ async def generate_stream(): } yield f"data: {orjson.dumps(data).decode('utf-8')}\n\n" - raw_output_with_think = f"{full_thoughts}\n" if full_thoughts else "" - raw_output_with_think += full_text - assistant_text, storage_output, tool_calls = _process_llm_output( - raw_output_with_think, full_text, structured_requirement + _thoughts, assistant_text, storage_output, tool_calls = _process_llm_output( + full_thoughts, full_text, structured_requirement ) images = [] @@ -1193,8 +1047,15 @@ async def generate_stream(): } yield f"data: {orjson.dumps(data).decode('utf-8')}\n\n" - p_tok, c_tok, t_tok = _calculate_usage(messages, assistant_text, tool_calls) - usage = {"prompt_tokens": p_tok, "completion_tokens": c_tok, "total_tokens": t_tok} + p_tok, c_tok, t_tok, r_tok = _calculate_usage( + messages, assistant_text, tool_calls, full_thoughts + ) + usage = { + "prompt_tokens": p_tok, + "completion_tokens": c_tok, + "total_tokens": t_tok, + 
"completion_tokens_details": {"reasoning_tokens": r_tok}, + } data = { "id": completion_id, "object": "chat.completion.chunk", @@ -1210,9 +1071,10 @@ async def generate_stream(): model.model_name, client_wrapper.id, session.metadata, - messages, # This should be the prepared messages + messages, storage_output, tool_calls, + full_thoughts, ) yield f"data: {orjson.dumps(data).decode('utf-8')}\n\n" yield "data: [DONE]\n\n" @@ -1221,7 +1083,7 @@ async def generate_stream(): def _create_responses_real_streaming_response( - generator: AsyncGenerator[ModelOutput, None], + generator: AsyncGenerator[ModelOutput], response_id: str, created_time: int, model_name: str, @@ -1248,11 +1110,15 @@ def _create_responses_real_streaming_response( async def generate_stream(): yield f"data: {orjson.dumps({**base_event, 'type': 'response.created', 'response': {'id': response_id, 'object': 'response', 'created_at': created_time, 'model': model_name, 'status': 'in_progress', 'metadata': request.metadata, 'input': None, 'tools': request.tools, 'tool_choice': request.tool_choice}}).decode('utf-8')}\n\n" - message_id = f"msg_{uuid.uuid4().hex}" - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': 0, 'item': {'id': message_id, 'type': 'message', 'role': 'assistant', 'content': []}}).decode('utf-8')}\n\n" full_thoughts, full_text = "", "" + thought_item_id = f"reason_{uuid.uuid4().hex}" + message_item_id = f"msg_{uuid.uuid4().hex}" + thought_item_added = False + message_item_added = False last_chunk_was_thought = False + current_idx = 0 + all_outputs: list[ModelOutput] = [] suppressor = StreamingOutputFilter() @@ -1260,18 +1126,31 @@ async def generate_stream(): async for chunk in generator: all_outputs.append(chunk) if t_delta := chunk.thoughts_delta: - if not last_chunk_was_thought and not full_thoughts: - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': ''}).decode('utf-8')}\n\n" + if 
not thought_item_added: + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': {'id': thought_item_id, 'type': 'reasoning', 'status': 'in_progress', 'content': []}}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.content_part.added', 'output_index': current_idx, 'part_index': 0, 'part': {'type': 'reasoning_text', 'text': ''}}).decode('utf-8')}\n\n" + thought_item_added = True + full_thoughts += t_delta - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': t_delta}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': current_idx, 'part_index': 0, 'delta': t_delta}).decode('utf-8')}\n\n" last_chunk_was_thought = True + if text_delta := chunk.text_delta: if last_chunk_was_thought: - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': '\n'}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.done', 'output_index': current_idx, 'part_index': 0}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.content_part.done', 'output_index': current_idx, 'part_index': 0}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': {'id': thought_item_id, 'type': 'reasoning', 'status': 'completed', 'content': [{'type': 'reasoning_text', 'text': full_thoughts}]}}).decode('utf-8')}\n\n" + current_idx += 1 last_chunk_was_thought = False + + if not message_item_added: + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': {'id': message_item_id, 'type': 'message', 'role': 'assistant', 'content': []}}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 
'response.content_part.added', 'output_index': current_idx, 'part_index': 0, 'part': {'type': 'output_text', 'text': ''}}).decode('utf-8')}\n\n" + message_item_added = True + full_text += text_delta if visible_delta := suppressor.process(text_delta): - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': visible_delta}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': current_idx, 'part_index': 0, 'delta': visible_delta}).decode('utf-8')}\n\n" except Exception as e: logger.exception(f"Error during Responses API streaming: {e}") yield f"data: {orjson.dumps({**base_event, 'type': 'error', 'error': {'message': 'Streaming error.'}}).decode('utf-8')}\n\n" @@ -1285,17 +1164,33 @@ async def generate_stream(): full_thoughts = final_chunk.thoughts if last_chunk_was_thought: - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': '\n'}).decode('utf-8')}\n\n" - if remaining_text := suppressor.flush(): - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': 0, 'delta': remaining_text}).decode('utf-8')}\n\n" - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.done', 'output_index': 0}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.content_part.done', 'output_index': current_idx, 'part_index': 0}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': {'id': thought_item_id, 'type': 'reasoning', 'status': 'completed', 'content': [{'type': 'reasoning_text', 'text': full_thoughts}]}}).decode('utf-8')}\n\n" + current_idx += 1 - raw_output_with_think = f"{full_thoughts}\n" if full_thoughts else "" - raw_output_with_think += full_text - assistant_text, storage_output, detected_tool_calls = _process_llm_output( - 
raw_output_with_think, full_text, structured_requirement + remaining_from_suppressor = suppressor.flush() + if remaining_from_suppressor: + if not message_item_added: + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': {'id': message_item_id, 'type': 'message', 'role': 'assistant', 'content': []}}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.content_part.added', 'output_index': current_idx, 'part_index': 0, 'part': {'type': 'output_text', 'text': ''}}).decode('utf-8')}\n\n" + message_item_added = True + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.delta', 'output_index': current_idx, 'part_index': 0, 'delta': remaining_from_suppressor}).decode('utf-8')}\n\n" + + # IMPORTANT: Process output now to get the final assistant_text + _thoughts, assistant_text, storage_output, detected_tool_calls = _process_llm_output( + full_thoughts, full_text, structured_requirement ) + response_contents: list[ResponseOutputContent] = [] + if message_item_added: + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_text.done', 'output_index': current_idx, 'part_index': 0}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.content_part.done', 'output_index': current_idx, 'part_index': 0}).decode('utf-8')}\n\n" + + msg_content = [{"type": "output_text", "text": assistant_text}] + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': {'id': message_item_id, 'type': 'message', 'role': 'assistant', 'content': msg_content}}).decode('utf-8')}\n\n" + response_contents.append(ResponseOutputContent(type="output_text", text=assistant_text)) + current_idx += 1 + images = [] seen_urls = set() for out in all_outputs: @@ -1305,7 +1200,7 @@ async def generate_stream(): images.append(img) seen_urls.add(img.url) - response_contents, image_call_items = [], [] 
+ image_call_items: list[ResponseImageGenerationCall] = [] seen_hashes = set() for image in images: try: @@ -1321,25 +1216,20 @@ async def generate_stream(): img_id = fname img_format = "png" if isinstance(image, GeneratedImage) else "jpeg" - image_url = f"![{fname}]({base_url}images/{fname}?token={get_image_token(fname)})" - image_call_items.append( - ResponseImageGenerationCall( - id=img_id, - result=b64, - output_format=img_format, - size=f"{w}x{h}" if w and h else None, - ) + img_item = ResponseImageGenerationCall( + id=img_id, + result=b64, + output_format=img_format, + size=f"{w}x{h}" if w and h else None, ) - response_contents.append(ResponseOutputContent(type="output_text", text=image_url)) + image_call_items.append(img_item) + + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': img_item.model_dump(mode='json')}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': img_item.model_dump(mode='json')}).decode('utf-8')}\n\n" + current_idx += 1 except Exception as exc: logger.warning(f"Failed to process image in stream: {exc}") - if assistant_text: - response_contents.append(ResponseOutputContent(type="output_text", text=assistant_text)) - if not response_contents: - response_contents.append(ResponseOutputContent(type="output_text", text="")) - - # Aggregate images for storage image_markdown = "" for img_call in image_call_items: fname = f"{img_call.id}.{img_call.output_format}" @@ -1349,21 +1239,28 @@ async def generate_stream(): if image_markdown: storage_output += image_markdown - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': 0, 'item': {'id': message_id, 'type': 'message', 'role': 'assistant', 'content': [c.model_dump(mode='json') for c in response_contents]}}).decode('utf-8')}\n\n" - - current_idx = 1 for call in detected_tool_calls: tc_item = 
ResponseToolCall(id=call.id, status="completed", function=call.function) yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': tc_item.model_dump(mode='json')}).decode('utf-8')}\n\n" yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': tc_item.model_dump(mode='json')}).decode('utf-8')}\n\n" current_idx += 1 - for img_call in image_call_items: - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.added', 'output_index': current_idx, 'item': img_call.model_dump(mode='json')}).decode('utf-8')}\n\n" - yield f"data: {orjson.dumps({**base_event, 'type': 'response.output_item.done', 'output_index': current_idx, 'item': img_call.model_dump(mode='json')}).decode('utf-8')}\n\n" - current_idx += 1 - p_tok, c_tok, t_tok = _calculate_usage(messages, assistant_text, detected_tool_calls) - usage = ResponseUsage(input_tokens=p_tok, output_tokens=c_tok, total_tokens=t_tok) + p_tok, c_tok, t_tok, r_tok = _calculate_usage( + messages, assistant_text, detected_tool_calls, full_thoughts + ) + usage = ResponseUsage( + input_tokens=p_tok, + output_tokens=c_tok, + total_tokens=t_tok, + output_tokens_details={"reasoning_tokens": r_tok}, + ) + + # Ensure we have at least one content item if none was created + if not response_contents: + response_contents.append( + ResponseOutputContent(type="output_text", text=assistant_text or "") + ) + payload = _create_responses_standard_payload( response_id, created_time, @@ -1374,6 +1271,7 @@ async def generate_stream(): usage, request, None, + full_thoughts, ) _persist_conversation( db, @@ -1383,8 +1281,10 @@ async def generate_stream(): messages, storage_output, detected_tool_calls, + full_thoughts, ) yield f"data: {orjson.dumps({**base_event, 'type': 'response.completed', 'response': payload.model_dump(mode='json')}).decode('utf-8')}\n\n" + yield f"data: {orjson.dumps({**base_event, 'type': 
'response.done'}).decode('utf-8')}\n\n"
         yield "data: [DONE]\n\n"
 
     return StreamingResponse(generate_stream(), media_type="text/event-stream")
@@ -1455,10 +1355,12 @@ async def create_chat_completion(
         m_input, files = await GeminiClientWrapper.process_conversation(msgs, tmp_dir)
     except Exception as e:
         logger.exception("Error in preparing conversation")
-        raise HTTPException(status_code=status.HTTP_503_SERVICE_UNAVAILABLE, detail=str(e))
+        raise HTTPException(
+            status_code=status.HTTP_503_SERVICE_UNAVAILABLE, detail=str(e)
+        ) from e
 
     completion_id = f"chatcmpl-{uuid.uuid4()}"
-    created_time = int(datetime.now(tz=timezone.utc).timestamp())
+    created_time = int(datetime.now(tz=UTC).timestamp())
 
     try:
         assert session and client
@@ -1470,7 +1372,7 @@ async def create_chat_completion(
         )
     except Exception as e:
         logger.exception("Gemini API error")
-        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(e))
+        raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(e)) from e
 
     if request.stream:
         return _create_real_streaming_response(
@@ -1488,7 +1390,7 @@
     )
 
     try:
-        raw_with_t = GeminiClientWrapper.extract_output(resp_or_stream, include_thoughts=True)
+        thoughts = resp_or_stream.thoughts
         raw_clean = GeminiClientWrapper.extract_output(resp_or_stream, include_thoughts=False)
     except Exception as exc:
         logger.exception("Gemini output parsing failed.")
         raise HTTPException(
             status_code=status.HTTP_502_BAD_GATEWAY, detail="Malformed response."
) from exc - visible_output, storage_output, tool_calls = _process_llm_output( - raw_with_t, raw_clean, structured_requirement + thoughts, visible_output, storage_output, tool_calls = _process_llm_output( + thoughts, raw_clean, structured_requirement ) # Process images for OpenAI non-streaming flow @@ -1525,8 +1427,15 @@ async def create_chat_completion( if tool_calls_payload: logger.debug(f"Detected tool calls: {reprlib.repr(tool_calls_payload)}") - p_tok, c_tok, t_tok = _calculate_usage(request.messages, visible_output, tool_calls) - usage = {"prompt_tokens": p_tok, "completion_tokens": c_tok, "total_tokens": t_tok} + p_tok, c_tok, t_tok, r_tok = _calculate_usage( + request.messages, visible_output, tool_calls, thoughts + ) + usage = { + "prompt_tokens": p_tok, + "completion_tokens": c_tok, + "total_tokens": t_tok, + "completion_tokens_details": {"reasoning_tokens": r_tok}, + } payload = _create_chat_completion_standard_payload( completion_id, created_time, @@ -1535,6 +1444,7 @@ async def create_chat_completion( tool_calls_payload, "tool_calls" if tool_calls else "stop", usage, + thoughts, ) _persist_conversation( db, @@ -1544,6 +1454,7 @@ async def create_chat_completion( msgs, # Use prepared messages 'msgs' storage_output, tool_calls, + thoughts, ) return payload @@ -1620,10 +1531,12 @@ async def create_response( m_input, files = await GeminiClientWrapper.process_conversation(messages, tmp_dir) except Exception as e: logger.exception("Error in preparing conversation") - raise HTTPException(status_code=status.HTTP_503_SERVICE_UNAVAILABLE, detail=str(e)) + raise HTTPException( + status_code=status.HTTP_503_SERVICE_UNAVAILABLE, detail=str(e) + ) from e response_id = f"resp_{uuid.uuid4().hex}" - created_time = int(datetime.now(tz=timezone.utc).timestamp()) + created_time = int(datetime.now(tz=UTC).timestamp()) try: assert session and client @@ -1635,7 +1548,7 @@ async def create_response( ) except Exception as e: logger.exception("Gemini API error") - raise 
HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(e)) + raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(e)) from e if request.stream: return _create_responses_real_streaming_response( @@ -1655,15 +1568,17 @@ async def create_response( ) try: - raw_t = GeminiClientWrapper.extract_output(resp_or_stream, include_thoughts=True) - raw_c = GeminiClientWrapper.extract_output(resp_or_stream, include_thoughts=False) + thoughts = resp_or_stream.thoughts + raw_clean = GeminiClientWrapper.extract_output(resp_or_stream, include_thoughts=False) except Exception as exc: logger.exception("Gemini parsing failed") raise HTTPException( status_code=status.HTTP_502_BAD_GATEWAY, detail="Malformed response." ) from exc - assistant_text, storage_output, tool_calls = _process_llm_output(raw_t, raw_c, struct_req) + thoughts, assistant_text, storage_output, tool_calls = _process_llm_output( + thoughts, raw_clean, struct_req + ) images = resp_or_stream.images or [] if ( request.tool_choice is not None and request.tool_choice.type == "image_generation" @@ -1718,8 +1633,13 @@ async def create_response( if image_markdown: storage_output += image_markdown - p_tok, c_tok, t_tok = _calculate_usage(messages, assistant_text, tool_calls) - usage = ResponseUsage(input_tokens=p_tok, output_tokens=c_tok, total_tokens=t_tok) + p_tok, c_tok, t_tok, r_tok = _calculate_usage(messages, assistant_text, tool_calls, thoughts) + usage = ResponseUsage( + input_tokens=p_tok, + output_tokens=c_tok, + total_tokens=t_tok, + output_tokens_details={"reasoning_tokens": r_tok}, + ) payload = _create_responses_standard_payload( response_id, created_time, @@ -1730,8 +1650,16 @@ async def create_response( usage, request, norm_input, + thoughts, ) _persist_conversation( - db, model.model_name, client.id, session.metadata, messages, storage_output, tool_calls + db, + model.model_name, + client.id, + session.metadata, + messages, + storage_output, + tool_calls, + thoughts, ) return 
payload diff --git a/app/server/health.py b/app/server/health.py index f521db1..444c938 100644 --- a/app/server/health.py +++ b/app/server/health.py @@ -1,8 +1,8 @@ from fastapi import APIRouter from loguru import logger -from ..models import HealthCheckResponse -from ..services import GeminiClientPool, LMDBConversationStore +from app.models import HealthCheckResponse +from app.services import GeminiClientPool, LMDBConversationStore router = APIRouter() diff --git a/app/server/images.py b/app/server/images.py index fe078f7..e1c161c 100644 --- a/app/server/images.py +++ b/app/server/images.py @@ -1,7 +1,7 @@ from fastapi import APIRouter, HTTPException, Query from fastapi.responses import FileResponse -from ..server.middleware import get_image_store_dir, verify_image_token +from app.server.middleware import get_image_store_dir, verify_image_token router = APIRouter() diff --git a/app/server/middleware.py b/app/server/middleware.py index 630e1f5..b5bc55b 100644 --- a/app/server/middleware.py +++ b/app/server/middleware.py @@ -6,11 +6,11 @@ from fastapi import Depends, FastAPI, HTTPException, Request, status from fastapi.middleware.cors import CORSMiddleware -from fastapi.responses import ORJSONResponse +from fastapi.responses import JSONResponse from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer from loguru import logger -from ..utils import g_config +from app.utils import g_config # Persistent directory for storing generated images IMAGE_STORE_DIR = Path(tempfile.gettempdir()) / "ai_generated_images" @@ -70,12 +70,12 @@ def cleanup_expired_images(retention_days: int) -> int: def global_exception_handler(request: Request, exc: Exception): if isinstance(exc, HTTPException): - return ORJSONResponse( + return JSONResponse( status_code=exc.status_code, content={"error": {"message": exc.detail}}, ) - return ORJSONResponse( + return JSONResponse( status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, content={"error": {"message": str(exc)}}, ) diff --git 
a/app/services/client.py b/app/services/client.py index 70dfce9..b8f976b 100644 --- a/app/services/client.py +++ b/app/services/client.py @@ -5,9 +5,9 @@ from gemini_webapi import GeminiClient, ModelOutput from loguru import logger -from ..models import Message -from ..utils import g_config -from ..utils.helper import ( +from app.models import Message +from app.utils import g_config +from app.utils.helper import ( add_tag, normalize_llm_text, save_file_to_tempfile, @@ -33,7 +33,7 @@ async def init( timeout: float = cast(float, _UNSET), watchdog_timeout: float = cast(float, _UNSET), auto_close: bool = False, - close_delay: float = 300, + close_delay: float = cast(float, _UNSET), auto_refresh: bool = cast(bool, _UNSET), refresh_interval: float = cast(float, _UNSET), verbose: bool = cast(bool, _UNSET), @@ -44,6 +44,7 @@ async def init( config = g_config.gemini timeout = cast(float, _resolve(timeout, config.timeout)) watchdog_timeout = cast(float, _resolve(watchdog_timeout, config.watchdog_timeout)) + close_delay = timeout auto_refresh = cast(bool, _resolve(auto_refresh, config.auto_refresh)) refresh_interval = cast(float, _resolve(refresh_interval, config.refresh_interval)) verbose = cast(bool, _resolve(verbose, config.verbose)) @@ -146,9 +147,8 @@ async def process_message( model_input = "\n".join(fragment for fragment in text_fragments if fragment is not None) - if model_input or message.role == "tool": - if tagged: - model_input = add_tag(message.role, model_input) + if (model_input or message.role == "tool") and tagged: + model_input = add_tag(message.role, model_input) return model_input, files diff --git a/app/services/lmdb.py b/app/services/lmdb.py index abf8859..07f0a23 100644 --- a/app/services/lmdb.py +++ b/app/services/lmdb.py @@ -1,25 +1,24 @@ import hashlib -import re import string from contextlib import contextmanager from datetime import datetime, timedelta from pathlib import Path -from typing import Any, Dict, List, Optional +from typing import Any 
import lmdb import orjson from loguru import logger -from ..models import ContentItem, ConversationInStore, Message -from ..utils import g_config -from ..utils.helper import ( +from app.models import ContentItem, ConversationInStore, Message +from app.utils import g_config +from app.utils.helper import ( extract_tool_calls, normalize_llm_text, remove_tool_call_blocks, strip_system_hints, unescape_text, ) -from ..utils.singleton import Singleton +from app.utils.singleton import Singleton _VOLATILE_TRANS_TABLE = str.maketrans("", "", string.whitespace + string.punctuation) @@ -42,7 +41,6 @@ def _normalize_text(text: str | None, fuzzy: bool = False) -> str | None: text = normalize_llm_text(text) text = unescape_text(text) - text = LMDBConversationStore.remove_think_tags(text) text = remove_tool_call_blocks(text) if fuzzy: @@ -60,6 +58,9 @@ def _hash_message(message: Message, fuzzy: bool = False) -> str: "role": message.role, "name": message.name or None, "tool_call_id": message.tool_call_id or None, + "reasoning_content": _normalize_text(message.reasoning_content) + if message.reasoning_content + else None, } content = message.content @@ -125,7 +126,7 @@ def _hash_message(message: Message, fuzzy: bool = False) -> str: def _hash_conversation( - client_id: str, model: str, messages: List[Message], fuzzy: bool = False + client_id: str, model: str, messages: list[Message], fuzzy: bool = False ) -> str: """Generate a hash for a list of messages and model name, tied to a specific client_id.""" combined_hash = hashlib.sha256() @@ -145,9 +146,9 @@ class LMDBConversationStore(metaclass=Singleton): def __init__( self, - db_path: Optional[str] = None, - max_db_size: Optional[int] = None, - retention_days: Optional[int] = None, + db_path: str | None = None, + max_db_size: int | None = None, + retention_days: int | None = None, ): """ Initialize LMDB store. 
@@ -219,7 +220,7 @@ def _get_transaction(self, write: bool = False): raise @staticmethod - def _decode_index_value(data: bytes) -> List[str]: + def _decode_index_value(data: bytes) -> list[str]: """Decode index value, handling both legacy single-string and new list-of-strings formats.""" if not data: return [] @@ -238,7 +239,7 @@ def _decode_index_value(data: bytes) -> List[str]: @staticmethod def _update_index(txn: lmdb.Transaction, prefix: str, hash_val: str, storage_key: str): """Add a storage key to the index for a given hash, avoiding duplicates.""" - idx_key = f"{prefix}{hash_val}".encode("utf-8") + idx_key = f"{prefix}{hash_val}".encode() existing = txn.get(idx_key) keys = LMDBConversationStore._decode_index_value(existing) if existing else [] if storage_key not in keys: @@ -248,7 +249,7 @@ def _update_index(txn: lmdb.Transaction, prefix: str, hash_val: str, storage_key @staticmethod def _remove_from_index(txn: lmdb.Transaction, prefix: str, hash_val: str, storage_key: str): """Remove a specific storage key from the index for a given hash.""" - idx_key = f"{prefix}{hash_val}".encode("utf-8") + idx_key = f"{prefix}{hash_val}".encode() existing = txn.get(idx_key) if not existing: return @@ -263,7 +264,7 @@ def _remove_from_index(txn: lmdb.Transaction, prefix: str, hash_val: str, storag def store( self, conv: ConversationInStore, - custom_key: Optional[str] = None, + custom_key: str | None = None, ) -> str: """ Store a conversation model in LMDB. @@ -312,7 +313,7 @@ def store( ) raise - def get(self, key: str) -> Optional[ConversationInStore]: + def get(self, key: str) -> ConversationInStore | None: """ Retrieve conversation data by key. 
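`_decode_index_value` above is the compatibility shim its docstring names: new index entries hold a JSON list of storage keys, while legacy entries held one raw key string. A standalone sketch of that dual-format decode, with stdlib `json` in place of `orjson`:

```python
import json


def decode_index_value(data: bytes) -> list[str]:
    """Decode an index value that may be a JSON list (new) or a raw key (legacy)."""
    if not data:
        return []
    try:
        decoded = json.loads(data)
        if isinstance(decoded, list):
            return [str(key) for key in decoded]
    except json.JSONDecodeError:
        pass
    # Legacy format: the bytes are a single storage key.
    return [data.decode("utf-8", errors="replace")]
```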
@@ -340,7 +341,7 @@ def get(self, key: str) -> Optional[ConversationInStore]: logger.error(f"Unexpected error retrieving messages with key {key[:12]}: {e}") return None - def find(self, model: str, messages: List[Message]) -> Optional[ConversationInStore]: + def find(self, model: str, messages: list[Message]) -> ConversationInStore | None: """ Search conversation data by message list. Tries raw matching, then sanitized matching, and finally fuzzy matching. @@ -360,12 +361,13 @@ def find(self, model: str, messages: List[Message]) -> Optional[ConversationInSt return conv cleaned_messages = self.sanitize_messages(messages) - if cleaned_messages != messages: - if conv := self._find_by_message_list(model, cleaned_messages): - logger.debug( - f"Session found for '{model}' with {len(cleaned_messages)} cleaned messages." - ) - return conv + if cleaned_messages != messages and ( + conv := self._find_by_message_list(model, cleaned_messages) + ): + logger.debug( + f"Session found for '{model}' with {len(cleaned_messages)} cleaned messages." + ) + return conv if conv := self._find_by_message_list(model, messages, fuzzy=True): logger.debug( @@ -379,9 +381,9 @@ def find(self, model: str, messages: List[Message]) -> Optional[ConversationInSt def _find_by_message_list( self, model: str, - messages: List[Message], + messages: list[Message], fuzzy: bool = False, - ) -> Optional[ConversationInStore]: + ) -> ConversationInStore | None: """ Internal find implementation based on a message list. 
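`find()` above tries three lookups in order: the raw history, the sanitized history (skipped when sanitizing changed nothing), then a fuzzy match. The shape of that cascade, with plain dict lookups standing in for the LMDB index and placeholder `sanitize`/`fuzz` helpers:

```python
def tiered_find(index: dict, messages: tuple, sanitize, fuzz):
    # Tier 1: exact match on the raw history.
    if (hit := index.get(messages)) is not None:
        return hit
    # Tier 2: only worth a lookup if sanitizing actually changed something.
    cleaned = sanitize(messages)
    if cleaned != messages and (hit := index.get(cleaned)) is not None:
        return hit
    # Tier 3: fuzzy key as the last resort.
    return index.get(fuzz(messages))


index = {("Hi",): "conv-raw", ("hi",): "conv-clean"}


def sanitize(msgs):
    return tuple(m.strip() for m in msgs)


def fuzz(msgs):
    return tuple(m.strip().lower() for m in msgs)
```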
@@ -440,7 +442,7 @@ def exists(self, key: str) -> bool: logger.error(f"Failed to check existence of key {key}: {e}") return False - def delete(self, key: str) -> Optional[ConversationInStore]: + def delete(self, key: str) -> ConversationInStore | None: """Delete conversation model by key.""" try: with self._get_transaction(write=True) as txn: @@ -466,7 +468,7 @@ def delete(self, key: str) -> Optional[ConversationInStore]: logger.error(f"Failed to delete messages with key {key[:12]}: {e}") return None - def keys(self, prefix: str = "", limit: Optional[int] = None) -> List[str]: + def keys(self, prefix: str = "", limit: int | None = None) -> list[str]: """List all keys in the store, optionally filtered by prefix.""" keys = [] try: @@ -492,7 +494,7 @@ def keys(self, prefix: str = "", limit: Optional[int] = None) -> List[str]: logger.error(f"Failed to list keys: {e}") return keys - def cleanup_expired(self, retention_days: Optional[int] = None) -> int: + def cleanup_expired(self, retention_days: int | None = None) -> int: """Delete conversations older than the given retention period.""" retention_value = ( self.retention_days if retention_days is None else max(0, int(retention_days)) @@ -561,7 +563,7 @@ def cleanup_expired(self, retention_days: Optional[int] = None) -> int: return removed - def stats(self) -> Dict[str, Any]: + def stats(self) -> dict[str, Any]: """Get database statistics.""" if not self._env: logger.error("LMDB environment not initialized") @@ -583,21 +585,23 @@ def __del__(self): """Cleanup on destruction.""" self.close() - @staticmethod - def remove_think_tags(text: str) -> str: - """Remove all <think>...</think> 
tags and strip whitespace.""" - if not text: - return text - cleaned_content = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL) - return cleaned_content.strip() - @staticmethod def sanitize_messages(messages: list[Message]) -> list[Message]: """Clean all messages of internal markers, hints and normalize tool calls.""" cleaned_messages = [] for msg in messages: + update_data = {} + content_changed = False + + # Normalize reasoning_content + if msg.reasoning_content: + norm_reasoning = _normalize_text(msg.reasoning_content) + if norm_reasoning != msg.reasoning_content: + update_data["reasoning_content"] = norm_reasoning + content_changed = True + if isinstance(msg.content, str): - text = LMDBConversationStore.remove_think_tags(msg.content) + text = msg.content tool_calls = msg.tool_calls if msg.role == "assistant" and not tool_calls: @@ -607,48 +611,41 @@ def sanitize_messages(messages: list[Message]) -> list[Message]: normalized_content = text.strip() or None - if normalized_content != msg.content or tool_calls != msg.tool_calls: - cleaned_msg = msg.model_copy( - update={ - "content": normalized_content, - "tool_calls": tool_calls or None, - } - ) - cleaned_messages.append(cleaned_msg) - else: - cleaned_messages.append(msg) + if normalized_content != msg.content: + update_data["content"] = normalized_content + content_changed = True + if tool_calls != msg.tool_calls: + update_data["tool_calls"] = tool_calls or None + content_changed = True + elif isinstance(msg.content, list): new_content = [] all_extracted_calls = list(msg.tool_calls or []) - changed = False + list_changed = False for item in msg.content: if isinstance(item, ContentItem) and item.type == "text" and item.text: - text = LMDBConversationStore.remove_think_tags(item.text) + text = item.text if msg.role == "assistant" and not msg.tool_calls: text, extracted = extract_tool_calls(text) if extracted: all_extracted_calls.extend(extracted) - changed = True + list_changed = True else: text = strip_system_hints(text)
if text != item.text: - changed = True + list_changed = True item = item.model_copy(update={"text": text.strip() or None}) new_content.append(item) - if changed: - cleaned_messages.append( - msg.model_copy( - update={ - "content": new_content, - "tool_calls": all_extracted_calls or None, - } - ) - ) - else: - cleaned_messages.append(msg) + if list_changed: + update_data["content"] = new_content + update_data["tool_calls"] = all_extracted_calls or None + content_changed = True + + if content_changed: + cleaned_messages.append(msg.model_copy(update=update_data)) else: cleaned_messages.append(msg) return cleaned_messages diff --git a/app/services/pool.py b/app/services/pool.py index decc21a..3c26e3d 100644 --- a/app/services/pool.py +++ b/app/services/pool.py @@ -1,11 +1,13 @@ import asyncio +import inspect from collections import deque -from typing import Dict, List, Optional +from gemini_webapi import GeminiClient from loguru import logger -from ..utils import g_config -from ..utils.singleton import Singleton +from app.utils import g_config +from app.utils.singleton import Singleton + from .client import GeminiClientWrapper @@ -13,21 +15,32 @@ class GeminiClientPool(metaclass=Singleton): """Pool of GeminiClient instances identified by unique ids.""" def __init__(self) -> None: - self._clients: List[GeminiClientWrapper] = [] - self._id_map: Dict[str, GeminiClientWrapper] = {} + self._clients: list[GeminiClientWrapper] = [] + self._id_map: dict[str, GeminiClientWrapper] = {} self._round_robin: deque[GeminiClientWrapper] = deque() - self._restart_locks: Dict[str, asyncio.Lock] = {} + self._restart_locks: dict[str, asyncio.Lock] = {} if len(g_config.gemini.clients) == 0: raise ValueError("No Gemini clients configured") for c in g_config.gemini.clients: - client = GeminiClientWrapper( - client_id=c.id, - secure_1psid=c.secure_1psid, - secure_1psidts=c.secure_1psidts, - proxy=c.proxy, - ) + kwargs = { + "client_id": c.id, + "secure_1psid": c.secure_1psid, + 
"secure_1psidts": c.secure_1psidts, + "proxy": c.proxy, + } + if c.cookies: + sig = inspect.signature(GeminiClient.__init__) + if "cookies" in sig.parameters: + kwargs["cookies"] = c.cookies + else: + logger.debug( + f"Ignoring 'cookies' in config for client {c.id} because " + "the current version of gemini_webapi doesn't support it." + ) + + client = GeminiClientWrapper(**kwargs) self._clients.append(client) self._id_map[c.id] = client self._round_robin.append(client) @@ -55,7 +68,7 @@ async def init(self) -> None: if success_count == 0: raise RuntimeError("Failed to initialize any Gemini clients") - async def acquire(self, client_id: Optional[str] = None) -> GeminiClientWrapper: + async def acquire(self, client_id: str | None = None) -> GeminiClientWrapper: """Return a healthy client by id or using round-robin.""" if not self._round_robin: raise RuntimeError("No Gemini clients configured") @@ -106,10 +119,10 @@ async def _ensure_client_ready(self, client: GeminiClientWrapper) -> bool: return False @property - def clients(self) -> List[GeminiClientWrapper]: + def clients(self) -> list[GeminiClientWrapper]: """Return managed clients.""" return self._clients - def status(self) -> Dict[str, bool]: + def status(self) -> dict[str, bool]: """Return running status for each client.""" return {client.id: client.running() for client in self._clients} diff --git a/app/utils/config.py b/app/utils/config.py index 4c1709f..7371623 100644 --- a/app/utils/config.py +++ b/app/utils/config.py @@ -1,7 +1,7 @@ import ast import os import sys -from typing import Any, Literal, Optional +from typing import Any, Literal import orjson from loguru import logger @@ -28,7 +28,7 @@ class ServerConfig(BaseModel): host: str = Field(default="0.0.0.0", description="Server host address") port: int = Field(default=8000, ge=1, le=65535, description="Server port number") - api_key: Optional[str] = Field( + api_key: str | None = Field( default=None, description="API key for authentication, if set, 
will enable API key validation", ) @@ -41,22 +41,35 @@ class GeminiClientSettings(BaseModel): id: str = Field(..., description="Unique identifier for the client") secure_1psid: str = Field(..., description="Gemini Secure 1PSID") secure_1psidts: str = Field(..., description="Gemini Secure 1PSIDTS") - proxy: Optional[str] = Field(default=None, description="Proxy URL for this Gemini client") + proxy: str | None = Field(default=None, description="Proxy URL for this Gemini client") + cookies: dict[str, str] | None = Field( + default=None, description="Optional custom cookies for this Gemini client" + ) @field_validator("proxy", mode="before") @classmethod - def _blank_proxy_to_none(cls, value: Optional[str]) -> Optional[str]: + def _blank_proxy_to_none(cls, value: str | None) -> str | None: if value is None: return None stripped = value.strip() return stripped or None + @field_validator("cookies", mode="before") + @classmethod + def _parse_cookies(cls, v: Any) -> Any: + if isinstance(v, str) and v.strip().startswith("{"): + try: + return orjson.loads(v) + except orjson.JSONDecodeError: + pass + return v + class GeminiModelConfig(BaseModel): """Configuration for a custom Gemini model.""" - model_name: Optional[str] = Field(default=None, description="Name of the model") - model_header: Optional[dict[str, Optional[str]]] = Field( + model_name: str | None = Field(default=None, description="Name of the model") + model_header: dict[str, str | None] | None = Field( default=None, description="Header for the model" ) @@ -67,7 +80,6 @@ def _parse_json_string(cls, v: Any) -> Any: try: return orjson.loads(v) except orjson.JSONDecodeError: - # Return the original value to let Pydantic handle the error or type mismatch return v return v @@ -83,13 +95,11 @@ class GeminiConfig(BaseModel): default="append", description="Strategy for loading models: 'append' merges custom with default, 'overwrite' uses only custom", ) - timeout: int = Field(default=300, ge=30, description="Init timeout 
in seconds") - watchdog_timeout: int = Field( - default=60, ge=10, le=75, description="Watchdog timeout in seconds (Not more than 75s)" - ) + timeout: int = Field(default=600, ge=30, description="Init timeout in seconds") + watchdog_timeout: int = Field(default=300, ge=30, description="Watchdog timeout in seconds") auto_refresh: bool = Field(True, description="Enable auto-refresh for Gemini cookies") refresh_interval: int = Field( - default=540, + default=600, ge=60, description="Interval in seconds to refresh Gemini cookies (Not less than 60s)", ) diff --git a/app/utils/helper.py b/app/utils/helper.py index 64df4f7..187f310 100644 --- a/app/utils/helper.py +++ b/app/utils/helper.py @@ -10,11 +10,11 @@ from pathlib import Path from urllib.parse import urlparse -import httpx import orjson +from curl_cffi.requests import AsyncSession from loguru import logger -from ..models import FunctionCall, Message, ToolCall +from app.models import FunctionCall, Message, ToolCall VALID_TAG_ROLES = {"user", "assistant", "system", "tool"} TOOL_WRAP_HINT = ( @@ -36,41 +36,88 @@ "CRITICAL: Do NOT mix natural language with protocol tags. Either respond naturally OR provide the protocol block alone. 
There is no middle ground.\n" ) TOOL_BLOCK_RE = re.compile( - r"(?:\[ToolCalls]|\\\[ToolCalls\\])\s*(.*?)\s*(?:\[/ToolCalls]|\\\[\\/ToolCalls\\])", + r"\\?\[\s*ToolCalls\s*\\?]\s*(.*?)\s*\\?\[\s*\\?/\s*ToolCalls\s*\\?]", re.DOTALL | re.IGNORECASE, ) TOOL_CALL_RE = re.compile( - r"(?:\[Call:|\\\[Call\\:)(?P<name>(?:[^]\\]|\\.)+)(?:]|\\])\s*(?P<args>.*?)\s*(?:\[/Call]|\\\[\\/Call\\])", + r"\\?\[\s*Call\s*\\?:\s*(?P<name>(?:[^]\\]|\\.)+)\s*\\?]\s*(?P<args>.*?)\s*\\?\[\s*\\?/\s*Call\s*\\?]", re.DOTALL | re.IGNORECASE, ) RESPONSE_BLOCK_RE = re.compile( - r"(?:\[ToolResults]|\\\[ToolResults\\])\s*(.*?)\s*(?:\[/ToolResults]|\\\[\\/ToolResults\\])", + r"\\?\[\s*ToolResults\s*\\?]\s*(.*?)\s*\\?\[\s*\\?/\s*ToolResults\s*\\?]", re.DOTALL | re.IGNORECASE, ) RESPONSE_ITEM_RE = re.compile( - r"(?:\[Result:|\\\[Result\\:)(?P<id>(?:[^]\\]|\\.)+)(?:]|\\])\s*(?P<content>.*?)\s*(?:\[/Result]|\\\[\\/Result\\])", + r"\\?\[\s*Result\s*\\?:\s*(?P<id>(?:[^]\\]|\\.)+)\s*\\?]\s*(?P<content>.*?)\s*\\?\[\s*\\?/\s*Result\s*\\?]", re.DOTALL | re.IGNORECASE, ) TAGGED_ARG_RE = re.compile( - r"(?:\[CallParameter:|\\\[CallParameter\\:)(?P<name>(?:[^]\\]|\\.)+)(?:]|\\])\s*(?P<value>.*?)\s*(?:\[/CallParameter]|\\\[\\/CallParameter\\])", + r"\\?\[\s*CallParameter\s*\\?:\s*(?P<name>(?:[^]\\]|\\.)+)\s*\\?]\s*(?P<value>.*?)\s*\\?\[\s*\\?/\s*CallParameter\s*\\?]", re.DOTALL | re.IGNORECASE, ) TAGGED_RESULT_RE = re.compile( - r"(?:\[ToolResult]|\\\[ToolResult\\])\s*(.*?)\s*(?:\[/ToolResult]|\\\[\\/ToolResult\\])", + r"\\?\[\s*ToolResult\s*\\?]\s*(.*?)\s*\\?\[\s*\\?/\s*ToolResult\s*\\?]", re.DOTALL | re.IGNORECASE, ) CONTROL_TOKEN_RE = re.compile( - r"<\|im_(?:start|end)\|>|\\<\\\|im\\_(?:start|end)\\\|\\>", re.IGNORECASE + r"\\?\s*<\s*\\?\|\s*im\s*\\?_(?:start|end)\s*\\?\|\s*>\s*", re.IGNORECASE ) CHATML_START_RE = re.compile( - r"(?:<\|im_start\|>|\\<\\\|im\\_start\\\|\\>)\s*(\w+)\s*\n?", re.IGNORECASE + r"\\?\s*<\s*\\?\|\s*im\s*\\?_start\s*\\?\|\s*>\s*(\w+)\s*\n?", re.IGNORECASE ) CHATML_END_RE = 
re.compile(r"\\?\s*<\s*\\?\|\s*im\s*\\?_end\s*\\?\|\s*>\s*", re.IGNORECASE) COMMONMARK_UNESCAPE_RE = re.compile(r"\\([!\"#$%&'()*+,\-./:;<=>?@\[\\\]^_`{|}~])") +PARAM_FENCE_RE = re.compile(r"^(?P<fence>`{3,})") TOOL_HINT_STRIPPED = TOOL_WRAP_HINT.strip() _hint_lines = [line.strip() for line in TOOL_WRAP_HINT.split("\n") if line.strip()] TOOL_HINT_LINE_START = _hint_lines[0] if _hint_lines else "" TOOL_HINT_LINE_END = _hint_lines[-1] if _hint_lines else "" +TOOL_HINT_START_ESC = re.escape(TOOL_HINT_LINE_START) if TOOL_HINT_LINE_START else "" +TOOL_HINT_END_ESC = re.escape(TOOL_HINT_LINE_END) if TOOL_HINT_LINE_END else "" + +HINT_FULL_RE = ( + re.compile(rf"\n?{TOOL_HINT_START_ESC}:?.*?{TOOL_HINT_END_ESC}\n?", re.DOTALL | re.IGNORECASE) + if TOOL_HINT_START_ESC and TOOL_HINT_END_ESC + else None +) +HINT_START_RE = ( + re.compile(rf"\n?{TOOL_HINT_START_ESC}:?\s*", re.IGNORECASE) if TOOL_HINT_START_ESC else None +) +HINT_END_RE = ( + re.compile(rf"\s*{TOOL_HINT_END_ESC}\n?", re.IGNORECASE) if TOOL_HINT_END_ESC else None +) + +# --- Streaming Specific Patterns --- +_START_PATTERNS = { + "TOOL": r"\\?\[\s*ToolCalls\s*\\?\]", + "ORPHAN": r"\\?\[\s*Call\s*\\?:\s*(?:[^\]\\]|\\.)+\s*\\?\]", + "RESP": r"\\?\[\s*ToolResults\s*\\?\]", + "ARG": r"\\?\[\s*CallParameter\s*\\?:\s*(?:[^\]\\]|\\.)+\s*\\?\]", + "RESULT": r"\\?\[\s*ToolResult\s*\\?\]", + "ITEM": r"\\?\[\s*Result\s*\\?:\s*(?:[^\]\\]|\\.)+\s*\\?\]", + "TAG": r"\\?\s*<\s*\\?\|\s*im\s*\\?_start\s*\\?\|\s*>", +} + +_PROTOCOL_ENDS = ( + r"\\?\[\s*\\?/\s*(?:ToolCalls|Call|ToolResults|CallParameter|ToolResult|Result)\s*\\?\]" +) +_TAG_END = r"\\?\s*<\s*\\?\|\s*im\s*\\?_end\s*\\?\|\s*>" + +if TOOL_HINT_START_ESC and TOOL_HINT_END_ESC: + _START_PATTERNS["HINT"] = rf"\n?{TOOL_HINT_START_ESC}:?\s*" + +_master_parts = [f"(?P<{name}_START>{pattern})" for name, pattern in _START_PATTERNS.items()] +_master_parts.append(f"(?P<PROTO_END>{_PROTOCOL_ENDS})") +_master_parts.append(f"(?P<TAG_END>{_TAG_END})") + +if TOOL_HINT_START_ESC and TOOL_HINT_END_ESC: 
_master_parts.append(f"(?P<HINT_END>{TOOL_HINT_END_ESC}\n?)") + +STREAM_MASTER_RE = re.compile("|".join(_master_parts), re.IGNORECASE) +STREAM_TAIL_RE = re.compile( + r"(?:\\|\\?\[[TCRP/]?\s*[^]]*|\\?\s*<\s*\\?\|?\s*i?\s*m?\s*\\?_?(?:s?t?a?r?t?|e?n?d?)\s*\\?\|?\s*>?|)$", + re.IGNORECASE, +) def add_tag(role: str, content: str, unclose: bool = False) -> str: @@ -113,16 +160,16 @@ def _strip_param_fences(s: str) -> str: if not s: return "" - match = re.match(r"^(?P<fence>`{3,})", s) + match = PARAM_FENCE_RE.match(s) if not match or not s.endswith(match.group("fence")): return s + fence = match.group("fence") lines = s.splitlines() - if len(lines) >= 2: + if len(lines) >= 3 and lines[-1].strip() == fence: return "\n".join(lines[1:-1]) - n = len(match.group("fence")) - return s[n:-n].strip() + return s[len(fence) : -len(fence)].strip() def estimate_tokens(text: str | None) -> int: @@ -154,7 +201,7 @@ async def save_url_to_tempfile(url: str, tempdir: Path | None = None) -> Path: data = base64.b64decode(url.split(",")[1]) suffix = mimetypes.guess_extension(mime_type) or f".{mime_type.split('/')[1]}" else: - async with httpx.AsyncClient(follow_redirects=True) as client: + async with AsyncSession(impersonate="chrome", allow_redirects=True) as client: resp = await client.get(url) resp.raise_for_status() data = resp.content @@ -212,14 +259,12 @@ def strip_system_hints(text: str) -> str: cleaned = t_unescaped.replace(TOOL_WRAP_HINT, "").replace(TOOL_HINT_STRIPPED, "") - if TOOL_HINT_LINE_START and TOOL_HINT_LINE_END: - pattern = rf"\n?{re.escape(TOOL_HINT_LINE_START)}.*?{re.escape(TOOL_HINT_LINE_END)}\.?\n?"
- cleaned = re.sub(pattern, "", cleaned, flags=re.DOTALL) - - if TOOL_HINT_LINE_START: - cleaned = re.sub(rf"\n?{re.escape(TOOL_HINT_LINE_START)}:?\s*", "", cleaned) - if TOOL_HINT_LINE_END: - cleaned = re.sub(rf"\s*{re.escape(TOOL_HINT_LINE_END)}\.?\n?", "", cleaned) + if HINT_FULL_RE: + cleaned = HINT_FULL_RE.sub("", cleaned) + if HINT_START_RE: + cleaned = HINT_START_RE.sub("", cleaned) + if HINT_END_RE: + cleaned = HINT_END_RE.sub("", cleaned) cleaned = strip_tagged_blocks(cleaned) cleaned = CONTROL_TOKEN_RE.sub("", cleaned) @@ -272,7 +317,7 @@ def _create_tool_call(name: str, raw_args: str) -> None: arguments = "{}" index = len(tool_calls) - seed = f"{name}:{arguments}:{index}".encode("utf-8") + seed = f"{name}:{arguments}:{index}".encode() call_id = f"call_{hashlib.sha256(seed).hexdigest()[:24]}" tool_calls.append( diff --git a/app/utils/singleton.py b/app/utils/singleton.py index 489e87e..2a258af 100644 --- a/app/utils/singleton.py +++ b/app/utils/singleton.py @@ -1,10 +1,10 @@ -from typing import ClassVar, Dict +from typing import ClassVar class Singleton(type): - _instances: ClassVar[Dict[type, object]] = {} + _instances: ClassVar[dict[type, object]] = {} def __call__(cls, *args, **kwargs): if cls not in cls._instances: - cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs) + cls._instances[cls] = super().__call__(*args, **kwargs) return cls._instances[cls] diff --git a/config/config.yaml b/config/config.yaml index 3d5e6f4..f38ef86 100644 --- a/config/config.yaml +++ b/config/config.yaml @@ -22,10 +22,10 @@ gemini: secure_1psid: "YOUR_SECURE_1PSID_HERE" secure_1psidts: "YOUR_SECURE_1PSIDTS_HERE" proxy: null # Optional proxy URL (null/empty means direct connection) - timeout: 300 # Init timeout in seconds (Not less than 30s) - watchdog_timeout: 60 # Watchdog timeout in seconds (Not more than 75s) + timeout: 600 # Init timeout in seconds (Not less than 30s) + watchdog_timeout: 300 # Watchdog timeout in seconds (Not less than 30s) 
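The `singleton.py` hunk above only modernizes the `super()` spelling; the metaclass is small enough to show in full with a usage sketch (the `Pool` class here is illustrative):

```python
class Singleton(type):
    _instances: dict[type, object] = {}

    def __call__(cls, *args, **kwargs):
        # First call constructs and caches the instance;
        # every later call returns the cached one.
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]


class Pool(metaclass=Singleton):
    def __init__(self) -> None:
        self.clients: list[str] = []


first = Pool()
second = Pool()
```

This is why `GeminiClientPool()` and `LMDBConversationStore()` can be constructed anywhere in the app and always resolve to the same instance.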
auto_refresh: true # Auto-refresh session cookies - refresh_interval: 540 # Refresh interval in seconds (Not less than 60s) + refresh_interval: 600 # Refresh interval in seconds (Not less than 60s) verbose: false # Enable verbose logging for Gemini requests max_chars_per_request: 1000000 # Maximum characters Gemini Web accepts per request. Non-pro users might have a lower limit model_strategy: "append" # Strategy: 'append' (default + custom) or 'overwrite' (custom only) diff --git a/pyproject.toml b/pyproject.toml index 0cae786..8638b03 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -3,33 +3,62 @@ name = "gemini-fastapi" version = "1.0.0" description = "FastAPI Server built on Gemini Web API" readme = "README.md" -requires-python = "==3.12.*" +requires-python = "==3.13.*" dependencies = [ - "fastapi>=0.129.0", + "curl-cffi>=0.14.0", + "fastapi>=0.134.0", "gemini-webapi>=1.19.2", + "httptools>=0.7.1", "lmdb>=1.7.5", "loguru>=0.7.3", "orjson>=3.11.7", - "pydantic-settings[yaml]>=2.12.0", - "uvicorn>=0.40.0", + "pydantic-settings[yaml]>=2.13.1", + "uvicorn>=0.41.0", "uvloop>=0.22.1; sys_platform != 'win32'", ] +[project.urls] +Repository = "https://github.com/Nativu5/Gemini-FastAPI" + [project.optional-dependencies] dev = [ - "ruff>=0.15.0", + "pytest>=9.0.2", + "ruff>=0.15.4", +] + +[dependency-groups] +dev = [ + "gemini-fastapi[dev]", ] [tool.ruff] line-length = 100 -lint.select = ["E", "F", "W", "I", "RUF"] -lint.ignore = ["E501"] +target-version = "py313" + +[tool.ruff.lint] +select = [ + "E", # pycodestyle errors + "F", # pyflakes + "W", # pycodestyle warnings + "I", # isort + "UP", # pyupgrade + "B", # flake8-bugbear + "C4", # flake8-comprehensions + "SIM", # flake8-simplify + "RUF", # ruff-specific rules + "TID", # flake8-tidy-imports +] +ignore = [ + "E501", # line too long +] + +[tool.ruff.lint.flake8-bugbear] +extend-immutable-calls = [ + "fastapi.Depends", + "fastapi.Query", + "fastapi.security.HTTPBearer", +] [tool.ruff.format] quote-style = "double" 
indent-style = "space" - -[dependency-groups] -dev = [ - "ruff>=0.15.1", -] diff --git a/scripts/dump_lmdb.py b/scripts/dump_lmdb.py index a331325..889af4f 100644 --- a/scripts/dump_lmdb.py +++ b/scripts/dump_lmdb.py @@ -1,6 +1,7 @@ import argparse +from collections.abc import Iterable from pathlib import Path -from typing import Any, Iterable, List +from typing import Any import lmdb import orjson @@ -14,17 +15,17 @@ def _decode_value(value: bytes) -> Any: return value.decode("utf-8", errors="replace") -def _dump_all(txn: lmdb.Transaction) -> List[dict[str, Any]]: +def _dump_all(txn: lmdb.Transaction) -> list[dict[str, Any]]: """Return all records from the database.""" - result: List[dict[str, Any]] = [] + result: list[dict[str, Any]] = [] for key, value in txn.cursor(): result.append({"key": key.decode("utf-8"), "value": _decode_value(value)}) return result -def _dump_selected(txn: lmdb.Transaction, keys: Iterable[str]) -> List[dict[str, Any]]: +def _dump_selected(txn: lmdb.Transaction, keys: Iterable[str]) -> list[dict[str, Any]]: """Return records for the provided keys.""" - result: List[dict[str, Any]] = [] + result: list[dict[str, Any]] = [] for key in keys: raw = txn.get(key.encode("utf-8")) if raw is not None: @@ -36,10 +37,7 @@ def dump_lmdb(path: Path, keys: Iterable[str] | None = None) -> None: """Print selected or all key-value pairs from the LMDB database.""" env = lmdb.open(str(path), readonly=True, lock=False) with env.begin() as txn: - if keys: - records = _dump_selected(txn, keys) - else: - records = _dump_all(txn) + records = _dump_selected(txn, keys) if keys else _dump_all(txn) env.close() print(orjson.dumps(records, option=orjson.OPT_INDENT_2).decode("utf-8")) diff --git a/uv.lock b/uv.lock index 5b687e4..8fb1f99 100644 --- a/uv.lock +++ b/uv.lock @@ -1,6 +1,6 @@ version = 1 revision = 3 -requires-python = "==3.12.*" +requires-python = "==3.13.*" [[package]] name = "annotated-doc" @@ -26,7 +26,6 @@ version = "4.12.1" source = { registry = 
"https://pypi.org/simple" } dependencies = [ { name = "idna" }, - { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/96/f0/5eb65b2bb0d09ac6776f2eb54adee6abe8228ea05b20a5ad0e4945de8aac/anyio-4.12.1.tar.gz", hash = "sha256:41cfcc3a4c85d3f05c932da7c26d0201ac36f72abd4435ba90d0464a3ffed703", size = 228685, upload-time = "2026-01-06T11:45:21.246Z" } wheels = [ @@ -35,11 +34,34 @@ wheels = [ [[package]] name = "certifi" -version = "2026.1.4" +version = "2026.2.25" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/2d/7bf41579a8986e348fa033a31cdd0e4121114f6bce2457e8876010b092dd/certifi-2026.2.25.tar.gz", hash = "sha256:e887ab5cee78ea814d3472169153c2d12cd43b14bd03329a39a9c6e2e80bfba7", size = 155029, upload-time = "2026-02-25T02:54:17.342Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9a/3c/c17fb3ca2d9c3acff52e30b309f538586f9f5b9c9cf454f3845fc9af4881/certifi-2026.2.25-py3-none-any.whl", hash = "sha256:027692e4402ad994f1c42e52a4997a9763c646b73e4096e4d5d6db8af1d6f0fa", size = 153684, upload-time = "2026-02-25T02:54:15.766Z" }, +] + +[[package]] +name = "cffi" +version = "2.0.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/e0/2d/a891ca51311197f6ad14a7ef42e2399f36cf2f9bd44752b3dc4eab60fdc5/certifi-2026.1.4.tar.gz", hash = "sha256:ac726dd470482006e014ad384921ed6438c457018f4b3d204aea4281258b2120", size = 154268, upload-time = "2026-01-04T02:42:41.825Z" } +dependencies = [ + { name = "pycparser", marker = "implementation_name != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" } wheels = [ - { url = 
"https://files.pythonhosted.org/packages/e6/ad/3cc14f097111b4de0040c83a525973216457bbeeb63739ef1ed275c1c021/certifi-2026.1.4-py3-none-any.whl", hash = "sha256:9943707519e4add1115f44c2bc244f782c0249876bf51b6599fee1ffbedd685c", size = 152900, upload-time = "2026-01-04T02:42:40.15Z" }, + { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, + { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" }, + { url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" }, + { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, + { url = 
"https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, + { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, + { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, + { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, + { url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" }, + { url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" }, + { url = 
"https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" }, ] [[package]] @@ -63,9 +85,32 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, ] +[[package]] +name = "curl-cffi" +version = "0.14.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "cffi" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9b/c9/0067d9a25ed4592b022d4558157fcdb6e123516083700786d38091688767/curl_cffi-0.14.0.tar.gz", hash = "sha256:5ffbc82e59f05008ec08ea432f0e535418823cda44178ee518906a54f27a5f0f", size = 162633, upload-time = "2025-12-16T03:25:07.931Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/aa/f0/0f21e9688eaac85e705537b3a87a5588d0cefb2f09d83e83e0e8be93aa99/curl_cffi-0.14.0-cp39-abi3-macosx_14_0_arm64.whl", hash = "sha256:e35e89c6a69872f9749d6d5fda642ed4fc159619329e99d577d0104c9aad5893", size = 3087277, upload-time = "2025-12-16T03:24:49.607Z" }, + { url = "https://files.pythonhosted.org/packages/ba/a3/0419bd48fce5b145cb6a2344c6ac17efa588f5b0061f212c88e0723da026/curl_cffi-0.14.0-cp39-abi3-macosx_15_0_x86_64.whl", hash = "sha256:5945478cd28ad7dfb5c54473bcfb6743ee1d66554d57951fdf8fc0e7d8cf4e45", size = 5804650, upload-time = "2025-12-16T03:24:51.518Z" }, + { url = "https://files.pythonhosted.org/packages/e2/07/a238dd062b7841b8caa2fa8a359eb997147ff3161288f0dd46654d898b4d/curl_cffi-0.14.0-cp39-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:c42e8fa3c667db9ccd2e696ee47adcd3cd5b0838d7282f3fc45f6c0ef3cfdfa7", size = 8231918, upload-time = "2025-12-16T03:24:52.862Z" }, + { url = "https://files.pythonhosted.org/packages/7c/d2/ce907c9b37b5caf76ac08db40cc4ce3d9f94c5500db68a195af3513eacbc/curl_cffi-0.14.0-cp39-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:060fe2c99c41d3cb7f894de318ddf4b0301b08dca70453d769bd4e74b36b8483", size = 8654624, upload-time = "2025-12-16T03:24:54.579Z" }, + { url = "https://files.pythonhosted.org/packages/f2/ae/6256995b18c75e6ef76b30753a5109e786813aa79088b27c8eabb1ef85c9/curl_cffi-0.14.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b158c41a25388690dd0d40b5bc38d1e0f512135f17fdb8029868cbc1993d2e5b", size = 8010654, upload-time = "2025-12-16T03:24:56.507Z" }, + { url = "https://files.pythonhosted.org/packages/fb/10/ff64249e516b103cb762e0a9dca3ee0f04cf25e2a1d5d9838e0f1273d071/curl_cffi-0.14.0-cp39-abi3-manylinux_2_28_i686.whl", hash = "sha256:1439fbef3500fb723333c826adf0efb0e2e5065a703fb5eccce637a2250db34a", size = 7781969, upload-time = "2025-12-16T03:24:57.885Z" }, + { url = "https://files.pythonhosted.org/packages/51/76/d6f7bb76c2d12811aa7ff16f5e17b678abdd1b357b9a8ac56310ceccabd5/curl_cffi-0.14.0-cp39-abi3-manylinux_2_34_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e7176f2c2d22b542e3cf261072a81deb018cfa7688930f95dddef215caddb469", size = 7969133, upload-time = "2025-12-16T03:24:59.261Z" }, + { url = "https://files.pythonhosted.org/packages/23/7c/cca39c0ed4e1772613d3cba13091c0e9d3b89365e84b9bf9838259a3cd8f/curl_cffi-0.14.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:03f21ade2d72978c2bb8670e9b6de5260e2755092b02d94b70b906813662998d", size = 9080167, upload-time = "2025-12-16T03:25:00.946Z" }, + { url = "https://files.pythonhosted.org/packages/75/03/a942d7119d3e8911094d157598ae0169b1c6ca1bd3f27d7991b279bcc45b/curl_cffi-0.14.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = 
"sha256:58ebf02de64ee5c95613209ddacb014c2d2f86298d7080c0a1c12ed876ee0690", size = 9520464, upload-time = "2025-12-16T03:25:02.922Z" }, + { url = "https://files.pythonhosted.org/packages/a2/77/78900e9b0833066d2274bda75cba426fdb4cef7fbf6a4f6a6ca447607bec/curl_cffi-0.14.0-cp39-abi3-win_amd64.whl", hash = "sha256:6e503f9a103f6ae7acfb3890c843b53ec030785a22ae7682a22cc43afb94123e", size = 1677416, upload-time = "2025-12-16T03:25:04.902Z" }, + { url = "https://files.pythonhosted.org/packages/5c/7c/d2ba86b0b3e1e2830bd94163d047de122c69a8df03c5c7c36326c456ad82/curl_cffi-0.14.0-cp39-abi3-win_arm64.whl", hash = "sha256:2eed50a969201605c863c4c31269dfc3e0da52916086ac54553cfa353022425c", size = 1425067, upload-time = "2025-12-16T03:25:06.454Z" }, +] + [[package]] name = "fastapi" -version = "0.129.0" +version = "0.134.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "annotated-doc" }, @@ -74,9 +119,9 @@ dependencies = [ { name = "typing-extensions" }, { name = "typing-inspection" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/48/47/75f6bea02e797abff1bca968d5997793898032d9923c1935ae2efdece642/fastapi-0.129.0.tar.gz", hash = "sha256:61315cebd2e65df5f97ec298c888f9de30430dd0612d59d6480beafbc10655af", size = 375450, upload-time = "2026-02-12T13:54:52.541Z" } +sdist = { url = "https://files.pythonhosted.org/packages/96/15/647ea81cb73b55b48fb095158a9cd64e42e9e4f1d34dbb5cc4a4939779d6/fastapi-0.134.0.tar.gz", hash = "sha256:3122b1ea0dbeaab48b5976e80b99ca7eda02be154bf03e126a33220e73255a9a", size = 385667, upload-time = "2026-02-27T21:18:12.931Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/9e/dd/d0ee25348ac58245ee9f90b6f3cbb666bf01f69be7e0911f9851bddbda16/fastapi-0.129.0-py3-none-any.whl", hash = "sha256:b4946880e48f462692b31c083be0432275cbfb6e2274566b1be91479cc1a84ec", size = 102950, upload-time = "2026-02-12T13:54:54.528Z" }, + { url = 
"https://files.pythonhosted.org/packages/e3/e6/fd49c28a54b7d6f5c64045155e40f6cff9ed4920055043fb5ac7969f7f2f/fastapi-0.134.0-py3-none-any.whl", hash = "sha256:f4e7214f24b2262258492e05c48cf21125e4ffc427e30dd32fb4f74049a3d56a", size = 110404, upload-time = "2026-02-27T21:18:10.809Z" }, ] [[package]] @@ -84,8 +129,10 @@ name = "gemini-fastapi" version = "1.0.0" source = { virtual = "." } dependencies = [ + { name = "curl-cffi" }, { name = "fastapi" }, { name = "gemini-webapi" }, + { name = "httptools" }, { name = "lmdb" }, { name = "loguru" }, { name = "orjson" }, @@ -96,30 +143,34 @@ dependencies = [ [package.optional-dependencies] dev = [ + { name = "pytest" }, { name = "ruff" }, ] [package.dev-dependencies] dev = [ - { name = "ruff" }, + { name = "gemini-fastapi", extra = ["dev"] }, ] [package.metadata] requires-dist = [ - { name = "fastapi", specifier = ">=0.129.0" }, + { name = "curl-cffi", specifier = ">=0.14.0" }, + { name = "fastapi", specifier = ">=0.133.1" }, { name = "gemini-webapi", specifier = ">=1.19.2" }, + { name = "httptools", specifier = ">=0.7.1" }, { name = "lmdb", specifier = ">=1.7.5" }, { name = "loguru", specifier = ">=0.7.3" }, { name = "orjson", specifier = ">=3.11.7" }, - { name = "pydantic-settings", extras = ["yaml"], specifier = ">=2.12.0" }, - { name = "ruff", marker = "extra == 'dev'", specifier = ">=0.15.0" }, - { name = "uvicorn", specifier = ">=0.40.0" }, + { name = "pydantic-settings", extras = ["yaml"], specifier = ">=2.13.1" }, + { name = "pytest", marker = "extra == 'dev'", specifier = ">=9.0.2" }, + { name = "ruff", marker = "extra == 'dev'", specifier = ">=0.15.4" }, + { name = "uvicorn", specifier = ">=0.41.0" }, { name = "uvloop", marker = "sys_platform != 'win32'", specifier = ">=0.22.1" }, ] provides-extras = ["dev"] [package.metadata.requires-dev] -dev = [{ name = "ruff", specifier = ">=0.15.1" }] +dev = [{ name = "gemini-fastapi", extras = ["dev"] }] [[package]] name = "gemini-webapi" @@ -180,6 +231,21 @@ wheels = [ { url 
= "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" }, ] +[[package]] +name = "httptools" +version = "0.7.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b5/46/120a669232c7bdedb9d52d4aeae7e6c7dfe151e99dc70802e2fc7a5e1993/httptools-0.7.1.tar.gz", hash = "sha256:abd72556974f8e7c74a259655924a717a2365b236c882c3f6f8a45fe94703ac9", size = 258961, upload-time = "2025-10-10T03:55:08.559Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/09/8f/c77b1fcbfd262d422f12da02feb0d218fa228d52485b77b953832105bb90/httptools-0.7.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6babce6cfa2a99545c60bfef8bee0cc0545413cb0018f617c8059a30ad985de3", size = 202889, upload-time = "2025-10-10T03:54:47.089Z" }, + { url = "https://files.pythonhosted.org/packages/0a/1a/22887f53602feaa066354867bc49a68fc295c2293433177ee90870a7d517/httptools-0.7.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:601b7628de7504077dd3dcb3791c6b8694bbd967148a6d1f01806509254fb1ca", size = 108180, upload-time = "2025-10-10T03:54:48.052Z" }, + { url = "https://files.pythonhosted.org/packages/32/6a/6aaa91937f0010d288d3d124ca2946d48d60c3a5ee7ca62afe870e3ea011/httptools-0.7.1-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:04c6c0e6c5fb0739c5b8a9eb046d298650a0ff38cf42537fc372b28dc7e4472c", size = 478596, upload-time = "2025-10-10T03:54:48.919Z" }, + { url = "https://files.pythonhosted.org/packages/6d/70/023d7ce117993107be88d2cbca566a7c1323ccbaf0af7eabf2064fe356f6/httptools-0.7.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:69d4f9705c405ae3ee83d6a12283dc9feba8cc6aaec671b412917e644ab4fa66", size = 473268, 
upload-time = "2025-10-10T03:54:49.993Z" }, + { url = "https://files.pythonhosted.org/packages/32/4d/9dd616c38da088e3f436e9a616e1d0cc66544b8cdac405cc4e81c8679fc7/httptools-0.7.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:44c8f4347d4b31269c8a9205d8a5ee2df5322b09bbbd30f8f862185bb6b05346", size = 455517, upload-time = "2025-10-10T03:54:51.066Z" }, + { url = "https://files.pythonhosted.org/packages/1d/3a/a6c595c310b7df958e739aae88724e24f9246a514d909547778d776799be/httptools-0.7.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:465275d76db4d554918aba40bf1cbebe324670f3dfc979eaffaa5d108e2ed650", size = 458337, upload-time = "2025-10-10T03:54:52.196Z" }, + { url = "https://files.pythonhosted.org/packages/fd/82/88e8d6d2c51edc1cc391b6e044c6c435b6aebe97b1abc33db1b0b24cd582/httptools-0.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:322d00c2068d125bd570f7bf78b2d367dad02b919d8581d7476d8b75b294e3e6", size = 85743, upload-time = "2025-10-10T03:54:53.448Z" }, +] + [[package]] name = "httpx" version = "0.28.1" @@ -218,18 +284,27 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" }, ] +[[package]] +name = "iniconfig" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, 
upload-time = "2025-10-18T21:55:41.639Z" }, +] + [[package]] name = "lmdb" version = "1.7.5" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/c7/a3/3756f2c6adba4a1413dba55e6c81a20b38a868656517308533e33cb59e1c/lmdb-1.7.5.tar.gz", hash = "sha256:f0604751762cb097059d5412444c4057b95f386c7ed958363cf63f453e5108da", size = 883490, upload-time = "2025-10-15T03:39:44.038Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/34/b4/8b862c4d7fd6f68cb33e2a919169fda8924121dc5ff61e3cc105304a6dd4/lmdb-1.7.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b48c2359eea876d7b634b49f84019ecc8c1626da97c795fc7b39a793676815df", size = 100910, upload-time = "2025-10-15T03:39:00.727Z" }, - { url = "https://files.pythonhosted.org/packages/27/64/8ab5da48180d5f13a293ea00a9f8758b1bee080e76ea0ab0d6be0d51b55f/lmdb-1.7.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2f84793baeb430ba984eb6c1b4e08c0a508b1c03e79ce79fcda0f29ecc06a95a", size = 99376, upload-time = "2025-10-15T03:39:01.791Z" }, - { url = "https://files.pythonhosted.org/packages/43/e0/51bc942fe5ed3fce69c631b54f52d97785de3d94487376139be6de1e199a/lmdb-1.7.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:68cc21314a33faac1b749645a976b7655e7fa7cc104a72365d2429d2db7f6342", size = 298556, upload-time = "2025-10-15T03:39:02.787Z" }, - { url = "https://files.pythonhosted.org/packages/66/c5/19ea75c88b91d12da5c6f4bbe2aca633047b6b270fd613d557583d32cc5c/lmdb-1.7.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f2d9b7e102fcfe5e0cfb3acdebd403eb55ccbe5f7202d8f49d60bdafb1546d1e", size = 299449, upload-time = "2025-10-15T03:39:03.903Z" }, - { url = "https://files.pythonhosted.org/packages/1b/74/365194203dbff47d3a1621366d6a1133cdcce261f4ac0e1d0496f01e6ace/lmdb-1.7.5-cp312-cp312-win_amd64.whl", hash = "sha256:69de89cc79e03e191fc6f95797f1bef91b45c415d1ea9d38872b00b2d989a50f", 
size = 99328, upload-time = "2025-10-15T03:39:04.949Z" }, - { url = "https://files.pythonhosted.org/packages/3f/3a/a441afebff5bd761f7f58d194fed7ac265279964957479a5c8a51c42f9ad/lmdb-1.7.5-cp312-cp312-win_arm64.whl", hash = "sha256:0c880ee4b309e900f2d58a710701f5e6316a351878588c6a95a9c0bcb640680b", size = 94191, upload-time = "2025-10-15T03:39:05.975Z" }, + { url = "https://files.pythonhosted.org/packages/38/f8/03275084218eacdbdf7e185d693e1db4cb79c35d18fac47fa0d388522a0d/lmdb-1.7.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:66ae02fa6179e46bb69fe446b7e956afe8706ae17ec1d4cd9f7056e161019156", size = 101508, upload-time = "2025-10-15T03:39:07.228Z" }, + { url = "https://files.pythonhosted.org/packages/20/b9/bc33ae2e4940359ba2fc412e6a755a2f126bc5062b4aaf35edd3a791f9a5/lmdb-1.7.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:bf65c573311ac8330c7908257f76b28ae3576020123400a81a6b650990dc028c", size = 100105, upload-time = "2025-10-15T03:39:08.491Z" }, + { url = "https://files.pythonhosted.org/packages/fa/f6/22f84b776a64d3992f052ecb637c35f1764a39df4f2190ecc5a3a1295bd7/lmdb-1.7.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97bcb3fc12841a8828db918e494fe0fd016a73d2680ad830d75719bb3bf4e76a", size = 301500, upload-time = "2025-10-15T03:39:09.463Z" }, + { url = "https://files.pythonhosted.org/packages/2a/4d/8e6be8d7d5a30d47fa0ce4b55e3a8050ad689556e6e979d206b4ac67b733/lmdb-1.7.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:865f374f6206ab4aacb92ffb1dc612ee1a31a421db7c89733abe06b81ac87cb0", size = 302285, upload-time = "2025-10-15T03:39:10.856Z" }, + { url = "https://files.pythonhosted.org/packages/5e/dc/7e04fb31a8f88951db81ac677e3ccb3e09248eda40e6ad52f74fd9370c32/lmdb-1.7.5-cp313-cp313-win_amd64.whl", hash = "sha256:82a04d5ca2a6a799c8db7f209354c48aebb49ff338530f5813721fc4c68e4450", size = 99447, upload-time = "2025-10-15T03:39:12.151Z" }, + { url = 
"https://files.pythonhosted.org/packages/5b/50/e3f97efab17b3fad4afde99b3c957ecac4ffbefada6874a57ad0c695660a/lmdb-1.7.5-cp313-cp313-win_arm64.whl", hash = "sha256:0ad85a15acbfe8a42fdef92ee5e869610286d38507e976755f211be0fc905ca7", size = 94145, upload-time = "2025-10-15T03:39:13.461Z" }, { url = "https://files.pythonhosted.org/packages/bd/2c/982cb5afed533d0cb8038232b40c19b5b85a2d887dec74dfd39e8351ef4b/lmdb-1.7.5-py3-none-any.whl", hash = "sha256:fc344bb8bc0786c87c4ccb19b31f09a38c08bd159ada6f037d669426fea06f03", size = 148539, upload-time = "2025-10-15T03:39:42.982Z" }, ] @@ -252,21 +327,48 @@ version = "3.11.7" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/53/45/b268004f745ede84e5798b48ee12b05129d19235d0e15267aa57dcdb400b/orjson-3.11.7.tar.gz", hash = "sha256:9b1a67243945819ce55d24a30b59d6a168e86220452d2c96f4d1f093e71c0c49", size = 6144992, upload-time = "2026-02-02T15:38:49.29Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/80/bf/76f4f1665f6983385938f0e2a5d7efa12a58171b8456c252f3bae8a4cf75/orjson-3.11.7-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:bd03ea7606833655048dab1a00734a2875e3e86c276e1d772b2a02556f0d895f", size = 228545, upload-time = "2026-02-02T15:37:46.376Z" }, - { url = "https://files.pythonhosted.org/packages/79/53/6c72c002cb13b5a978a068add59b25a8bdf2800ac1c9c8ecdb26d6d97064/orjson-3.11.7-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:89e440ebc74ce8ab5c7bc4ce6757b4a6b1041becb127df818f6997b5c71aa60b", size = 125224, upload-time = "2026-02-02T15:37:47.697Z" }, - { url = "https://files.pythonhosted.org/packages/2c/83/10e48852865e5dd151bdfe652c06f7da484578ed02c5fca938e3632cb0b8/orjson-3.11.7-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ede977b5fe5ac91b1dffc0a517ca4542d2ec8a6a4ff7b2652d94f640796342a", size = 128154, upload-time = "2026-02-02T15:37:48.954Z" }, - { url = 
"https://files.pythonhosted.org/packages/6e/52/a66e22a2b9abaa374b4a081d410edab6d1e30024707b87eab7c734afe28d/orjson-3.11.7-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b7b1dae39230a393df353827c855a5f176271c23434cfd2db74e0e424e693e10", size = 123548, upload-time = "2026-02-02T15:37:50.187Z" }, - { url = "https://files.pythonhosted.org/packages/de/38/605d371417021359f4910c496f764c48ceb8997605f8c25bf1dfe58c0ebe/orjson-3.11.7-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ed46f17096e28fb28d2975834836a639af7278aa87c84f68ab08fbe5b8bd75fa", size = 129000, upload-time = "2026-02-02T15:37:51.426Z" }, - { url = "https://files.pythonhosted.org/packages/44/98/af32e842b0ffd2335c89714d48ca4e3917b42f5d6ee5537832e069a4b3ac/orjson-3.11.7-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3726be79e36e526e3d9c1aceaadbfb4a04ee80a72ab47b3f3c17fefb9812e7b8", size = 141686, upload-time = "2026-02-02T15:37:52.607Z" }, - { url = "https://files.pythonhosted.org/packages/96/0b/fc793858dfa54be6feee940c1463370ece34b3c39c1ca0aa3845f5ba9892/orjson-3.11.7-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0724e265bc548af1dedebd9cb3d24b4e1c1e685a343be43e87ba922a5c5fff2f", size = 130812, upload-time = "2026-02-02T15:37:53.944Z" }, - { url = "https://files.pythonhosted.org/packages/dc/91/98a52415059db3f374757d0b7f0f16e3b5cd5976c90d1c2b56acaea039e6/orjson-3.11.7-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e7745312efa9e11c17fbd3cb3097262d079da26930ae9ae7ba28fb738367cbad", size = 133440, upload-time = "2026-02-02T15:37:55.615Z" }, - { url = "https://files.pythonhosted.org/packages/dc/b6/cb540117bda61791f46381f8c26c8f93e802892830a6055748d3bb1925ab/orjson-3.11.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:f904c24bdeabd4298f7a977ef14ca2a022ca921ed670b92ecd16ab6f3d01f867", size = 138386, upload-time = "2026-02-02T15:37:56.814Z" }, - { url = 
"https://files.pythonhosted.org/packages/63/1a/50a3201c334a7f17c231eee5f841342190723794e3b06293f26e7cf87d31/orjson-3.11.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b9fc4d0f81f394689e0814617aadc4f2ea0e8025f38c226cbf22d3b5ddbf025d", size = 408853, upload-time = "2026-02-02T15:37:58.291Z" }, - { url = "https://files.pythonhosted.org/packages/87/cd/8de1c67d0be44fdc22701e5989c0d015a2adf391498ad42c4dc589cd3013/orjson-3.11.7-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:849e38203e5be40b776ed2718e587faf204d184fc9a008ae441f9442320c0cab", size = 144130, upload-time = "2026-02-02T15:38:00.163Z" }, - { url = "https://files.pythonhosted.org/packages/0f/fe/d605d700c35dd55f51710d159fc54516a280923cd1b7e47508982fbb387d/orjson-3.11.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4682d1db3bcebd2b64757e0ddf9e87ae5f00d29d16c5cdf3a62f561d08cc3dd2", size = 134818, upload-time = "2026-02-02T15:38:01.507Z" }, - { url = "https://files.pythonhosted.org/packages/e4/e4/15ecc67edb3ddb3e2f46ae04475f2d294e8b60c1825fbe28a428b93b3fbd/orjson-3.11.7-cp312-cp312-win32.whl", hash = "sha256:f4f7c956b5215d949a1f65334cf9d7612dde38f20a95f2315deef167def91a6f", size = 127923, upload-time = "2026-02-02T15:38:02.75Z" }, - { url = "https://files.pythonhosted.org/packages/34/70/2e0855361f76198a3965273048c8e50a9695d88cd75811a5b46444895845/orjson-3.11.7-cp312-cp312-win_amd64.whl", hash = "sha256:bf742e149121dc5648ba0a08ea0871e87b660467ef168a3a5e53bc1fbd64bb74", size = 125007, upload-time = "2026-02-02T15:38:04.032Z" }, - { url = "https://files.pythonhosted.org/packages/68/40/c2051bd19fc467610fed469dc29e43ac65891571138f476834ca192bc290/orjson-3.11.7-cp312-cp312-win_arm64.whl", hash = "sha256:26c3b9132f783b7d7903bf1efb095fed8d4a3a85ec0d334ee8beff3d7a4749d5", size = 126089, upload-time = "2026-02-02T15:38:05.297Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/25/6e0e52cac5aab51d7b6dcd257e855e1dec1c2060f6b28566c509b4665f62/orjson-3.11.7-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:1d98b30cc1313d52d4af17d9c3d307b08389752ec5f2e5febdfada70b0f8c733", size = 228390, upload-time = "2026-02-02T15:38:06.8Z" }, + { url = "https://files.pythonhosted.org/packages/a5/29/a77f48d2fc8a05bbc529e5ff481fb43d914f9e383ea2469d4f3d51df3d00/orjson-3.11.7-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:d897e81f8d0cbd2abb82226d1860ad2e1ab3ff16d7b08c96ca00df9d45409ef4", size = 125189, upload-time = "2026-02-02T15:38:08.181Z" }, + { url = "https://files.pythonhosted.org/packages/89/25/0a16e0729a0e6a1504f9d1a13cdd365f030068aab64cec6958396b9969d7/orjson-3.11.7-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:814be4b49b228cfc0b3c565acf642dd7d13538f966e3ccde61f4f55be3e20785", size = 128106, upload-time = "2026-02-02T15:38:09.41Z" }, + { url = "https://files.pythonhosted.org/packages/66/da/a2e505469d60666a05ab373f1a6322eb671cb2ba3a0ccfc7d4bc97196787/orjson-3.11.7-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d06e5c5fed5caedd2e540d62e5b1c25e8c82431b9e577c33537e5fa4aa909539", size = 123363, upload-time = "2026-02-02T15:38:10.73Z" }, + { url = "https://files.pythonhosted.org/packages/23/bf/ed73f88396ea35c71b38961734ea4a4746f7ca0768bf28fd551d37e48dd0/orjson-3.11.7-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:31c80ce534ac4ea3739c5ee751270646cbc46e45aea7576a38ffec040b4029a1", size = 129007, upload-time = "2026-02-02T15:38:12.138Z" }, + { url = "https://files.pythonhosted.org/packages/73/3c/b05d80716f0225fc9008fbf8ab22841dcc268a626aa550561743714ce3bf/orjson-3.11.7-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f50979824bde13d32b4320eedd513431c921102796d86be3eee0b58e58a3ecd1", size = 141667, upload-time = "2026-02-02T15:38:13.398Z" }, + { url = 
"https://files.pythonhosted.org/packages/61/e8/0be9b0addd9bf86abfc938e97441dcd0375d494594b1c8ad10fe57479617/orjson-3.11.7-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e54f3808e2b6b945078c41aa8d9b5834b28c50843846e97807e5adb75fa9705", size = 130832, upload-time = "2026-02-02T15:38:14.698Z" }, + { url = "https://files.pythonhosted.org/packages/c9/ec/c68e3b9021a31d9ec15a94931db1410136af862955854ed5dd7e7e4f5bff/orjson-3.11.7-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a12b80df61aab7b98b490fe9e4879925ba666fccdfcd175252ce4d9035865ace", size = 133373, upload-time = "2026-02-02T15:38:16.109Z" }, + { url = "https://files.pythonhosted.org/packages/d2/45/f3466739aaafa570cc8e77c6dbb853c48bf56e3b43738020e2661e08b0ac/orjson-3.11.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:996b65230271f1a97026fd0e6a753f51fbc0c335d2ad0c6201f711b0da32693b", size = 138307, upload-time = "2026-02-02T15:38:17.453Z" }, + { url = "https://files.pythonhosted.org/packages/e1/84/9f7f02288da1ffb31405c1be07657afd1eecbcb4b64ee2817b6fe0f785fa/orjson-3.11.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ab49d4b2a6a1d415ddb9f37a21e02e0d5dbfe10b7870b21bf779fc21e9156157", size = 408695, upload-time = "2026-02-02T15:38:18.831Z" }, + { url = "https://files.pythonhosted.org/packages/18/07/9dd2f0c0104f1a0295ffbe912bc8d63307a539b900dd9e2c48ef7810d971/orjson-3.11.7-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:390a1dce0c055ddf8adb6aa94a73b45a4a7d7177b5c584b8d1c1947f2ba60fb3", size = 144099, upload-time = "2026-02-02T15:38:20.28Z" }, + { url = "https://files.pythonhosted.org/packages/a5/66/857a8e4a3292e1f7b1b202883bcdeb43a91566cf59a93f97c53b44bd6801/orjson-3.11.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1eb80451a9c351a71dfaf5b7ccc13ad065405217726b59fdbeadbcc544f9d223", size = 134806, upload-time = "2026-02-02T15:38:22.186Z" }, + { url = 
"https://files.pythonhosted.org/packages/0a/5b/6ebcf3defc1aab3a338ca777214966851e92efb1f30dc7fc8285216e6d1b/orjson-3.11.7-cp313-cp313-win32.whl", hash = "sha256:7477aa6a6ec6139c5cb1cc7b214643592169a5494d200397c7fc95d740d5fcf3", size = 127914, upload-time = "2026-02-02T15:38:23.511Z" }, + { url = "https://files.pythonhosted.org/packages/00/04/c6f72daca5092e3117840a1b1e88dfc809cc1470cf0734890d0366b684a1/orjson-3.11.7-cp313-cp313-win_amd64.whl", hash = "sha256:b9f95dcdea9d4f805daa9ddf02617a89e484c6985fa03055459f90e87d7a0757", size = 124986, upload-time = "2026-02-02T15:38:24.836Z" }, + { url = "https://files.pythonhosted.org/packages/03/ba/077a0f6f1085d6b806937246860fafbd5b17f3919c70ee3f3d8d9c713f38/orjson-3.11.7-cp313-cp313-win_arm64.whl", hash = "sha256:800988273a014a0541483dc81021247d7eacb0c845a9d1a34a422bc718f41539", size = 126045, upload-time = "2026-02-02T15:38:26.216Z" }, +] + +[[package]] +name = "packaging" +version = "26.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/65/ee/299d360cdc32edc7d2cf530f3accf79c4fca01e96ffc950d8a52213bd8e4/packaging-26.0.tar.gz", hash = "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4", size = 143416, upload-time = "2026-01-21T20:50:39.064Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/b9/c538f279a4e237a006a2c98387d081e9eb060d203d8ed34467cc0f0b9b53/packaging-26.0-py3-none-any.whl", hash = "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529", size = 74366, upload-time = "2026-01-21T20:50:37.788Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url 
= "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "pycparser" +version = "3.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" }, ] [[package]] @@ -293,38 +395,34 @@ dependencies = [ ] sdist = { url = "https://files.pythonhosted.org/packages/71/70/23b021c950c2addd24ec408e9ab05d59b035b39d97cdc1130e1bce647bb6/pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e", size = 460952, upload-time = "2025-11-04T13:43:49.098Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/5f/5d/5f6c63eebb5afee93bcaae4ce9a898f3373ca23df3ccaef086d0233a35a7/pydantic_core-2.41.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f41a7489d32336dbf2199c8c0a215390a751c5b014c2c1c5366e817202e9cdf7", size = 2110990, upload-time = "2025-11-04T13:39:58.079Z" }, - { url = "https://files.pythonhosted.org/packages/aa/32/9c2e8ccb57c01111e0fd091f236c7b371c1bccea0fa85247ac55b1e2b6b6/pydantic_core-2.41.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:070259a8818988b9a84a449a2a7337c7f430a22acc0859c6b110aa7212a6d9c0", size = 1896003, upload-time = "2025-11-04T13:39:59.956Z" }, - { url = 
"https://files.pythonhosted.org/packages/68/b8/a01b53cb0e59139fbc9e4fda3e9724ede8de279097179be4ff31f1abb65a/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e96cea19e34778f8d59fe40775a7a574d95816eb150850a85a7a4c8f4b94ac69", size = 1919200, upload-time = "2025-11-04T13:40:02.241Z" }, - { url = "https://files.pythonhosted.org/packages/38/de/8c36b5198a29bdaade07b5985e80a233a5ac27137846f3bc2d3b40a47360/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed2e99c456e3fadd05c991f8f437ef902e00eedf34320ba2b0842bd1c3ca3a75", size = 2052578, upload-time = "2025-11-04T13:40:04.401Z" }, - { url = "https://files.pythonhosted.org/packages/00/b5/0e8e4b5b081eac6cb3dbb7e60a65907549a1ce035a724368c330112adfdd/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:65840751b72fbfd82c3c640cff9284545342a4f1eb1586ad0636955b261b0b05", size = 2208504, upload-time = "2025-11-04T13:40:06.072Z" }, - { url = "https://files.pythonhosted.org/packages/77/56/87a61aad59c7c5b9dc8caad5a41a5545cba3810c3e828708b3d7404f6cef/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e536c98a7626a98feb2d3eaf75944ef6f3dbee447e1f841eae16f2f0a72d8ddc", size = 2335816, upload-time = "2025-11-04T13:40:07.835Z" }, - { url = "https://files.pythonhosted.org/packages/0d/76/941cc9f73529988688a665a5c0ecff1112b3d95ab48f81db5f7606f522d3/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eceb81a8d74f9267ef4081e246ffd6d129da5d87e37a77c9bde550cb04870c1c", size = 2075366, upload-time = "2025-11-04T13:40:09.804Z" }, - { url = "https://files.pythonhosted.org/packages/d3/43/ebef01f69baa07a482844faaa0a591bad1ef129253ffd0cdaa9d8a7f72d3/pydantic_core-2.41.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d38548150c39b74aeeb0ce8ee1d8e82696f4a4e16ddc6de7b1d8823f7de4b9b5", size = 2171698, 
upload-time = "2025-11-04T13:40:12.004Z" }, - { url = "https://files.pythonhosted.org/packages/b1/87/41f3202e4193e3bacfc2c065fab7706ebe81af46a83d3e27605029c1f5a6/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c23e27686783f60290e36827f9c626e63154b82b116d7fe9adba1fda36da706c", size = 2132603, upload-time = "2025-11-04T13:40:13.868Z" }, - { url = "https://files.pythonhosted.org/packages/49/7d/4c00df99cb12070b6bccdef4a195255e6020a550d572768d92cc54dba91a/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:482c982f814460eabe1d3bb0adfdc583387bd4691ef00b90575ca0d2b6fe2294", size = 2329591, upload-time = "2025-11-04T13:40:15.672Z" }, - { url = "https://files.pythonhosted.org/packages/cc/6a/ebf4b1d65d458f3cda6a7335d141305dfa19bdc61140a884d165a8a1bbc7/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:bfea2a5f0b4d8d43adf9d7b8bf019fb46fdd10a2e5cde477fbcb9d1fa08c68e1", size = 2319068, upload-time = "2025-11-04T13:40:17.532Z" }, - { url = "https://files.pythonhosted.org/packages/49/3b/774f2b5cd4192d5ab75870ce4381fd89cf218af999515baf07e7206753f0/pydantic_core-2.41.5-cp312-cp312-win32.whl", hash = "sha256:b74557b16e390ec12dca509bce9264c3bbd128f8a2c376eaa68003d7f327276d", size = 1985908, upload-time = "2025-11-04T13:40:19.309Z" }, - { url = "https://files.pythonhosted.org/packages/86/45/00173a033c801cacf67c190fef088789394feaf88a98a7035b0e40d53dc9/pydantic_core-2.41.5-cp312-cp312-win_amd64.whl", hash = "sha256:1962293292865bca8e54702b08a4f26da73adc83dd1fcf26fbc875b35d81c815", size = 2020145, upload-time = "2025-11-04T13:40:21.548Z" }, - { url = "https://files.pythonhosted.org/packages/f9/22/91fbc821fa6d261b376a3f73809f907cec5ca6025642c463d3488aad22fb/pydantic_core-2.41.5-cp312-cp312-win_arm64.whl", hash = "sha256:1746d4a3d9a794cacae06a5eaaccb4b8643a131d45fbc9af23e353dc0a5ba5c3", size = 1976179, upload-time = "2025-11-04T13:40:23.393Z" }, - { url = 
"https://files.pythonhosted.org/packages/09/32/59b0c7e63e277fa7911c2fc70ccfb45ce4b98991e7ef37110663437005af/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:7da7087d756b19037bc2c06edc6c170eeef3c3bafcb8f532ff17d64dc427adfd", size = 2110495, upload-time = "2025-11-04T13:42:49.689Z" }, - { url = "https://files.pythonhosted.org/packages/aa/81/05e400037eaf55ad400bcd318c05bb345b57e708887f07ddb2d20e3f0e98/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:aabf5777b5c8ca26f7824cb4a120a740c9588ed58df9b2d196ce92fba42ff8dc", size = 1915388, upload-time = "2025-11-04T13:42:52.215Z" }, - { url = "https://files.pythonhosted.org/packages/6e/0d/e3549b2399f71d56476b77dbf3cf8937cec5cd70536bdc0e374a421d0599/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c007fe8a43d43b3969e8469004e9845944f1a80e6acd47c150856bb87f230c56", size = 1942879, upload-time = "2025-11-04T13:42:56.483Z" }, - { url = "https://files.pythonhosted.org/packages/f7/07/34573da085946b6a313d7c42f82f16e8920bfd730665de2d11c0c37a74b5/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76d0819de158cd855d1cbb8fcafdf6f5cf1eb8e470abe056d5d161106e38062b", size = 2139017, upload-time = "2025-11-04T13:42:59.471Z" }, + { url = "https://files.pythonhosted.org/packages/87/06/8806241ff1f70d9939f9af039c6c35f2360cf16e93c2ca76f184e76b1564/pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9", size = 2120403, upload-time = "2025-11-04T13:40:25.248Z" }, + { url = "https://files.pythonhosted.org/packages/94/02/abfa0e0bda67faa65fef1c84971c7e45928e108fe24333c81f3bfe35d5f5/pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34", size = 1896206, upload-time 
= "2025-11-04T13:40:27.099Z" }, + { url = "https://files.pythonhosted.org/packages/15/df/a4c740c0943e93e6500f9eb23f4ca7ec9bf71b19e608ae5b579678c8d02f/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0", size = 1919307, upload-time = "2025-11-04T13:40:29.806Z" }, + { url = "https://files.pythonhosted.org/packages/9a/e3/6324802931ae1d123528988e0e86587c2072ac2e5394b4bc2bc34b61ff6e/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33", size = 2063258, upload-time = "2025-11-04T13:40:33.544Z" }, + { url = "https://files.pythonhosted.org/packages/c9/d4/2230d7151d4957dd79c3044ea26346c148c98fbf0ee6ebd41056f2d62ab5/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e", size = 2214917, upload-time = "2025-11-04T13:40:35.479Z" }, + { url = "https://files.pythonhosted.org/packages/e6/9f/eaac5df17a3672fef0081b6c1bb0b82b33ee89aa5cec0d7b05f52fd4a1fa/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2", size = 2332186, upload-time = "2025-11-04T13:40:37.436Z" }, + { url = "https://files.pythonhosted.org/packages/cf/4e/35a80cae583a37cf15604b44240e45c05e04e86f9cfd766623149297e971/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586", size = 2073164, upload-time = "2025-11-04T13:40:40.289Z" }, + { url = "https://files.pythonhosted.org/packages/bf/e3/f6e262673c6140dd3305d144d032f7bd5f7497d3871c1428521f19f9efa2/pydantic_core-2.41.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:b93590ae81f7010dbe380cdeab6f515902ebcbefe0b9327cc4804d74e93ae69d", size = 2179146, upload-time = "2025-11-04T13:40:42.809Z" }, + { url = "https://files.pythonhosted.org/packages/75/c7/20bd7fc05f0c6ea2056a4565c6f36f8968c0924f19b7d97bbfea55780e73/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740", size = 2137788, upload-time = "2025-11-04T13:40:44.752Z" }, + { url = "https://files.pythonhosted.org/packages/3a/8d/34318ef985c45196e004bc46c6eab2eda437e744c124ef0dbe1ff2c9d06b/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e", size = 2340133, upload-time = "2025-11-04T13:40:46.66Z" }, + { url = "https://files.pythonhosted.org/packages/9c/59/013626bf8c78a5a5d9350d12e7697d3d4de951a75565496abd40ccd46bee/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858", size = 2324852, upload-time = "2025-11-04T13:40:48.575Z" }, + { url = "https://files.pythonhosted.org/packages/1a/d9/c248c103856f807ef70c18a4f986693a46a8ffe1602e5d361485da502d20/pydantic_core-2.41.5-cp313-cp313-win32.whl", hash = "sha256:650ae77860b45cfa6e2cdafc42618ceafab3a2d9a3811fcfbd3bbf8ac3c40d36", size = 1994679, upload-time = "2025-11-04T13:40:50.619Z" }, + { url = "https://files.pythonhosted.org/packages/9e/8b/341991b158ddab181cff136acd2552c9f35bd30380422a639c0671e99a91/pydantic_core-2.41.5-cp313-cp313-win_amd64.whl", hash = "sha256:79ec52ec461e99e13791ec6508c722742ad745571f234ea6255bed38c6480f11", size = 2019766, upload-time = "2025-11-04T13:40:52.631Z" }, + { url = "https://files.pythonhosted.org/packages/73/7d/f2f9db34af103bea3e09735bb40b021788a5e834c81eedb541991badf8f5/pydantic_core-2.41.5-cp313-cp313-win_arm64.whl", hash = "sha256:3f84d5c1b4ab906093bdc1ff10484838aca54ef08de4afa9de0f5f14d69639cd", size = 1981005, upload-time = 
"2025-11-04T13:40:54.734Z" }, ] [[package]] name = "pydantic-settings" -version = "2.12.0" +version = "2.13.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pydantic" }, { name = "python-dotenv" }, { name = "typing-inspection" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/43/4b/ac7e0aae12027748076d72a8764ff1c9d82ca75a7a52622e67ed3f765c54/pydantic_settings-2.12.0.tar.gz", hash = "sha256:005538ef951e3c2a68e1c08b292b5f2e71490def8589d4221b95dab00dafcfd0", size = 194184, upload-time = "2025-11-10T14:25:47.013Z" } +sdist = { url = "https://files.pythonhosted.org/packages/52/6d/fffca34caecc4a3f97bda81b2098da5e8ab7efc9a66e819074a11955d87e/pydantic_settings-2.13.1.tar.gz", hash = "sha256:b4c11847b15237fb0171e1462bf540e294affb9b86db4d9aa5c01730bdbe4025", size = 223826, upload-time = "2026-02-19T13:45:08.055Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/c1/60/5d4751ba3f4a40a6891f24eec885f51afd78d208498268c734e256fb13c4/pydantic_settings-2.12.0-py3-none-any.whl", hash = "sha256:fddb9fd99a5b18da837b29710391e945b1e30c135477f484084ee513adb93809", size = 51880, upload-time = "2025-11-10T14:25:45.546Z" }, + { url = "https://files.pythonhosted.org/packages/00/4b/ccc026168948fec4f7555b9164c724cf4125eac006e176541483d2c959be/pydantic_settings-2.13.1-py3-none-any.whl", hash = "sha256:d56fd801823dbeae7f0975e1f8c8e25c258eb75d278ea7abb5d9cebb01b56237", size = 58929, upload-time = "2026-02-19T13:45:06.034Z" }, ] [package.optional-dependencies] @@ -332,6 +430,31 @@ yaml = [ { name = "pyyaml" }, ] +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pytest" +version = "9.0.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d1/db/7ef3487e0fb0049ddb5ce41d3a49c235bf9ad299b6a25d5780a89f19230f/pytest-9.0.2.tar.gz", hash = "sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11", size = 1568901, upload-time = "2025-12-06T21:30:51.014Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" }, +] + [[package]] name = "python-dotenv" version = "1.2.1" @@ -347,41 +470,41 @@ version = "6.0.3" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = "2025-09-25T21:32:11.445Z" }, - { url = 
"https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" }, - { url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" }, - { url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" }, - { url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" }, - { url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" }, - { url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = "2025-09-25T21:32:18.834Z" }, - { url = 
"https://files.pythonhosted.org/packages/74/93/7baea19427dcfbe1e5a372d81473250b379f04b1bd3c4c5ff825e2327202/pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", size = 137658, upload-time = "2025-09-25T21:32:20.209Z" }, - { url = "https://files.pythonhosted.org/packages/86/bf/899e81e4cce32febab4fb42bb97dcdf66bc135272882d1987881a4b519e9/pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", size = 154003, upload-time = "2025-09-25T21:32:21.167Z" }, - { url = "https://files.pythonhosted.org/packages/1a/08/67bd04656199bbb51dbed1439b7f27601dfb576fb864099c7ef0c3e55531/pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", size = 140344, upload-time = "2025-09-25T21:32:22.617Z" }, + { url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" }, + { url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" }, + { url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" }, + { url = 
"https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" }, + { url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" }, + { url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" }, + { url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" }, + { url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" }, + { url = "https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" }, + { url = 
"https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" }, ] [[package]] name = "ruff" -version = "0.15.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/04/dc/4e6ac71b511b141cf626357a3946679abeba4cf67bc7cc5a17920f31e10d/ruff-0.15.1.tar.gz", hash = "sha256:c590fe13fb57c97141ae975c03a1aedb3d3156030cabd740d6ff0b0d601e203f", size = 4540855, upload-time = "2026-02-12T23:09:09.998Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/23/bf/e6e4324238c17f9d9120a9d60aa99a7daaa21204c07fcd84e2ef03bb5fd1/ruff-0.15.1-py3-none-linux_armv6l.whl", hash = "sha256:b101ed7cf4615bda6ffe65bdb59f964e9f4a0d3f85cbf0e54f0ab76d7b90228a", size = 10367819, upload-time = "2026-02-12T23:09:03.598Z" }, - { url = "https://files.pythonhosted.org/packages/b3/ea/c8f89d32e7912269d38c58f3649e453ac32c528f93bb7f4219258be2e7ed/ruff-0.15.1-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:939c995e9277e63ea632cc8d3fae17aa758526f49a9a850d2e7e758bfef46602", size = 10798618, upload-time = "2026-02-12T23:09:22.928Z" }, - { url = "https://files.pythonhosted.org/packages/5e/0f/1d0d88bc862624247d82c20c10d4c0f6bb2f346559d8af281674cf327f15/ruff-0.15.1-py3-none-macosx_11_0_arm64.whl", hash = "sha256:1d83466455fdefe60b8d9c8df81d3c1bbb2115cede53549d3b522ce2bc703899", size = 10148518, upload-time = "2026-02-12T23:08:58.339Z" }, - { url = "https://files.pythonhosted.org/packages/f5/c8/291c49cefaa4a9248e986256df2ade7add79388fe179e0691be06fae6f37/ruff-0.15.1-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9457e3c3291024866222b96108ab2d8265b477e5b1534c7ddb1810904858d16", size = 10518811, upload-time = "2026-02-12T23:09:31.865Z" }, - { url = 
"https://files.pythonhosted.org/packages/c3/1a/f5707440e5ae43ffa5365cac8bbb91e9665f4a883f560893829cf16a606b/ruff-0.15.1-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:92c92b003e9d4f7fbd33b1867bb15a1b785b1735069108dfc23821ba045b29bc", size = 10196169, upload-time = "2026-02-12T23:09:17.306Z" }, - { url = "https://files.pythonhosted.org/packages/2a/ff/26ddc8c4da04c8fd3ee65a89c9fb99eaa5c30394269d424461467be2271f/ruff-0.15.1-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1fe5c41ab43e3a06778844c586251eb5a510f67125427625f9eb2b9526535779", size = 10990491, upload-time = "2026-02-12T23:09:25.503Z" }, - { url = "https://files.pythonhosted.org/packages/fc/00/50920cb385b89413f7cdb4bb9bc8fc59c1b0f30028d8bccc294189a54955/ruff-0.15.1-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:66a6dd6df4d80dc382c6484f8ce1bcceb55c32e9f27a8b94c32f6c7331bf14fb", size = 11843280, upload-time = "2026-02-12T23:09:19.88Z" }, - { url = "https://files.pythonhosted.org/packages/5d/6d/2f5cad8380caf5632a15460c323ae326f1e1a2b5b90a6ee7519017a017ca/ruff-0.15.1-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a4a42cbb8af0bda9bcd7606b064d7c0bc311a88d141d02f78920be6acb5aa83", size = 11274336, upload-time = "2026-02-12T23:09:14.907Z" }, - { url = "https://files.pythonhosted.org/packages/a3/1d/5f56cae1d6c40b8a318513599b35ea4b075d7dc1cd1d04449578c29d1d75/ruff-0.15.1-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4ab064052c31dddada35079901592dfba2e05f5b1e43af3954aafcbc1096a5b2", size = 11137288, upload-time = "2026-02-12T23:09:07.475Z" }, - { url = "https://files.pythonhosted.org/packages/cd/20/6f8d7d8f768c93b0382b33b9306b3b999918816da46537d5a61635514635/ruff-0.15.1-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:5631c940fe9fe91f817a4c2ea4e81f47bee3ca4aa646134a24374f3c19ad9454", size = 11070681, upload-time = "2026-02-12T23:08:55.43Z" }, - { url = 
"https://files.pythonhosted.org/packages/9a/67/d640ac76069f64cdea59dba02af2e00b1fa30e2103c7f8d049c0cff4cafd/ruff-0.15.1-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:68138a4ba184b4691ccdc39f7795c66b3c68160c586519e7e8444cf5a53e1b4c", size = 10486401, upload-time = "2026-02-12T23:09:27.927Z" }, - { url = "https://files.pythonhosted.org/packages/65/3d/e1429f64a3ff89297497916b88c32a5cc88eeca7e9c787072d0e7f1d3e1e/ruff-0.15.1-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:518f9af03bfc33c03bdb4cb63fabc935341bb7f54af500f92ac309ecfbba6330", size = 10197452, upload-time = "2026-02-12T23:09:12.147Z" }, - { url = "https://files.pythonhosted.org/packages/78/83/e2c3bade17dad63bf1e1c2ffaf11490603b760be149e1419b07049b36ef2/ruff-0.15.1-py3-none-musllinux_1_2_i686.whl", hash = "sha256:da79f4d6a826caaea95de0237a67e33b81e6ec2e25fc7e1993a4015dffca7c61", size = 10693900, upload-time = "2026-02-12T23:09:34.418Z" }, - { url = "https://files.pythonhosted.org/packages/a1/27/fdc0e11a813e6338e0706e8b39bb7a1d61ea5b36873b351acee7e524a72a/ruff-0.15.1-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3dd86dccb83cd7d4dcfac303ffc277e6048600dfc22e38158afa208e8bf94a1f", size = 11227302, upload-time = "2026-02-12T23:09:36.536Z" }, - { url = "https://files.pythonhosted.org/packages/f6/58/ac864a75067dcbd3b95be5ab4eb2b601d7fbc3d3d736a27e391a4f92a5c1/ruff-0.15.1-py3-none-win32.whl", hash = "sha256:660975d9cb49b5d5278b12b03bb9951d554543a90b74ed5d366b20e2c57c2098", size = 10462555, upload-time = "2026-02-12T23:09:29.899Z" }, - { url = "https://files.pythonhosted.org/packages/e0/5e/d4ccc8a27ecdb78116feac4935dfc39d1304536f4296168f91ed3ec00cd2/ruff-0.15.1-py3-none-win_amd64.whl", hash = "sha256:c820fef9dd5d4172a6570e5721704a96c6679b80cf7be41659ed439653f62336", size = 11599956, upload-time = "2026-02-12T23:09:01.157Z" }, - { url = "https://files.pythonhosted.org/packages/2a/07/5bda6a85b220c64c65686bc85bd0bbb23b29c62b3a9f9433fa55f17cda93/ruff-0.15.1-py3-none-win_arm64.whl", hash = 
"sha256:5ff7d5f0f88567850f45081fac8f4ec212be8d0b963e385c3f7d0d2eb4899416", size = 10874604, upload-time = "2026-02-12T23:09:05.515Z" }, +version = "0.15.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/da/31/d6e536cdebb6568ae75a7f00e4b4819ae0ad2640c3604c305a0428680b0c/ruff-0.15.4.tar.gz", hash = "sha256:3412195319e42d634470cc97aa9803d07e9d5c9223b99bcb1518f0c725f26ae1", size = 4569550, upload-time = "2026-02-26T20:04:14.959Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/f2/82/c11a03cfec3a4d26a0ea1e571f0f44be5993b923f905eeddfc397c13d360/ruff-0.15.4-py3-none-linux_armv6l.whl", hash = "sha256:a1810931c41606c686bae8b5b9a8072adac2f611bb433c0ba476acba17a332e0", size = 10453333, upload-time = "2026-02-26T20:04:20.093Z" }, + { url = "https://files.pythonhosted.org/packages/ce/5d/6a1f271f6e31dffb31855996493641edc3eef8077b883eaf007a2f1c2976/ruff-0.15.4-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:5a1632c66672b8b4d3e1d1782859e98d6e0b4e70829530666644286600a33992", size = 10853356, upload-time = "2026-02-26T20:04:05.808Z" }, + { url = "https://files.pythonhosted.org/packages/b1/d8/0fab9f8842b83b1a9c2bf81b85063f65e93fb512e60effa95b0be49bfc54/ruff-0.15.4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:a4386ba2cd6c0f4ff75252845906acc7c7c8e1ac567b7bc3d373686ac8c222ba", size = 10187434, upload-time = "2026-02-26T20:03:54.656Z" }, + { url = "https://files.pythonhosted.org/packages/85/cc/cc220fd9394eff5db8d94dec199eec56dd6c9f3651d8869d024867a91030/ruff-0.15.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2496488bdfd3732747558b6f95ae427ff066d1fcd054daf75f5a50674411e75", size = 10535456, upload-time = "2026-02-26T20:03:52.738Z" }, + { url = "https://files.pythonhosted.org/packages/fa/0f/bced38fa5cf24373ec767713c8e4cadc90247f3863605fb030e597878661/ruff-0.15.4-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = 
"sha256:3f1c4893841ff2d54cbda1b2860fa3260173df5ddd7b95d370186f8a5e66a4ac", size = 10287772, upload-time = "2026-02-26T20:04:08.138Z" }, + { url = "https://files.pythonhosted.org/packages/2b/90/58a1802d84fed15f8f281925b21ab3cecd813bde52a8ca033a4de8ab0e7a/ruff-0.15.4-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:820b8766bd65503b6c30aaa6331e8ef3a6e564f7999c844e9a547c40179e440a", size = 11049051, upload-time = "2026-02-26T20:04:03.53Z" }, + { url = "https://files.pythonhosted.org/packages/d2/ac/b7ad36703c35f3866584564dc15f12f91cb1a26a897dc2fd13d7cb3ae1af/ruff-0.15.4-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c9fb74bab47139c1751f900f857fa503987253c3ef89129b24ed375e72873e85", size = 11890494, upload-time = "2026-02-26T20:04:10.497Z" }, + { url = "https://files.pythonhosted.org/packages/93/3d/3eb2f47a39a8b0da99faf9c54d3eb24720add1e886a5309d4d1be73a6380/ruff-0.15.4-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f80c98765949c518142b3a50a5db89343aa90f2c2bf7799de9986498ae6176db", size = 11326221, upload-time = "2026-02-26T20:04:12.84Z" }, + { url = "https://files.pythonhosted.org/packages/ff/90/bf134f4c1e5243e62690e09d63c55df948a74084c8ac3e48a88468314da6/ruff-0.15.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:451a2e224151729b3b6c9ffb36aed9091b2996fe4bdbd11f47e27d8f2e8888ec", size = 11168459, upload-time = "2026-02-26T20:04:00.969Z" }, + { url = "https://files.pythonhosted.org/packages/b5/e5/a64d27688789b06b5d55162aafc32059bb8c989c61a5139a36e1368285eb/ruff-0.15.4-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:a8f157f2e583c513c4f5f896163a93198297371f34c04220daf40d133fdd4f7f", size = 11104366, upload-time = "2026-02-26T20:03:48.099Z" }, + { url = "https://files.pythonhosted.org/packages/f1/f6/32d1dcb66a2559763fc3027bdd65836cad9eb09d90f2ed6a63d8e9252b02/ruff-0.15.4-py3-none-musllinux_1_2_aarch64.whl", hash = 
"sha256:917cc68503357021f541e69b35361c99387cdbbf99bd0ea4aa6f28ca99ff5338", size = 10510887, upload-time = "2026-02-26T20:03:45.771Z" }, + { url = "https://files.pythonhosted.org/packages/ff/92/22d1ced50971c5b6433aed166fcef8c9343f567a94cf2b9d9089f6aa80fe/ruff-0.15.4-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e9737c8161da79fd7cfec19f1e35620375bd8b2a50c3e77fa3d2c16f574105cc", size = 10285939, upload-time = "2026-02-26T20:04:22.42Z" }, + { url = "https://files.pythonhosted.org/packages/e6/f4/7c20aec3143837641a02509a4668fb146a642fd1211846634edc17eb5563/ruff-0.15.4-py3-none-musllinux_1_2_i686.whl", hash = "sha256:291258c917539e18f6ba40482fe31d6f5ac023994ee11d7bdafd716f2aab8a68", size = 10765471, upload-time = "2026-02-26T20:03:58.924Z" }, + { url = "https://files.pythonhosted.org/packages/d0/09/6d2f7586f09a16120aebdff8f64d962d7c4348313c77ebb29c566cefc357/ruff-0.15.4-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3f83c45911da6f2cd5936c436cf86b9f09f09165f033a99dcf7477e34041cbc3", size = 11263382, upload-time = "2026-02-26T20:04:24.424Z" }, + { url = "https://files.pythonhosted.org/packages/1b/fa/2ef715a1cd329ef47c1a050e10dee91a9054b7ce2fcfdd6a06d139afb7ec/ruff-0.15.4-py3-none-win32.whl", hash = "sha256:65594a2d557d4ee9f02834fcdf0a28daa8b3b9f6cb2cb93846025a36db47ef22", size = 10506664, upload-time = "2026-02-26T20:03:50.56Z" }, + { url = "https://files.pythonhosted.org/packages/d0/a8/c688ef7e29983976820d18710f955751d9f4d4eb69df658af3d006e2ba3e/ruff-0.15.4-py3-none-win_amd64.whl", hash = "sha256:04196ad44f0df220c2ece5b0e959c2f37c777375ec744397d21d15b50a75264f", size = 11651048, upload-time = "2026-02-26T20:04:17.191Z" }, + { url = "https://files.pythonhosted.org/packages/3e/0a/9e1be9035b37448ce2e68c978f0591da94389ade5a5abafa4cf99985d1b2/ruff-0.15.4-py3-none-win_arm64.whl", hash = "sha256:60d5177e8cfc70e51b9c5fad936c634872a74209f934c1e79107d11787ad5453", size = 10966776, upload-time = "2026-02-26T20:03:56.908Z" }, ] [[package]] @@ -390,7 +513,6 @@ version = 
"0.52.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "anyio" }, - { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/c4/68/79977123bb7be889ad680d79a40f339082c1978b5cfcf62c2d8d196873ac/starlette-0.52.1.tar.gz", hash = "sha256:834edd1b0a23167694292e94f597773bc3f89f362be6effee198165a35d62933", size = 2653702, upload-time = "2026-01-18T13:34:11.062Z" } wheels = [ @@ -420,15 +542,15 @@ wheels = [ [[package]] name = "uvicorn" -version = "0.40.0" +version = "0.41.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, { name = "h11" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/c3/d1/8f3c683c9561a4e6689dd3b1d345c815f10f86acd044ee1fb9a4dcd0b8c5/uvicorn-0.40.0.tar.gz", hash = "sha256:839676675e87e73694518b5574fd0f24c9d97b46bea16df7b8c05ea1a51071ea", size = 81761, upload-time = "2025-12-21T14:16:22.45Z" } +sdist = { url = "https://files.pythonhosted.org/packages/32/ce/eeb58ae4ac36fe09e3842eb02e0eb676bf2c53ae062b98f1b2531673efdd/uvicorn-0.41.0.tar.gz", hash = "sha256:09d11cf7008da33113824ee5a1c6422d89fbc2ff476540d69a34c87fab8b571a", size = 82633, upload-time = "2026-02-16T23:07:24.1Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/3d/d8/2083a1daa7439a66f3a48589a57d576aa117726762618f6bb09fe3798796/uvicorn-0.40.0-py3-none-any.whl", hash = "sha256:c6c8f55bc8bf13eb6fa9ff87ad62308bbbc33d0b67f84293151efe87e0d5f2ee", size = 68502, upload-time = "2025-12-21T14:16:21.041Z" }, + { url = "https://files.pythonhosted.org/packages/83/e4/d04a086285c20886c0daad0e026f250869201013d18f81d9ff5eada73a88/uvicorn-0.41.0-py3-none-any.whl", hash = "sha256:29e35b1d2c36a04b9e180d4007ede3bcb32a85fbdfd6c6aeb3f26839de088187", size = 68783, upload-time = "2026-02-16T23:07:22.357Z" }, ] [[package]] @@ -437,12 +559,12 @@ version = "0.22.1" source = { registry = "https://pypi.org/simple" } sdist = { url = 
"https://files.pythonhosted.org/packages/06/f0/18d39dbd1971d6d62c4629cc7fa67f74821b0dc1f5a77af43719de7936a7/uvloop-0.22.1.tar.gz", hash = "sha256:6c84bae345b9147082b17371e3dd5d42775bddce91f885499017f4607fdaf39f", size = 2443250, upload-time = "2025-10-16T22:17:19.342Z" } wheels = [ - { url = "https://files.pythonhosted.org/packages/3d/ff/7f72e8170be527b4977b033239a83a68d5c881cc4775fca255c677f7ac5d/uvloop-0.22.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:fe94b4564e865d968414598eea1a6de60adba0c040ba4ed05ac1300de402cd42", size = 1359936, upload-time = "2025-10-16T22:16:29.436Z" }, - { url = "https://files.pythonhosted.org/packages/c3/c6/e5d433f88fd54d81ef4be58b2b7b0cea13c442454a1db703a1eea0db1a59/uvloop-0.22.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:51eb9bd88391483410daad430813d982010f9c9c89512321f5b60e2cddbdddd6", size = 752769, upload-time = "2025-10-16T22:16:30.493Z" }, - { url = "https://files.pythonhosted.org/packages/24/68/a6ac446820273e71aa762fa21cdcc09861edd3536ff47c5cd3b7afb10eeb/uvloop-0.22.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:700e674a166ca5778255e0e1dc4e9d79ab2acc57b9171b79e65feba7184b3370", size = 4317413, upload-time = "2025-10-16T22:16:31.644Z" }, - { url = "https://files.pythonhosted.org/packages/5f/6f/e62b4dfc7ad6518e7eff2516f680d02a0f6eb62c0c212e152ca708a0085e/uvloop-0.22.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7b5b1ac819a3f946d3b2ee07f09149578ae76066d70b44df3fa990add49a82e4", size = 4426307, upload-time = "2025-10-16T22:16:32.917Z" }, - { url = "https://files.pythonhosted.org/packages/90/60/97362554ac21e20e81bcef1150cb2a7e4ffdaf8ea1e5b2e8bf7a053caa18/uvloop-0.22.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e047cc068570bac9866237739607d1313b9253c3051ad84738cbb095be0537b2", size = 4131970, upload-time = "2025-10-16T22:16:34.015Z" }, - { url = 
"https://files.pythonhosted.org/packages/99/39/6b3f7d234ba3964c428a6e40006340f53ba37993f46ed6e111c6e9141d18/uvloop-0.22.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:512fec6815e2dd45161054592441ef76c830eddaad55c8aa30952e6fe1ed07c0", size = 4296343, upload-time = "2025-10-16T22:16:35.149Z" }, + { url = "https://files.pythonhosted.org/packages/89/8c/182a2a593195bfd39842ea68ebc084e20c850806117213f5a299dfc513d9/uvloop-0.22.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:561577354eb94200d75aca23fbde86ee11be36b00e52a4eaf8f50fb0c86b7705", size = 1358611, upload-time = "2025-10-16T22:16:36.833Z" }, + { url = "https://files.pythonhosted.org/packages/d2/14/e301ee96a6dc95224b6f1162cd3312f6d1217be3907b79173b06785f2fe7/uvloop-0.22.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1cdf5192ab3e674ca26da2eada35b288d2fa49fdd0f357a19f0e7c4e7d5077c8", size = 751811, upload-time = "2025-10-16T22:16:38.275Z" }, + { url = "https://files.pythonhosted.org/packages/b7/02/654426ce265ac19e2980bfd9ea6590ca96a56f10c76e63801a2df01c0486/uvloop-0.22.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e2ea3d6190a2968f4a14a23019d3b16870dd2190cd69c8180f7c632d21de68d", size = 4288562, upload-time = "2025-10-16T22:16:39.375Z" }, + { url = "https://files.pythonhosted.org/packages/15/c0/0be24758891ef825f2065cd5db8741aaddabe3e248ee6acc5e8a80f04005/uvloop-0.22.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0530a5fbad9c9e4ee3f2b33b148c6a64d47bbad8000ea63704fa8260f4cf728e", size = 4366890, upload-time = "2025-10-16T22:16:40.547Z" }, + { url = "https://files.pythonhosted.org/packages/d2/53/8369e5219a5855869bcee5f4d317f6da0e2c669aecf0ef7d371e3d084449/uvloop-0.22.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bc5ef13bbc10b5335792360623cc378d52d7e62c2de64660616478c32cd0598e", size = 4119472, upload-time = "2025-10-16T22:16:41.694Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/ba/d69adbe699b768f6b29a5eec7b47dd610bd17a69de51b251126a801369ea/uvloop-0.22.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1f38ec5e3f18c8a10ded09742f7fb8de0108796eb673f30ce7762ce1b8550cad", size = 4239051, upload-time = "2025-10-16T22:16:43.224Z" }, ] [[package]]