
add AgentLoomBackend guarded by optional agentloom[contracts,observability] dependency #9

@cchinchilla-dev

Description


This issue positions AgentLoom as the recommended runtime for users who want resilience, observability, and budget enforcement. AgentAnvil must expose AgentLoomBackend as an LLMBackend implementation, but only when agentloom is installed. The portability invariant (#022) requires that pip install agentanvil with no extras still imports and runs, so the agentloom backend module must import cleanly and fail gracefully only at instantiation.

Three concrete requirements:

1. Optional dependency. agentanvil[agentloom] pins agentloom[contracts,observability]>=0.5.0. Without the extra, AgentLoomBackend raises a clear error on instantiation, not on import.

2. Trace-parsing via agentloom.contracts. Use the SpanAttr and MetricName enums from AgentLoom 0.5.0 to parse emitted OTel spans into AgentAnvil's canonical Trace (from #5). If agentloom.contracts does not ship Pydantic trace models (decision point D1), fall back to raw span-dict parsing.

3. Budget enforcement passthrough. Users set max_cost_usd on the backend; AgentLoom's budget tracker enforces it and AgentAnvil reports partial results if the budget is breached.
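The partial-results behavior in requirement 3 can be sketched in isolation. Everything below is illustrative: `PartialResult`, `run_with_budget`, and the per-step cost callback are hypothetical names, not AgentLoom or AgentAnvil API; the real enforcement lives in AgentLoom's budget tracker.

```python
from dataclasses import dataclass, field
from decimal import Decimal

@dataclass
class PartialResult:
    """Illustrative container for what 'partial results' could carry."""
    completed_steps: list = field(default_factory=list)
    budget_breached: bool = False
    cost_usd: Decimal = Decimal("0")

def run_with_budget(steps, cost_of, limit: Decimal) -> PartialResult:
    """Run steps until the next one would push cumulative cost past the limit."""
    result = PartialResult()
    for step in steps:
        cost = cost_of(step)
        if result.cost_usd + cost > limit:
            # Stop early and report what completed instead of raising.
            result.budget_breached = True
            break
        result.cost_usd += cost
        result.completed_steps.append(step)
    return result
```

The key design point for AgentAnvil is the last branch: a breach ends the run but still surfaces the completed prefix, rather than discarding work already paid for.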

Proposal

1. Module skeleton:

# src/agentanvil/backends/agentloom.py
from decimal import Decimal

# LLMBackend and LLMResponse come from the backend ABC introduced in #2.
try:
    from agentloom import ProviderGateway
    from agentloom.contracts import SpanAttr, MetricName
    _AGENTLOOM_AVAILABLE = True
except ImportError:
    _AGENTLOOM_AVAILABLE = False


class AgentLoomBackend(LLMBackend):
    name = "agentloom"

    def __init__(self, gateway: "ProviderGateway", *, max_cost_usd: Decimal | None = None):
        if not _AGENTLOOM_AVAILABLE:
            raise ImportError(
                "AgentLoomBackend requires: pip install 'agentanvil[agentloom]'"
            )
        self._gateway = gateway
        self._max_cost_usd = max_cost_usd

    async def complete(self, messages, *, model, **kwargs) -> LLMResponse:
        # Map AgentAnvil Message → AgentLoom's message format.
        # Call gateway.complete(...).
        # Parse resulting span via agentloom.contracts.SpanAttr.
        # Return AgentAnvil LLMResponse with usage, cost, latency, reasoning tokens.
        ...

2. Trace adapter:

# src/agentanvil/backends/_agentloom_adapter.py
def span_to_llm_response(span_attrs: dict) -> LLMResponse:
    """Parse AgentLoom OTel span attributes into AgentAnvil LLMResponse.

    Uses SpanAttr enum constants (stable even if the underlying attribute
    strings change in agentloom.contracts):
    - SpanAttr.GEN_AI_USAGE_INPUT_TOKENS
    - SpanAttr.GEN_AI_USAGE_OUTPUT_TOKENS
    - SpanAttr.GEN_AI_SYSTEM
    - SpanAttr.GEN_AI_REQUEST_MODEL
    - SpanAttr.STEP_COST_USD
    - SpanAttr.STEP_DURATION_MS
    - ...
    """

3. Tests under optional extras:

  • Base CI runs without agentloom: import of agentanvil.backends.agentloom module succeeds, instantiation raises ImportError.
  • Extra CI job installs agentanvil[agentloom]: full integration tests run.
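Both CI modes can hinge on one tiny runtime check. The helper below is a sketch; in a real suite its result would feed something like pytest.mark.skipif to gate the integration tests:

```python
import importlib.util

def agentloom_installed() -> bool:
    """True only when the optional agentloom extra is importable."""
    return importlib.util.find_spec("agentloom") is not None

# In tests/, roughly:
#   pytest.mark.skipif(not agentloom_installed(),
#                      reason="agentanvil[agentloom] extra not installed")
```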

Scope

  • src/agentanvil/backends/agentloom.py — new.
  • src/agentanvil/backends/_agentloom_adapter.py — new.
  • pyproject.toml — confirm agentloom = ["agentloom[contracts,observability]>=0.5.0"] extra is declared.
  • tests/backends/test_agentloom_import_without_extra.py — portability invariant.
  • tests/backends/test_agentloom_with_extra.py — integration (only run when extra installed).
  • .github/workflows/ci.yml — add agentloom-integration job with optional extra.

Regression tests

  • test_agentloom_backend_without_extra_raises_importerror_on_instantiation
  • test_agentloom_backend_module_imports_without_extra (portability invariant)
  • test_span_to_llm_response_parses_gen_ai_attrs
  • test_span_to_llm_response_handles_missing_reasoning_tokens
  • test_agentloom_backend_contract_passes_abc_suite (with extra installed)

Notes

  • Decision point D1: if agentloom.contracts ships Pydantic trace models in 0.5.0, the adapter consumes them directly; otherwise we parse span dicts. Either way, SpanAttr enum names are the stable surface.
  • Decision point D7 (the planning notes, closed 2026-04-22): agentloom.contracts ships with explicit .stable / .experimental split. The backend consumes Message, ToolDef, LLMResponse, Usage, RunContext from agentloom.contracts.stable (semver-guaranteed); trace span names and metric enums come from agentloom.contracts.experimental (may break in minor releases with programmatic warning). This mitigates D1 risk — even if Pydantic trace models shift, the adapter's stable imports remain bound only to the six core types in .stable.
  • Do not import AgentLoom at module top-level — always inside try/except ImportError.
  • Depends on: chore(meta): harden version-linearity script and ci pin #2 (LLMBackend ABC), AgentLoom 0.5.0 availability.
  • Blocks: #022 (portability invariant CI job must see AgentLoomBackend raising gracefully).

Metadata



    Labels

    agentloom: AgentLoom integration (optional extra)
    backends: LLM backend implementations (Direct, AgentLoom, Mock)
    enhancement: New feature or request
