From f14b2fd41fbb4ff5f63e5768ae051d0fd25fa084 Mon Sep 17 00:00:00 2001
From: Automaker
Date: Sun, 19 Apr 2026 14:36:08 -0700
Subject: [PATCH 1/4] chore: release v0.2.0
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

First tagged release. Contents of the community-improvements project:

M1 — Security Hardening (A2A bearer auth, audit redaction, origin verification)
M2 — Memory On By Default (session persistence + load-on-start)
M3 — Skill Loop (skill-v1 emission + SQLite FTS5 index + curator)

Plus: .gitignore cleanup for .automaker-lock + .worktrees, and docs coverage
of the security layer, skill-loop architecture, and new env vars.

Manual bump because prepare-release.yml requires the GH_PAT secret (not
configured).

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 pyproject.toml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pyproject.toml b/pyproject.toml
index e63ed8b..fc92a48 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "protoagent"
-version = "0.1.0"
+version = "0.2.0"
 description = "protoAgent — LangGraph + A2A template for spawning protoLabs agents"
 requires-python = ">=3.11"

From e7a56208dd68eccf124d5ed2d06d6def5f5d1878 Mon Sep 17 00:00:00 2001
From: Ava
Date: Sun, 19 Apr 2026 19:02:33 -0700
Subject: [PATCH 2/4] chore: release v0.2.1

Bug fixes from v0.2.0 smoke testing:

- Agent card now advertises the bearer scheme when A2A_AUTH_TOKEN is set
- Session memory persistence actually fires (moved from the unreachable
  on_session_end hook to after_agent)
- Test suite collects cleanly in a fresh Docker env
- MemoryMiddleware activates standalone (without knowledge_store)

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 pyproject.toml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pyproject.toml b/pyproject.toml
index c649139..39bdee2 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "protoagent"
-version = "0.2.0"
+version = "0.2.1"
 description = "protoAgent — LangGraph + A2A template for spawning protoLabs agents"
 requires-python = ">=3.11"

From f1dcd3f368bd226a326dd18e9f9b82a084af77bd Mon Sep 17 00:00:00 2001
From: Josh Mabry <31560031+mabry1985@users.noreply.github.com>
Date: Tue, 21 Apr 2026 20:55:50 -0700
Subject: [PATCH 3/4] chore(ci): update repo homepage after docs deploy (#149)

Writes the deployed GitHub Pages URL back to the repo's `homepage` field
so it renders in the About sidebar on the repo page.

Co-authored-by: Automaker
Co-authored-by: Claude Opus 4.7 (1M context)
---
 .github/workflows/docs.yml | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index a47454c..c619d94 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -10,6 +10,7 @@ permissions:
   contents: read
   pages: write
   id-token: write
+  administration: write
 
 concurrency:
   group: pages
@@ -39,3 +40,7 @@ jobs:
     steps:
       - id: deployment
         uses: actions/deploy-pages@v4
+      - name: Update repo homepage
+        run: gh api -X PATCH repos/${{ github.repository }} -f homepage="${{ steps.deployment.outputs.page_url }}"
+        env:
+          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

From 8e2e34a692ffce2e54f3484340b45fb70b1e9752 Mon Sep 17 00:00:00 2001
From: Automaker
Date: Wed, 22 Apr 2026 19:35:48 -0700
Subject: [PATCH 4/4] fix(llm): override OpenAI SDK User-Agent to bypass Cloudflare WAF
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Cloudflare's managed WAF on the proto-labs.ai zone returns 403 "Your
request was blocked" for any request whose User-Agent starts with
`OpenAI/Python` or `AsyncOpenAI/Python` — which is exactly what
langchain_openai.ChatOpenAI sends by default via the bundled OpenAI SDK.
/v1/models succeeded (different SDK path / UA) while /v1/chat/completions
failed, making the break look like a key/ACL issue when it was actually a
header signature match.

Reproduction (before fix):

    curl -H 'User-Agent: OpenAI/Python 1.54.0' -H 'Authorization: Bearer ' \
      https://api.proto-labs.ai/v1/chat/completions -d '{...}'
    -> HTTP 403 "Your request was blocked."

The same call with User-Agent: curl/*, python-httpx/*, or any non-OpenAI
string returns 200.

`tools/lg_tools.py:226` already sets a protoAgent UA for outbound HTTP
fetches — reuse that identifier here so every egress presents a
consistent, allowlisted UA.

Alternative fixes considered:

- A Cloudflare Custom WAF Skip rule on the hostname: cleaner at the edge,
  but it requires a zone-scoped token and couples agent operability to
  infra config.
- Stripping the UA header at cloudflared: not possible; the WAF fires
  before the tunnel sees the request.

The in-client override is the most portable fix: self-hosters on a
different edge keep working, and operators behind Cloudflare stop getting
403s.
---
 graph/llm.py | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/graph/llm.py b/graph/llm.py
index 70f3fe0..f364cd8 100644
--- a/graph/llm.py
+++ b/graph/llm.py
@@ -32,4 +32,14 @@ def create_llm(config: LangGraphConfig) -> ChatOpenAI:
         # AIMessageChunks with usage_metadata=None and we can't emit
         # the cost-v1 DataPart on the terminal artifact.
         stream_usage=True,
+        # Cloudflare's managed WAF blocks the OpenAI SDK's default
+        # `OpenAI/Python ` User-Agent (observed 403 "Your request
+        # was blocked" against api.proto-labs.ai). Override with the
+        # same identifier `tools/lg_tools.py` uses for outbound fetches
+        # so every protoAgent egress presents a consistent, allowlisted
+        # UA. If you self-host behind a different edge, this is safe to
+        # keep.
+        default_headers={
+            "User-Agent": "protoAgent/0.1 (+https://github.com/protoLabsAI/protoAgent)",
+        },
     )
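The SQLite FTS5 skill index mentioned under M3 in PATCH 1/4 can be sketched in a few lines. The `skills` schema and the sample row are illustrative, not the project's actual skill-v1 format, and an SQLite build with the FTS5 extension is assumed (standard in most modern builds):

```python
import sqlite3

# Illustrative FTS5 skill index, not protoAgent's real schema.
# FTS5 virtual tables give full-text MATCH queries plus bm25() ranking.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE skills USING fts5(name, description)")
conn.execute(
    "INSERT INTO skills VALUES (?, ?)",
    ("waf-ua-override",
     "Override the OpenAI SDK User-Agent to bypass edge WAF rules"),
)

# MATCH runs a full-text query; the default unicode61 tokenizer is
# case-insensitive for ASCII, so 'WAF' matches the description above.
rows = conn.execute(
    "SELECT name FROM skills WHERE skills MATCH ? ORDER BY bm25(skills)",
    ("WAF",),
).fetchall()
```

A curator process, as described in the release notes, would presumably insert emitted skill-v1 records into such a table and query it at retrieval time.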
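The v0.2.1 fix that moved session persistence from on_session_end to after_agent (PATCH 2/4) can be illustrated with a toy hook runner. The hook names mirror the commit message, but the middleware API and runner here are hypothetical, not protoAgent's real interfaces:

```python
# Toy model of the v0.2.1 persistence fix. Hook names follow the commit
# message (on_session_end, after_agent); everything else is hypothetical.
class MemoryMiddleware:
    def __init__(self) -> None:
        self.saved: dict[str, list[str]] = {}

    def after_agent(self, session_id: str, messages: list[str]) -> None:
        # Fires after every agent turn, so persistence actually happens.
        self.saved[session_id] = list(messages)

    def on_session_end(self, session_id: str, messages: list[str]) -> None:
        # In the v0.2.0 bug, persistence lived in a hook like this one
        # that the runtime never invoked, so nothing was ever saved.
        self.saved[session_id] = list(messages)


def run_turn(mw: MemoryMiddleware, session_id: str, messages: list[str]) -> None:
    # Minimal agent loop: do the turn's work, then fire after_agent.
    # Note that on_session_end is never called, mirroring the bug report.
    mw.after_agent(session_id, messages)
```

With persistence in `after_agent`, a single `run_turn` call is enough to populate `saved`; code reachable only from the never-invoked hook would leave it empty.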
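The User-Agent matching behavior described in PATCH 4/4 can be sketched as a simple predicate. The rule body below is inferred from the observed 403s, not from Cloudflare's actual (opaque) managed ruleset:

```python
# Model of the edge behavior inferred from the observed 403s: any
# User-Agent beginning with the OpenAI SDK's default prefixes is
# blocked, while curl/*, python-httpx/*, and the protoAgent UA pass.
# This captures the symptom only, not Cloudflare's real rule logic.
BLOCKED_UA_PREFIXES = ("OpenAI/Python", "AsyncOpenAI/Python")


def waf_blocks(user_agent: str) -> bool:
    """Return True if the edge would answer 403 for this User-Agent."""
    return user_agent.startswith(BLOCKED_UA_PREFIXES)
```

This also explains why /v1/models succeeded while /v1/chat/completions failed: only the SDK code path sending the `OpenAI/Python` prefix tripped the rule, so the overridden `protoAgent/0.1 (+https://github.com/protoLabsAI/protoAgent)` header clears it.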