
feat: Log failed LLM calls to PromptLayer before re-raising #296

Merged

adagradschool merged 4 commits into master from feat/error-tracking-on-llm-failure on Feb 24, 2026

Conversation

@adagradschool
Contributor

@adagradschool adagradschool commented Feb 23, 2026

Summary

When .run() throws on an LLM call, the SDK now catches the exception, logs it to PromptLayer with status=ERROR, error_type, and error_message, then re-raises. Also adds these fields to log_request() for manual callers. Error categorization is duck-typed — no provider imports in runtime code.
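The catch, log, re-raise flow described above can be sketched as follows. This is a minimal illustration of the pattern, not the SDK's actual code: the `run_fn`/`track_fn` parameters and the exact field names passed to tracking are assumptions based on the PR description.

```python
import logging

logger = logging.getLogger("promptlayer")


def run_with_error_tracking(run_fn, track_fn, **request_kwargs):
    """Call the LLM; on failure, log the error to tracking, then re-raise."""
    try:
        return run_fn(**request_kwargs)
    except Exception as exc:
        try:
            # Record the failed call with the error fields from the PR:
            # status=ERROR, error_type, error_message.
            track_fn(
                status="ERROR",
                error_type=type(exc).__name__,
                error_message=str(exc),
                **request_kwargs,
            )
        except Exception as track_exc:
            # A tracking failure must never mask the original LLM error,
            # so it is only surfaced at debug level.
            logger.debug("error tracking failed: %s", track_exc)
        raise
```

Note that the bare `raise` at the end preserves the original exception and its traceback even when the inner tracking call fails, which is the "tracking failure doesn't mask the original LLM exception" behavior tested below.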

Edge cases tested

  • Tracking failure doesn't mask the original LLM exception
  • Success path excludes error fields
  • status_code=402 quota detection
  • Substring matches ("quota", "timeout") gated to known provider modules only
  • Real OpenAI/Anthropic exception classes
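The duck-typed categorization covering the edge cases above might look roughly like this. The function name, the category strings, and the exact module gate are illustrative assumptions drawn from the PR description; only the 402-quota check and the provider-gated "quota"/"timeout" substring matching are taken from the text.

```python
def categorize_error(exc: Exception) -> str:
    """Duck-typed error categorization: inspect attributes, never import providers."""
    # A status_code of 402 on the exception object signals a quota/billing failure.
    if getattr(exc, "status_code", None) == 402:
        return "quota"
    # Substring matching is gated to exceptions defined in known provider
    # modules, so an arbitrary user exception whose message happens to
    # contain "timeout" is not miscategorized.
    module = type(exc).__module__ or ""
    if module.split(".")[0] in ("openai", "anthropic"):
        message = str(exc).lower()
        if "quota" in message:
            return "quota"
        if "timeout" in message:
            return "timeout"
    return "unknown"
```

Because the check reads `status_code` and `__module__` rather than doing `isinstance` checks against provider exception classes, the runtime code needs no `openai` or `anthropic` imports.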

Known concerns

  • Retry latency: Error-tracking in the except block uses track_request/atrack_request which have retry logic. If the PromptLayer API is down, this adds latency before the original exception propagates.
  • throw_on_error interaction: When throw_on_error=True, a failure in the error-tracking call itself (e.g. PromptLayer API rejects the payload) would raise a PromptLayerAPIError inside the inner try/except, which we suppress to re-raise the original LLM error. This is intentional — the LLM error always takes priority — but means tracking failures are only visible at logger.debug level regardless of throw_on_error.

🤖 Generated with Claude Code

Contributor

@hasaan21 hasaan21 left a comment


Update the package version in promptlayer/__init__.py and pyproject.toml

adagradschool and others added 4 commits on February 24, 2026 20:46 (Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>):

Records a real round-trip: template fetch (success) → OpenAI call
(401 auth failure) → track-request with error fields (success).
Verified against a local promptlayer-app server.

Cloudflare tracking cookies (__cf_bm, _cfuvid) were being recorded
in VCR cassettes. Added set-cookie/cookie to the VCR filter_headers and
stripped the existing cookies from the error-tracking cassette.
@adagradschool force-pushed the feat/error-tracking-on-llm-failure branch from c70b1ce to a8b0251 on February 24, 2026 15:16
@adagradschool merged commit fb5635b into master on Feb 24, 2026
5 checks passed
@adagradschool deleted the feat/error-tracking-on-llm-failure branch on February 24, 2026 15:27
