feat: add Chartmetric proxy endpoint with credit deduction #318
sweetmantech wants to merge 3 commits into test from
Conversation
Implements POST/GET /api/chartmetric/[...path] that authenticates via validateAuthContext, deducts 1 credit per call, exchanges the server-side CHARTMETRIC_REFRESH_TOKEN for an access token, and forwards the request to the Chartmetric API. Includes full vitest test coverage (5 tests). Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Chartmetric costs $350/month flat. At 5 credits ($0.05/call), we break even at ~7,000 calls/month (~233/day). A typical research task (6-7 API calls) costs 30-35 credits ($0.30-0.35), which is fair pricing for the data value delivered. At 1 credit/call we would need 35,000 calls/month to break even — unrealistic at current scale. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
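The break-even arithmetic above can be sketched as a quick check. `breakEvenCalls` is a hypothetical helper, and the $0.01-per-credit rate is inferred from the "5 credits ($0.05/call)" figure quoted:

```typescript
// Back-of-envelope check of the pricing numbers above.
// DOLLARS_PER_CREDIT is inferred from "5 credits ($0.05/call)".
const MONTHLY_COST_USD = 350;
const DOLLARS_PER_CREDIT = 0.01;

function breakEvenCalls(creditsPerCall: number): number {
  // Calls needed per month for credit revenue to cover the flat fee.
  return Math.ceil(MONTHLY_COST_USD / (creditsPerCall * DOLLARS_PER_CREDIT));
}

// breakEvenCalls(5) -> 7000 calls/month (~233/day)
// breakEvenCalls(1) -> 35000 calls/month
```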
📝 Walkthrough

Adds a Chartmetric API proxy: a catch-all Next.js route forwards GET/POST to a proxy that authenticates callers, deducts credits, obtains a server-side Chartmetric token, and forwards requests to the Chartmetric API with CORS-enabled responses and dynamic runtime settings.
Sequence Diagram

```mermaid
sequenceDiagram
    actor Client
    participant RouteHandler as Next.js Route<br/>Handler
    participant ProxyFn as proxyChartmetricRequest
    participant AuthSys as validateAuthContext
    participant CreditSys as deductCredits
    participant TokenFn as getChartmetricToken
    participant ChartmetricAPI as Chartmetric API
    Client->>RouteHandler: GET/POST /api/chartmetric/...
    RouteHandler->>ProxyFn: forward request + path params
    ProxyFn->>AuthSys: validate caller
    alt auth fails
        AuthSys-->>ProxyFn: NextResponse (error)
        ProxyFn-->>Client: return auth error
    else auth succeeds
        AuthSys-->>ProxyFn: caller identity
        ProxyFn->>CreditSys: deduct credits (5)
        alt insufficient credits
            CreditSys-->>ProxyFn: error (402)
            ProxyFn-->>Client: CORS 402 response
        else credits deducted
            CreditSys-->>ProxyFn: success
            ProxyFn->>TokenFn: request access token
            alt token error
                TokenFn-->>ProxyFn: error
                ProxyFn-->>Client: CORS 500 response
            else token received
                TokenFn-->>ProxyFn: access_token
                ProxyFn->>ChartmetricAPI: fetch(upstream URL, Bearer token, body?)
                ChartmetricAPI-->>ProxyFn: response JSON + status
                ProxyFn-->>Client: CORS response with upstream status
            end
        end
    end
```
Estimated Code Review Effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ❌ Failed checks (1 warning)
Actionable comments posted: 2
🧹 Nitpick comments (3)
app/api/chartmetric/[...path]/route.ts (1)
16-34: Consider adding an OPTIONS handler for CORS preflight.

Browser-based clients making cross-origin requests will send a preflight OPTIONS request. While the responses include CORS headers via `getCorsHeaders()`, there's no explicit OPTIONS handler to respond to preflight requests. If this API is intended for server-to-server use only, this is fine. But if browser clients will call it directly, an OPTIONS handler is needed.
♻️ Add OPTIONS handler if browser clients are expected
```typescript
import { getCorsHeaders } from "@/lib/networking/getCorsHeaders";
import { NextResponse } from "next/server";

/**
 * OPTIONS /api/chartmetric/[...path]
 *
 * Handles CORS preflight requests.
 */
export async function OPTIONS() {
  return new NextResponse(null, { status: 204, headers: getCorsHeaders() });
}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 34, add an OPTIONS handler to this route to respond to CORS preflight requests: export an async function OPTIONS() that returns a 204 NextResponse with headers from getCorsHeaders(), and import getCorsHeaders and NextResponse at the top; place it alongside the existing GET/POST handlers (which call proxyChartmetricRequest) so browser clients receive proper CORS preflight responses.

lib/chartmetric/proxyChartmetricRequest.ts (2)
92-97: Handle non-JSON upstream responses gracefully.

If Chartmetric returns a non-JSON response (e.g., an HTML error page during outages), `chartmetricResponse.json()` will throw. While the catch block handles this, the error message will be generic. Consider checking `Content-Type` or wrapping the JSON parse more explicitly.

♻️ Suggested improvement
```diff
 try {
   const chartmetricResponse = await fetch(chartmetricUrl, {
     method: request.method,
     headers: {
       Authorization: `Bearer ${accessToken}`,
       "Content-Type": "application/json",
     },
     ...(body ? { body } : {}),
   });
-  const data = await chartmetricResponse.json();
+  const contentType = chartmetricResponse.headers.get("content-type") ?? "";
+  if (!contentType.includes("application/json")) {
+    const text = await chartmetricResponse.text();
+    return NextResponse.json(
+      { status: "error", error: `Chartmetric returned non-JSON response: ${text.slice(0, 200)}` },
+      { status: 502, headers: getCorsHeaders() },
+    );
+  }
+  const data = await chartmetricResponse.json();
   return NextResponse.json(data, {
     status: chartmetricResponse.status,
     headers: getCorsHeaders(),
   });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 92 - 97, The code currently calls chartmetricResponse.json() which will throw on non-JSON responses; update proxyChartmetricRequest to first inspect chartmetricResponse.headers.get('content-type') (or attempt a safe JSON parse) and only call .json() when content-type includes 'application/json'; otherwise call chartmetricResponse.text() and return that body (or a structured fallback error) via NextResponse.json/NextResponse.text with the original chartmetricResponse.status and getCorsHeaders(); ensure chartmetricResponse and getCorsHeaders are the referenced symbols used and keep existing status/headers behavior for all branches.
38-46: String-based error detection is fragile coupling.

The check `message.toLowerCase().includes("insufficient credits")` couples this code to the exact wording of `deductCredits`'s error message. If that message changes, the 402 response breaks silently and becomes a 500. Consider having `deductCredits` throw a typed error (or return a discriminated result) so consumers can reliably detect insufficient credits without string parsing.

♻️ Example: Use a custom error class
In `lib/credits/deductCredits.ts`:

```typescript
export class InsufficientCreditsError extends Error {
  constructor(required: number, available: number) {
    super(`Insufficient credits. Required: ${required}, Available: ${available}`);
    this.name = "InsufficientCreditsError";
  }
}
```

Then in `proxyChartmetricRequest.ts`:

```diff
-} catch (err) {
-  const message = err instanceof Error ? err.message : String(err);
-
-  if (message.toLowerCase().includes("insufficient credits")) {
+} catch (err) {
+  if (err instanceof InsufficientCreditsError) {
     return NextResponse.json(
       { status: "error", error: "Insufficient credits for Chartmetric API call" },
       { status: 402, headers: getCorsHeaders() },
     );
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 38 - 46, The catch block in proxyChartmetricRequest.ts currently detects insufficient credits by string-parsing the error message which is fragile; update deductCredits to throw a dedicated error type (e.g., export class InsufficientCreditsError extends Error or return a discriminated result) and then change the catch in proxyChartmetricRequest (and any other consumers) to detect that case via instanceof InsufficientCreditsError (or by checking the discriminant) and return the 402 NextResponse.json with getCorsHeaders(); otherwise rethrow or handle other errors normally.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: ffa7e8e1-ca9c-433e-bbae-b30f6fc3e14a
⛔ Files ignored due to path filters (1)
`lib/chartmetric/__tests__/proxyChartmetricRequest.test.ts` is excluded by `!**/*.test.*`, `!**/__tests__/**` and included by `lib/**`
📒 Files selected for processing (3)
- `app/api/chartmetric/[...path]/route.ts`
- `lib/chartmetric/getChartmetricToken.ts`
- `lib/chartmetric/proxyChartmetricRequest.ts`
```typescript
/**
 * GET /api/chartmetric/[...path]
 *
 * Proxies GET requests to the Chartmetric API on behalf of an authenticated account.
 * Deducts 1 credit per call.
 *
 * @param request - Incoming API request.
 * @param context - Route context containing the Chartmetric path segments.
 * @param context.params - Route params with Chartmetric path segments.
 * @param context.params.path - Array of path segments to forward to Chartmetric.
 * @returns The Chartmetric API response.
 */
```
JSDoc says 1 credit but implementation deducts 5 credits.
The documentation states "Deducts 1 credit per call" (lines 8 and 24), but proxyChartmetricRequest actually deducts 5 credits. This inconsistency will mislead API consumers.
📝 Update JSDoc to reflect actual credit cost
```diff
 /**
  * GET /api/chartmetric/[...path]
  *
  * Proxies GET requests to the Chartmetric API on behalf of an authenticated account.
- * Deducts 1 credit per call.
+ * Deducts 5 credits per call.
  *
```

Apply the same fix to the POST handler's JSDoc (line 24).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@app/api/chartmetric/`[...path]/route.ts around lines 4 - 15, Update the JSDoc
comments for both the GET and POST handlers in route.ts to reflect the actual
credit cost (change "Deducts 1 credit per call" to "Deducts 5 credits per call")
so they match the implementation in proxyChartmetricRequest; locate the
top-of-file comment blocks above the GET handler and the POST handler and edit
the line mentioning credits to state 5 credits instead of 1.
```typescript
try {
  await deductCredits({ accountId, creditsToDeduct: 5 });
} catch (err) {
  const message = err instanceof Error ? err.message : String(err);

  if (message.toLowerCase().includes("insufficient credits")) {
    return NextResponse.json(
      { status: "error", error: "Insufficient credits for Chartmetric API call" },
      { status: 402, headers: getCorsHeaders() },
    );
  }

  return NextResponse.json(
    { status: "error", error: "Failed to deduct credits" },
    { status: 500, headers: getCorsHeaders() },
  );
}
```
Credits deducted before confirming upstream success.
Currently, credits are deducted prior to the Chartmetric API call. If the upstream request fails (network error, 5xx, rate limit, etc.), the caller loses 5 credits without receiving data. This could lead to user frustration.
Consider one of these approaches:
- Deduct after success: Move credit deduction after a successful upstream response (risk: user could abort before deduction).
- Refund on failure: If the upstream call fails, issue a credit refund.
- Accept as-is: Document this behavior clearly so users understand they're charged per attempt, not per successful response.
If the current behavior is intentional (e.g., to cover costs of failed calls), it's worth documenting in the JSDoc or user-facing API docs.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@lib/chartmetric/proxyChartmetricRequest.ts` around lines 36 - 52, Credits are
being deducted before the Chartmetric upstream call (deductCredits is called
pre-request) which charges users even if the upstream request fails; to fix,
move the deductCredits call to after a successful upstream response (i.e., call
deductCredits only when the Chartmetric request returns a 2xx and data is
returned) or, if you prefer to keep pre-charging, implement a refund path by
calling a refund function (e.g., refundCredits or addCredits with
creditsToRefund: 5) inside the catch/failed-response handling so failed
Chartmetric requests trigger a refund, and update the NextResponse.json error
branches to invoke that refund call before returning (refer to deductCredits,
the upstream Chartmetric request handler, and existing error response paths
using NextResponse.json and getCorsHeaders).
…patibility Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Actionable comments posted: 1
♻️ Duplicate comments (1)
app/api/chartmetric/[...path]/route.ts (1)
8-8: ⚠️ Potential issue | 🟡 Minor

JSDoc credit cost is inconsistent with actual deduction.
Line 8 and Line 25 say 1 credit, but the proxy implementation deducts 5 credits. Please align both handler docs with runtime behavior.
📝 Suggested doc fix
```diff
- * Deducts 1 credit per call.
+ * Deducts 5 credits per call.
```

Also applies to: 25-25
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/api/chartmetric/`[...path]/route.ts at line 8, The JSDoc in app/api/chartmetric/[...path]/route.ts incorrectly states "Deducts 1 credit per call" while the proxy implementation deducts 5 credits; update the handler's JSDoc comments (the top-of-file comment and the inline comment at line 25) to state "Deducts 5 credits per call" so the documentation matches the runtime behavior of the route handler in route.ts.
🧹 Nitpick comments (1)
app/api/chartmetric/[...path]/route.ts (1)
16-19: Extract shared GET/POST proxy flow into a single helper.

Line 16-19 and Line 33-36 duplicate the same params-await + proxy call path. A shared internal handler keeps this route simpler and easier to evolve.

As per coding guidelines: "`**/*.{ts,tsx}`: Extract shared logic into reusable utilities following Don't Repeat Yourself (DRY) principle."

Also applies to: 33-36
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 19, Extract the duplicated params-await + proxy call into a small internal helper (e.g., handleChartmetricProxy) and have both exported GET and POST call it; specifically, move the logic that awaits context.params and calls proxyChartmetricRequest(request, params) into a single function (referencing GET, POST, context.params, and proxyChartmetricRequest) and replace the duplicated bodies in GET and POST with a call to that helper to keep the route DRY and easier to evolve.
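The DRY refactor suggested above can be sketched with a handler factory. The types here are simplified stand-ins for `NextRequest` and `proxyChartmetricRequest` so the idea is self-contained; `makeChartmetricHandler` is a hypothetical name:

```typescript
// Simplified stand-in for the real proxy function's signature.
type ProxyFn<R> = (request: unknown, params: { path: string[] }) => Promise<R>;

function makeChartmetricHandler<R>(proxy: ProxyFn<R>) {
  return async (
    request: unknown,
    context: { params: Promise<{ path: string[] }> },
  ): Promise<R> => {
    const params = await context.params; // Next.js 15 delivers route params as a Promise
    return proxy(request, params);
  };
}

// In the route file, both methods would then share one body:
// export const GET = makeChartmetricHandler(proxyChartmetricRequest);
// export const POST = makeChartmetricHandler(proxyChartmetricRequest);
```

Because the factory closes over the proxy function, any future change (validation, logging, per-method credit costs) lands in one place instead of two.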
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 72e5ab29-cc6d-47bc-8be6-b19b89c7e43d
📒 Files selected for processing (1)
app/api/chartmetric/[...path]/route.ts
```typescript
export async function GET(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
  const params = await context.params;
  return proxyChartmetricRequest(request, params);
```
Add runtime Zod validation for route params before proxying.
Line 16 and Line 33 only use TypeScript typing; context.params is still unvalidated runtime input. Add a validate function (Zod + safeParse) and return 400 for invalid path payloads.
✅ Proposed validation pattern
```diff
-import { type NextRequest } from "next/server";
+import { type NextRequest, NextResponse } from "next/server";
+import { z } from "zod";
 import { proxyChartmetricRequest } from "@/lib/chartmetric/proxyChartmetricRequest";
+
+const chartmetricParamsSchema = z.object({
+  path: z.array(z.string().min(1)).min(1),
+});
+
+function validateChartmetricParams(input: unknown) {
+  const parsed = chartmetricParamsSchema.safeParse(input);
+  if (!parsed.success) {
+    return NextResponse.json(
+      { status: "error", error: "Invalid Chartmetric path" },
+      { status: 400 },
+    );
+  }
+  return parsed.data;
+}
@@
 export async function GET(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
-  const params = await context.params;
-  return proxyChartmetricRequest(request, params);
+  const validated = validateChartmetricParams(await context.params);
+  if (validated instanceof NextResponse) return validated;
+  return proxyChartmetricRequest(request, validated);
 }
@@
 export async function POST(request: NextRequest, context: { params: Promise<{ path: string[] }> }) {
-  const params = await context.params;
-  return proxyChartmetricRequest(request, params);
+  const validated = validateChartmetricParams(await context.params);
+  if (validated instanceof NextResponse) return validated;
+  return proxyChartmetricRequest(request, validated);
 }
```

As per coding guidelines: "All API endpoints should use a validate function for input parsing using Zod for schema validation."
Also applies to: 33-35
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@app/api/chartmetric/`[...path]/route.ts around lines 16 - 18, The route
handler GET is relying only on TypeScript types for context.params (a runtime
input) before calling proxyChartmetricRequest; add a Zod schema (e.g.,
pathSchema = z.object({ path: z.array(z.string()) })) and a validate helper that
uses schema.safeParse to validate await context.params, and if validation fails
return a NextResponse with status 400 and a descriptive error; once validated
pass the parsed value to proxyChartmetricRequest (repeat same validation pattern
for the other handler around lines 33-35) so all runtime route params are
validated before proxying.
Summary
- `GET/POST /api/chartmetric/[...path]` proxy that forwards requests to Chartmetric
- Authenticates via `validateAuthContext` (API key or Bearer token)
- Credit deduction via `deductCredits`
- Exchanges server-side `CHARTMETRIC_REFRESH_TOKEN` for a short-lived access token (key never exposed to clients)
- Implemented in `lib/chartmetric/getChartmetricToken.ts` and `lib/chartmetric/proxyChartmetricRequest.ts`

Test plan

- `pnpm test lib/chartmetric` — all 5 tests pass
- `CHARTMETRIC_REFRESH_TOKEN` env var set in Vercel/production

🤖 Generated with Claude Code
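The refresh-token exchange mentioned above can be sketched as follows. This is a hedged illustration, not the PR's `getChartmetricToken.ts`: the endpoint URL and the `refreshtoken`/`token` field names are assumptions based on Chartmetric's public token-exchange pattern, and `fetchFn` is injected only so the sketch is testable.

```typescript
// Hypothetical sketch of exchanging CHARTMETRIC_REFRESH_TOKEN for an access
// token. Endpoint path and field names are assumptions, not confirmed by the PR.
async function getChartmetricAccessToken(
  refreshToken: string,
  fetchFn: typeof fetch = fetch, // injectable for testing
): Promise<string> {
  const res = await fetchFn("https://api.chartmetric.com/api/token", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ refreshtoken: refreshToken }),
  });
  if (!res.ok) {
    throw new Error(`Chartmetric token exchange failed: ${res.status}`);
  }
  const data = (await res.json()) as { token: string };
  return data.token;
}
```

Keeping this exchange server-side is what lets the route deduct credits per call while never exposing the long-lived refresh token to clients.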