
Conversation


@92jpascal 92jpascal commented Oct 13, 2025

User description

…move "removal 1" comments


PR Type

Enhancement


Description

  • Refactored streaming model output to run researcher and query suggester concurrently

  • Added concurrency utilities for task management and token-based gating

  • Modified answer section rendering to display immediately without waiting for first token

  • Injected stream handling into search tool for better control


Diagram Walkthrough

flowchart LR
  A["Concurrency Utils"] --> B["Stage 1: Researcher"]
  A --> C["Stage 2: Query Suggester"]
  B --> D["Answer Section"]
  C --> D
  D --> E["Stream Output"]

File Walkthrough

Relevant files
Enhancement
concurrency.ts
Add concurrency control utilities for async operations     

lib/utils/concurrency.ts

  • Added runWithConcurrencyLimit function for parallel task execution
    with limits
  • Implemented withBudget for time-constrained async operations with
    abort support
  • Created tokenGate utility for threshold-based polling and execution
+63/-0   
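The clipped snippet later in this thread shows the shape of `runWithConcurrencyLimit`. As a rough, self-contained sketch (an illustrative re-implementation, not necessarily the PR's exact code):

```typescript
// Minimal sketch of a bounded-parallelism runner. A shared index counter lets
// a fixed pool of workers pull tasks until none remain; results keep the
// original task order via the captured index.
async function runWithConcurrencyLimit<T>(
  limit: number,
  tasks: Array<() => Promise<T>>
): Promise<PromiseSettledResult<T>[]> {
  if (limit <= 0) limit = 1
  const results: PromiseSettledResult<T>[] = new Array(tasks.length)
  let index = 0

  // Each worker claims the next unstarted task and records its outcome.
  async function worker(): Promise<void> {
    while (index < tasks.length) {
      const current = index++
      try {
        results[current] = { status: 'fulfilled', value: await tasks[current]() }
      } catch (reason) {
        results[current] = { status: 'rejected', reason }
      }
    }
  }

  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, () => worker())
  )
  return results
}
```

Like `Promise.allSettled`, this never rejects; each slot reports `fulfilled` or `rejected` individually.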
actions.tsx
Refactor to concurrent stage-based execution model             

app/actions.tsx

  • Replaced sequential while loop with concurrent stage execution using
    Promise.allSettled
  • Split researcher (stage1) and query suggester (stage2) into parallel
    async operations
  • Added abort controller for cancellation support
  • Removed tool output processing from main loop (marked for removal)
+21/-4   
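The staged pattern described above can be sketched as follows. The stage bodies are placeholders standing in for the real `researcher` and `querySuggestor` calls, so this is a shape illustration rather than the PR's actual code:

```typescript
// Two concurrent producers awaited with Promise.allSettled, so a stage-2
// failure never blocks the main answer.
async function runStages(): Promise<{ answer: string; stage2Failed: boolean }> {
  const abortController = new AbortController()
  let stage2Failed = false

  // Stage 1: expensive producer (researcher) — yields the main answer.
  const stage1 = (async () => 'answer')()

  // Stage 2: cheap producer (query suggester) — failures are logged, not fatal.
  const stage2 = (async () => {
    try {
      await Promise.resolve() // placeholder for querySuggestor(uiStream, messages)
    } catch (err) {
      stage2Failed = true
      console.warn('query suggestion stage failed:', err)
    }
  })()

  // allSettled: neither stage's rejection rejects the combined wait.
  await Promise.allSettled([stage1, stage2])
  abortController.abort() // cancel any stragglers once both stages settle
  return { answer: await stage1, stage2Failed }
}
```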
researcher.tsx
Optimize answer section rendering for immediate display   

lib/agents/researcher.tsx

  • Moved answer section append before stream processing to display
    immediately
  • Removed conditional rendering logic that waited for first text delta
  • Added streamText.done() call to properly close the stream
  • Removed spinner update and tool-result conditional UI updates
+11/-3   
search.tsx
Support stream injection in search tool                                   

lib/agents/tools/search.tsx

  • Added optional injectedStream parameter to search tool execute
    function
  • Modified to use injected stream or create new one as fallback
+10/-2   

Summary by CodeRabbit

  • New Features
    • Concurrent background research and suggestion generation for faster insights.
    • Unified live search result streaming for smoother updates.
  • UX Improvements
    • Answer section appears immediately and updates continuously.
    • Progress feedback retained during processing.
  • Performance
    • Parallel execution replaces blocking loops to reduce wait times.
    • Concurrency and time-budgeting improve responsiveness under load.
  • Stability
    • Improved error handling and cancellation readiness to avoid stalls.


vercel bot commented Oct 13, 2025

@92jpascal is attempting to deploy a commit to the QCX-MAIN Team on Vercel.

A member of the Team first needs to authorize it.

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@qodo-merge-pro
Contributor

PR Compliance Guide 🔍

Below is a summary of compliance checks for this PR:

Security Compliance
Unstoppable polling timer

Description: The polling loop in tokenGate uses setInterval without an abort/cancel mechanism, risking
unbounded background timers if the threshold is never reached, which can degrade
performance or leak timers.
concurrency.ts [52-60]

Referred Code
await new Promise<void>((resolve) => {
  const id = setInterval(async () => {
    if (start() >= threshold) {
      clearInterval(id);
      resolve();
    }
  }, 50);
});
await fn();
Unaborted timed-out task

Description: withBudget races task with a timeout but does not abort the task on timeout, which can
leave long-running operations executing in the background and potentially causing resource
exhaustion.
concurrency.ts [24-44]

Referred Code
export async function withBudget<T>(options: { maxMs: number; signal?: AbortSignal }, task: () => Promise<T>): Promise<T> {
  const { maxMs, signal } = options;
  let timeoutId: ReturnType<typeof setTimeout> | null = null;

  const timeoutPromise = new Promise<never>((_, reject) => {
    timeoutId = setTimeout(() => reject(new Error('budget_exceeded')), maxMs);
  });

  if (signal) {
    if (signal.aborted) throw new Error('aborted');
    signal.addEventListener('abort', () => {
      if (timeoutId) clearTimeout(timeoutId);
    }, { once: true });
  }

  try {
    return await Promise.race([task(), timeoutPromise]);
  } finally {
    if (timeoutId) clearTimeout(timeoutId);
  }
}
Ticket Compliance
🎫 No ticket provided
  • Create ticket/issue
Codebase Duplication Compliance
Codebase context is not defined

Follow the guide to enable codebase context checks.

Custom Compliance
No custom compliance provided

Follow the guide to enable custom compliance check.

Compliance status legend
🟢 - Fully Compliant
🟡 - Partially Compliant
🔴 - Not Compliant
⚪ - Requires Further Human Verification
🏷️ - Compliance label

@coderabbitai
Contributor

coderabbitai bot commented Oct 13, 2025

Walkthrough

Parallelizes answer generation by introducing two concurrent stages and awaiting both via Promise.allSettled. Updates researcher agent to accept a dynamic system prompt and adjust UI streaming behavior. Allows search tool to accept an injected stream. Adds new concurrency utilities for limiting tasks, budgeting with timeouts, and token gating.

Changes

Cohort / File(s) Summary of changes
Parallelized stages and control flow
app/actions.tsx
Replaces sequential loop with two concurrent producers (researcher and querySuggestor), awaited via Promise.allSettled; wraps stage2 in try/catch; prepares AbortController; defers subsequent API/length checks until after both stages complete.
Researcher agent signature and UI streaming
lib/agents/researcher.tsx
Adds dynamicSystemPrompt param; removes mcp param; appends answer section immediately; removes first-delta gating and conditional append logic; calls streamText.done() before return.
Search tool stream injection
lib/agents/tools/search.tsx
Updates execute signature to accept optional injectedStream; uses it when provided, otherwise creates a new stream; preserves existing control flow and result handling.
Concurrency utilities
lib/utils/concurrency.ts
Adds runWithConcurrencyLimit (bounded parallelism), withBudget (timeout with AbortSignal support), and tokenGate (polling gate) helpers; exports all three.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor U as User
  participant A as app/actions.tsx
  participant R as Researcher
  participant Q as QuerySuggestor
  participant S as Subsequent Logic

  U->>A: Trigger action
  note over A: Create AbortController<br/>Start two stages in parallel
  par Stage 1
    A->>R: researcher(dynamicSystemPrompt, streams, messages,...)
    R-->>A: Streamed text + done()
  and Stage 2
    A->>Q: querySuggestor(...)
    Q-->>A: Suggestions or error (caught)
  end
  A->>A: Promise.allSettled([R, Q])
  alt Both settled
    A->>S: Run follow-up (API outcomes, length checks)
    S-->>U: Final response
  else Abort/timeout
    A-->>U: Terminate/handle error
  end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

  • prompting & MCP #304 — Adjusts lib/agents/researcher.tsx public signature to add dynamicSystemPrompt and remove mcp, matching this PR’s API change.

Suggested reviewers

  • CharlieHelps

Poem

I thump my paws—two stages race,
Streams now dance in parallel space.
Researcher hums, suggestions chime,
Concurrency keeps perfect time.
With budgets set and tokens wait—
A hoppy merge, a snappy fate! 🐇✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning, 1 inconclusive)
Check name Status Explanation Resolution
Docstring Coverage ⚠️ Warning Docstring coverage is 14.29% which is insufficient. The required threshold is 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
Title Check ❓ Inconclusive The current title is truncated and too generic to clearly convey the main refactoring that parallelizes research and query stages and introduces concurrency utilities for smoother streaming. Please update the title to a concise, single sentence that highlights the core change, for example “Parallelize researcher and querySuggestor stages and add concurrency utilities for smooth streaming output.”
✅ Passed checks (1 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.


@qodo-merge-pro
Contributor

PR Code Suggestions ✨

Explore these optional code suggestions:

Category | Suggestion | Impact
Possible issue
Properly handle abort signal and leaks

Fix the withBudget function to correctly handle AbortSignal for cancellation and
prevent a memory leak by cleaning up the event listener.

lib/utils/concurrency.ts [24-44]

 export async function withBudget<T>(options: { maxMs: number; signal?: AbortSignal }, task: () => Promise<T>): Promise<T> {
   const { maxMs, signal } = options;
   let timeoutId: ReturnType<typeof setTimeout> | null = null;
 
   const timeoutPromise = new Promise<never>((_, reject) => {
     timeoutId = setTimeout(() => reject(new Error('budget_exceeded')), maxMs);
   });
 
+  const promises: Promise<any>[] = [task(), timeoutPromise];
+  let abortListener: (() => void) | null = null;
+
   if (signal) {
     if (signal.aborted) throw new Error('aborted');
-    signal.addEventListener('abort', () => {
-      if (timeoutId) clearTimeout(timeoutId);
-    }, { once: true });
+    const abortPromise = new Promise<never>((_, reject) => {
+      abortListener = () => reject(new Error('aborted'));
+      signal.addEventListener('abort', abortListener);
+    });
+    promises.push(abortPromise);
   }
 
   try {
-    return await Promise.race([task(), timeoutPromise]);
+    return await Promise.race(promises);
   } finally {
     if (timeoutId) clearTimeout(timeoutId);
+    if (signal && abortListener) {
+      signal.removeEventListener('abort', abortListener);
+    }
   }
 }
Suggestion importance[1-10]: 9


Why: The suggestion identifies a critical flaw where the AbortSignal is not properly handled, preventing cancellation, and also points out a memory leak, proposing a correct and robust fix.

High
High-level
Remove unused concurrency utility functions

The new lib/utils/concurrency.ts file is not used within the PR and should be
removed. The PR's concurrency is already handled by Promise.allSettled, making
the new utilities dead code.

Examples:

lib/utils/concurrency.ts [1-63]
export async function runWithConcurrencyLimit<T>(limit: number, tasks: Array<() => Promise<T>>): Promise<PromiseSettledResult<T>[]> {
  if (limit <= 0) limit = 1;
  const results: PromiseSettledResult<T>[] = [];
  let index = 0;

  async function worker() {
    while (index < tasks.length) {
      const current = index++;
      const task = tasks[current];
      try {

 ... (clipped 53 lines)

Solution Walkthrough:

Before:

// lib/utils/concurrency.ts (new file)
export async function runWithConcurrencyLimit(...) { ... }
export async function withBudget(...) { ... }
export async function tokenGate(...) { ... }

// app/actions.tsx
// ... no import from 'lib/utils/concurrency.ts'
const stage1 = (async () => { /* researcher */ })();
const stage2 = (async () => { /* querySuggestor */ })();
await Promise.allSettled([stage1, stage2]);

After:

// lib/utils/concurrency.ts (file removed)

// app/actions.tsx
// ... (no change here)
const stage1 = (async () => { /* researcher */ })();
const stage2 = (async () => { /* querySuggestor */ })();
await Promise.allSettled([stage1, stage2]);
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies that the new file lib/utils/concurrency.ts is entirely unused in the PR, representing dead code that should be removed.

Medium
General
Log errors in the catch block

Add error logging to the catch block for stage2 to aid in debugging, as it
currently ignores errors from querySuggestor silently.

app/actions.tsx [360-365]

 // Stage 2: cheap producer - related suggestions
 const stage2 = (async () => {
   try {
     await querySuggestor(uiStream, messages)
-  } catch {}
+  } catch (e) {
+    console.error('Query suggestion stage failed:', e);
+  }
 })()
Suggestion importance[1-10]: 5


Why: The suggestion correctly identifies that silently ignoring errors is poor practice and proposes adding logging, which improves the code's debuggability and maintainability.

Low


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (5)
app/actions.tsx (4)

370-389: Writer fallback should use the locally mutated messages (with tool calls/results), not aiState.get().messages.

Using aiState drops tool context (you removed in-loop AI state updates). Use the local messages (mutated by researcher) so writer has full context.

-      const modifiedMessages = aiState
-        .get()
-        .messages.map(msg =>
-          msg.role === 'tool'
-            ? {
-                ...msg,
-                role: 'assistant',
-                content: JSON.stringify(msg.content),
-                type: 'tool'
-              }
-            : msg
-        ) as CoreMessage[]
-      const latestMessages = modifiedMessages.slice(maxMessages * -1)
+      // Use the locally built conversation including tool calls/results from researcher
+      const latestMessages = (messages as CoreMessage[]).slice(maxMessages * -1)
       answer = await writer(
         currentSystemPrompt,
         uiStream,
         streamText,
         latestMessages
       )

390-391: Avoid double-ending the same stream.

researcher() already calls streamText.done(). This extra call is redundant and can cause no-op warnings.

-    } else {
-      streamText.done()
-    }
+    }

339-359: Persist tool outputs to AI state once (post-stage) to retain UI on reload/share.

You removed real-time tool appends. Without persisting tool results, Search/Retrieve/Video sections won’t reconstruct from saved state. After both stages settle, flush toolOutputs into aiState.

     await Promise.allSettled([stage1, stage2])
+
+    // Persist tool outputs so UI can be reconstructed from AI state later.
+    if (toolOutputs.length > 0) {
+      for (const output of toolOutputs) {
+        aiState.update({
+          ...aiState.get(),
+          messages: [
+            ...aiState.get().messages,
+            {
+              id: groupeId,
+              role: 'tool',
+              content: JSON.stringify(output.result),
+              name: output.toolName,
+              type: 'tool'
+            }
+          ]
+        })
+      }
+    }

Also applies to: 367-368


294-295: Finish isGenerating with a boolean.

Be consistent: set to false.

-      isGenerating.done()
+      isGenerating.done(false)
lib/agents/tools/search.tsx (1)

49-53: Fix missing closing quote and complete the stream on errors to avoid a hanging section.

-      fullResponse += `\nAn error occurred while searching for "${query}.`
+      fullResponse += `\nAn error occurred while searching for "${query}".`
       uiStream.update(
         <Card className="p-4 mt-2 text-sm">
           {`An error occurred while searching for "${query}".`}
         </Card>
       )
+      // Complete the injected or local stream so UI doesn't hang
+      try { streamResults.done(JSON.stringify({ error: true })) } catch {}
       return searchResult
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 24da001 and c37a430.

📒 Files selected for processing (4)
  • app/actions.tsx (3 hunks)
  • lib/agents/researcher.tsx (4 hunks)
  • lib/agents/tools/search.tsx (1 hunks)
  • lib/utils/concurrency.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
app/actions.tsx (1)
lib/agents/query-suggestor.tsx (1)
  • querySuggestor (8-50)
lib/agents/researcher.tsx (4)
lib/agents/writer.tsx (1)
  • writer (7-51)
lib/agents/inquire.tsx (1)
  • inquire (12-54)
lib/agents/tools/index.tsx (1)
  • ToolProps (7-11)
lib/agents/tools/geospatial.tsx (1)
  • uiStream (150-337)
🔇 Additional comments (1)
app/actions.tsx (1)

140-140: Fix Markdown link formatting (extra space breaks the link).


-      : `QCX-Terra is a model garden of pixel level precision geospatial foundational models for efficient land feature predictions from satellite imagery. Available for our Pro and Enterprise customers. [QCX Pricing] (https://www.queue.cx/#pricing)`;
+      : `QCX-Terra is a model garden of pixel level precision geospatial foundational models for efficient land feature predictions from satellite imagery. Available for our Pro and Enterprise customers. [QCX Pricing](https://www.queue.cx/#pricing)`;

) {
*/
// Split and stage producers
const abortController = new AbortController()

🧹 Nitpick | 🔵 Trivial

Remove or wire AbortController.

Declared but unused. Either pass its signal into a budget/timeout helper or remove it.

-    const abortController = new AbortController()
+    // If cancellation will be supported, pass AbortSignal into withBudget/agents; otherwise omit.
🤖 Prompt for AI Agents
In app/actions.tsx around line 325, an AbortController is declared but never
used; either remove the unused AbortController declaration or wire it into the
async operation that needs cancellation by passing abortController.signal to the
fetch/budget/timeout helper (or storing it to call abort() where appropriate).
If you keep it, thread the signal into the request/timeouter function and ensure
you handle AbortError in the catch; otherwise delete the AbortController
declaration to eliminate the unused variable.

Comment on lines +361 to +366
const stage2 = (async () => {
try {
await querySuggestor(uiStream, messages)
} catch {}
})()


🛠️ Refactor suggestion | 🟠 Major

Don’t run querySuggestor twice; capture stage2 result and reuse to avoid duplicate “Related” UI and extra API calls.

Stage2 already runs querySuggestor, but it’s called again later. Capture the stage2 result and only call again if stage2 failed.

-    const stage2 = (async () => {
-      try {
-        await querySuggestor(uiStream, messages)
-      } catch {}
-    })()
+    let relatedFromStage2: RelatedQueries | null = null
+    const stage2 = (async () => {
+      try {
+        relatedFromStage2 = await querySuggestor(uiStream, messages)
+      } catch (err) {
+        console.warn('querySuggestor failed (stage2):', err)
+      }
+    })()

     await Promise.allSettled([stage1, stage2])

-    if (!errorOccurred) {
-      const relatedQueries = await querySuggestor(uiStream, messages)
+    if (!errorOccurred) {
+      // If stage2 already produced related queries (and UI), reuse them; otherwise fetch once.
+      const relatedQueries = relatedFromStage2 ?? await querySuggestor(uiStream, messages)
       uiStream.append(
         <Section title="Follow-up">
           <FollowupPanel />
         </Section>
       )
@@
           {
             id: groupeId,
             role: 'assistant',
             content: JSON.stringify(relatedQueries),
             type: 'related'
           },

Also applies to: 393-427

🤖 Prompt for AI Agents
In app/actions.tsx around lines 361-366 (and similarly 393-427), the current
code immediately invokes an async IIFE that calls querySuggestor and then later
calls querySuggestor again, causing duplicate "Related" UI and extra API calls;
change the pattern to capture the stage2 promise (e.g., assign the invoked
promise to a variable) and reuse its result where querySuggestor is called
later: await the captured stage2 promise when you need its outcome, and only
call querySuggestor again as a fallback if that awaited promise rejected or
returned an explicit failure. Ensure you preserve existing try/catch semantics
around the initial invocation and the later consumption so duplicate requests
are avoided.

Comment on lines +362 to +365
try {
await querySuggestor(uiStream, messages)
} catch {}
})()

🧹 Nitpick | 🔵 Trivial

Don’t swallow stage2 errors silently.

Log at least a warning so failures aren’t invisible.

-      try {
-        await querySuggestor(uiStream, messages)
-      } catch {}
+      try {
+        await querySuggestor(uiStream, messages)
+      } catch (err) {
+        console.warn('querySuggestor failed:', err)
+      }
🤖 Prompt for AI Agents
In app/actions.tsx around lines 362 to 365, the catch block after await
querySuggestor(uiStream, messages) is silently swallowing errors; change it to
catch the error and log at least a warning with the error details (e.g.,
processLogger.warn or console.warn with a contextual message and the error) so
stage2 failures are visible for debugging; preserve existing control flow after
logging.

Comment on lines +83 to 87
// removal 1: // Remove the spinner
// removal 1: // uiStream.update(null)
// Append the answer section immediately to avoid gating on first token
uiStream.append(answerSection)


⚠️ Potential issue | 🟠 Major

Clear the spinner before appending the answer section to prevent a stuck spinner.

The spinner is set in app/actions.tsx via uiStream.update(). Call update(null) here to remove it once streaming begins.

-  // Append the answer section immediately to avoid gating on first token
-  uiStream.append(answerSection)
+  // Replace spinner with the answer section once streaming starts
+  uiStream.update(null)
+  uiStream.append(answerSection)
🤖 Prompt for AI Agents
In lib/agents/researcher.tsx around lines 83 to 87, the spinner set earlier via
uiStream.update(<Spinner />) is not being cleared before appending the answer
section which can leave the spinner visible; call uiStream.update(null)
immediately before uiStream.append(answerSection) to remove the spinner, then
append the answerSection so streaming begins without a stuck spinner.

Comment on lines +32 to +44
if (signal) {
if (signal.aborted) throw new Error('aborted');
signal.addEventListener('abort', () => {
if (timeoutId) clearTimeout(timeoutId);
}, { once: true });
}

try {
return await Promise.race([task(), timeoutPromise]);
} finally {
if (timeoutId) clearTimeout(timeoutId);
}
}

⚠️ Potential issue | 🔴 Critical

withBudget can hang on AbortSignal; reject on abort and clear timeout.

Currently abort clears the timeout but doesn’t reject, potentially hanging forever. Include an abortPromise in the race.

   if (signal) {
-    if (signal.aborted) throw new Error('aborted');
-    signal.addEventListener('abort', () => {
-      if (timeoutId) clearTimeout(timeoutId);
-    }, { once: true });
+    if (signal.aborted) throw new DOMException('Aborted', 'AbortError');
+    // Ensure we reject promptly on abort and clear timeout
+    // Note: listeners are one-shot
+    var abortPromise = new Promise<never>((_, reject) => {
+      signal.addEventListener(
+        'abort',
+        () => {
+          if (timeoutId) clearTimeout(timeoutId);
+          reject(new DOMException('Aborted', 'AbortError'));
+        },
+        { once: true }
+      )
+    })
   }
 
   try {
-    return await Promise.race([task(), timeoutPromise]);
+    // Include abortPromise in the race if provided
+    return await Promise.race(
+      typeof abortPromise !== 'undefined'
+        ? [task(), timeoutPromise, abortPromise]
+        : [task(), timeoutPromise]
+    );
   } finally {
     if (timeoutId) clearTimeout(timeoutId);
   }
🤖 Prompt for AI Agents
In lib/utils/concurrency.ts around lines 32 to 44, the current abort handler
only clears the timeout and doesn't reject, which can cause withBudget to hang;
add an abortPromise that rejects when signal.aborted (or on 'abort' event) and
include it in the Promise.race alongside task() and timeoutPromise, ensure the
abort listener also clears the timeout and rejects with an appropriate
AbortError, and finally keep the existing finally cleanup to clear the timeout
and remove the abort listener so no handlers leak.

Comment on lines +46 to +61
export async function tokenGate(start: () => number, threshold: number, fn: () => Promise<void>): Promise<void> {
if (start() >= threshold) {
await fn();
return;
}
// Poll lightly until threshold reached
await new Promise<void>((resolve) => {
const id = setInterval(async () => {
if (start() >= threshold) {
clearInterval(id);
resolve();
}
}, 50);
});
await fn();
}

🧹 Nitpick | 🔵 Trivial

Optional: tokenGate can wait indefinitely; consider max wait or external signal.

To avoid unbounded waits, support a max wait or AbortSignal.

-export async function tokenGate(start: () => number, threshold: number, fn: () => Promise<void>): Promise<void> {
+export async function tokenGate(start: () => number, threshold: number, fn: () => Promise<void>, options?: { maxMs?: number; signal?: AbortSignal }): Promise<void> {
   if (start() >= threshold) {
     await fn();
     return;
   }
   // Poll lightly until threshold reached
-  await new Promise<void>((resolve) => {
+  await new Promise<void>((resolve, reject) => {
     const id = setInterval(async () => {
       if (start() >= threshold) {
         clearInterval(id);
         resolve();
       }
-    }, 50);
+    }, 50);
+    if (options?.signal) {
+      options.signal.addEventListener('abort', () => { clearInterval(id); reject(new DOMException('Aborted', 'AbortError')) }, { once: true })
+    }
+    if (options?.maxMs) {
+      setTimeout(() => { clearInterval(id); reject(new Error('token_gate_timeout')) }, options.maxMs)
+    }
   });
   await fn();
 }
🤖 Prompt for AI Agents
In lib/utils/concurrency.ts around lines 46 to 61, tokenGate currently can block
forever waiting for the threshold; change its signature to accept an optional
maxWaitMs (number) and/or an AbortSignal, and update the wait logic to race the
interval check against a timeout and an abort signal: start the interval checker
as now, also start a timeout that clears the interval and rejects/returns if
maxWaitMs elapses (or treat as a resolved no-op depending on desired behavior),
and attach an abort listener that clears the interval and rejects immediately
when signalled; ensure all timers/listeners are cleaned up before calling fn or
returning so fn is only invoked when threshold reached and callers receive a
clear error/abort when waiting stops.
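A minimal sketch of the variant that prompt describes — same `start`/`threshold`/`fn` contract plus an optional `maxMs` and `AbortSignal`, with every timer and listener cleaned up on each exit path. The helper names (`cleanup`, `onAbort`) are illustrative, not from the PR:

```typescript
async function tokenGate(
  start: () => number,
  threshold: number,
  fn: () => Promise<void>,
  options?: { maxMs?: number; signal?: AbortSignal }
): Promise<void> {
  if (start() < threshold) {
    await new Promise<void>((resolve, reject) => {
      let intervalId: ReturnType<typeof setInterval>
      let timeoutId: ReturnType<typeof setTimeout> | undefined
      // Tear down all timers and listeners before settling, on every path.
      const cleanup = () => {
        clearInterval(intervalId)
        if (timeoutId !== undefined) clearTimeout(timeoutId)
        options?.signal?.removeEventListener('abort', onAbort)
      }
      function onAbort() {
        cleanup()
        reject(new DOMException('Aborted', 'AbortError'))
      }
      intervalId = setInterval(() => {
        if (start() >= threshold) {
          cleanup()
          resolve()
        }
      }, 50)
      if (options?.maxMs !== undefined) {
        timeoutId = setTimeout(() => {
          cleanup()
          reject(new Error('token_gate_timeout'))
        }, options.maxMs)
      }
      if (options?.signal) {
        options.signal.addEventListener('abort', onAbort, { once: true })
      }
    })
  }
  await fn()
}
```

Because `cleanup` runs before any settle, `fn` only executes once the threshold is reached, and callers receive a distinct `AbortError` or `token_gate_timeout` when waiting stops.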

@vercel
Copy link

vercel bot commented Oct 13, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| qcx | Ready | Ready (Preview) | Comment | Oct 13, 2025 10:41am |

@ngoiyaeric
Copy link
Collaborator

Thanks for the pull request, @92jpascal.
The follow-up chat input is now missing, and the stream is still not smooth.
I like the concurrency you've tried to implement; can you retain the order of the response -> related UI?
The application breaks for two or more subsequent inputs from Related.
Can you go over these recommendations, as well as @coderabbitai's, for a better pull request?

@coderabbitai
Copy link
Contributor

coderabbitai bot commented Oct 13, 2025

Tip

For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Examine the key files changed in this PR to understand the flow
echo "=== app/actions.tsx ==="
cat app/actions.tsx | head -200

echo -e "\n=== lib/agents/researcher.tsx ==="
cat lib/agents/researcher.tsx | head -200

echo -e "\n=== lib/agents/tools/search.tsx ==="
cat lib/agents/tools/search.tsx | head -150

Length of output: 15905


🏁 Script executed:

#!/bin/bash
# Continue examining app/actions.tsx to see the parallel execution and followup handling
echo "=== app/actions.tsx (continuation - lines 200-400) ==="
cat app/actions.tsx | sed -n '200,400p'

echo -e "\n=== lib/utils/concurrency.ts ==="
cat lib/utils/concurrency.ts

echo -e "\n=== Look for FollowupPanel usage ==="
rg -n "FollowupPanel" --type=tsx --type=ts -C3

Length of output: 7049


🏁 Script executed:

#!/bin/bash
# Look for querySuggestor to understand what it does
echo "=== lib/agents/querySuggestor.tsx or .ts ==="
fd querySuggestor --exec cat {}

echo -e "\n=== Check for SearchRelated component usage ==="
rg -n "SearchRelated" app/actions.tsx -A5 -B5

echo -e "\n=== Look at the end of app/actions.tsx to see how aiState is updated ==="
cat app/actions.tsx | tail -150

Length of output: 6235


🏁 Script executed:

#!/bin/bash
# Look at more of the app/actions.tsx to see how aiState is updated after stages
echo "=== app/actions.tsx (lines 400-550) ==="
cat app/actions.tsx | sed -n '400,550p'

echo -e "\n=== Find querySuggestor in lib/agents directory ==="
fd -e ts -e tsx . lib/agents | xargs grep -l "querySuggestor" | head -5

echo -e "\n=== Find index.ts in lib/agents ==="
cat lib/agents/index.ts 2>/dev/null || cat lib/agents/index.tsx 2>/dev/null || echo "index file not found"

Length of output: 3627


🏁 Script executed:

#!/bin/bash
# Examine the query-suggestor implementation
echo "=== lib/agents/query-suggestor.tsx ==="
cat lib/agents/query-suggestor.tsx

echo -e "\n=== Check the flow around line 380-430 in app/actions.tsx ==="
cat app/actions.tsx | sed -n '360,430p'

echo -e "\n=== Search for where relatedQueries is used ==="
rg -n "relatedQueries" app/actions.tsx -B2 -A2

Length of output: 5149


@ngoiyaeric, thank you for the detailed feedback! I've analyzed the PR and identified several critical issues that explain the problems you're experiencing. Let me break them down:

🔴 Critical Issues Found

1. Duplicate querySuggestor Calls

The most significant issue is that querySuggestor is being called twice:

Line 362-366 (stage2 - parallel):

const stage2 = (async () => {
  try {
    await querySuggestor(uiStream, messages)
  } catch {}
})()

Line 394 (after stages complete):

const relatedQueries = await querySuggestor(uiStream, messages)

This causes:

  • The "Related" section to be appended to the UI stream twice
  • State corruption when handling subsequent related query clicks
  • The application breaking on multiple related inputs

Fix: Remove the querySuggestor call from stage2, OR remove the second call on line 394.
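If removing one call site proves awkward, a generic memoizing wrapper also guarantees a single execution no matter which path wins the race. This is a hedged sketch; `once` is not part of the existing codebase:

```typescript
// Wrap an async producer so concurrent and repeated calls share one execution.
function once<T>(fn: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined
  return () => {
    if (!cached) cached = fn()
    return cached
  }
}

// Usage sketch: stage2 and the later await would resolve to the same call.
// const suggestOnce = once(() => querySuggestor(uiStream, messages))
```

Both callers then await the same promise, so the "Related" section is appended exactly once.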

2. Broken UI Ordering (Response → Related)

The parallel execution with Promise.allSettled([stage1, stage2]) doesn't guarantee order. Since both stages append to uiStream:

  • stage2 (querySuggestor) can complete before stage1 (researcher)
  • This causes "Related" section to appear before "Response" section
  • Breaks the expected flow: Response → Related → Follow-up

Fix: Execute stages sequentially instead of in parallel, or use a coordination mechanism to ensure UI ordering.
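One possible coordination mechanism: start both stages concurrently, but hold back the second stage's UI append until the first stage has settled. A minimal sketch, with names (`runOrdered`, `appendStage2Ui`) that are illustrative rather than from the PR:

```typescript
// Run both stages concurrently, but release stage-2 UI only after stage 1 settles.
async function runOrdered<A, B>(
  stage1: () => Promise<A>,
  stage2: () => Promise<B>,
  appendStage2Ui: (result: B) => void
): Promise<[A, B]> {
  const p1 = stage1()
  const p2 = stage2() // computes in parallel, UI deferred
  const r1 = await p1 // response section finishes streaming first
  const r2 = await p2
  appendStage2Ui(r2) // "Related" is appended only now, preserving order
  return [r1, r2]
}
```

This keeps the latency win of parallel computation while making the Response -> Related ordering deterministic.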

3. Missing Follow-up Chat Input

Looking at the flow after line 394:

const relatedQueries = await querySuggestor(uiStream, messages)
uiStream.append(
  <Section title="Follow-up">
    <FollowupPanel />
  </Section>
)

The FollowupPanel is being appended, but since querySuggestor was already called in stage2 and appended its own UI, the state management gets confused.

4. Non-Smooth Streaming

In lib/agents/researcher.tsx, the answer section is now appended immediately (line 61):

uiStream.append(answerSection)

This happens before any content is available from the model, which can cause:

  • Empty sections to flash on screen
  • Jarring UI updates as content streams in
  • Not achieving the "smooth stream" goal

5. Unused Concurrency Utilities

The new lib/utils/concurrency.ts file provides excellent utilities (runWithConcurrencyLimit, withBudget, tokenGate), but none of them are being used in the actual implementation. The code just uses plain Promise.allSettled.
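The PR doesn't surface `runWithConcurrencyLimit`'s exact signature here, but a limiter of that kind typically has the shape of the following sketch (the name `mapWithLimit` and its signature are assumptions, not the PR's actual API):

```typescript
// Generic concurrency limiter: process items with at most `limit` in flight,
// returning results in input order.
async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length)
  let next = 0
  async function runner(): Promise<void> {
    // Each runner pulls the next unclaimed index until items are exhausted.
    while (next < items.length) {
      const i = next++
      results[i] = await worker(items[i])
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, runner)
  )
  return results
}
```

Wiring the stages through a utility like this (instead of bare `Promise.allSettled`) would at least exercise the new module and centralize cancellation and limits in one place.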

💡 Recommended Fixes

Option A: Sequential Execution (Simplest)

// Stage 1: Get the main response first
const { fullResponse, hasError, toolResponses } = await researcher(
  currentSystemPrompt,
  uiStream,
  streamText,
  messages,
  useSpecificAPI
)
answer = fullResponse
toolOutputs = toolResponses
errorOccurred = hasError

// Handle writer if needed
if (useSpecificAPI && answer.length === 0) {
  // ... existing writer logic
} else {
  streamText.done()
}

// Stage 2: Only now get related queries (ensures order)
if (!errorOccurred) {
  const relatedQueries = await querySuggestor(uiStream, messages)

  // Stage 3: Append followup panel
  uiStream.append(
    <Section title="Follow-up">
      <FollowupPanel />
    </Section>
  )

  // Update AI state
  aiState.done({
    ...aiState.get(),
    messages: [
      ...aiState.get().messages,
      {
        id: groupeId,
        role: 'assistant',
        content: answer,
        type: 'response'
      },
      {
        id: groupeId,
        role: 'assistant',
        content: JSON.stringify(relatedQueries),
        type: 'related'
      },
      {
        id: groupeId,
        role: 'assistant',
        content: 'followup',
        type: 'followup'
      }
    ]
  })
}

Option B: Coordinated Parallel Execution

If you want to keep parallelism for performance, you need to:

  1. Let querySuggestor return data without appending to UI
  2. Collect results from both stages
  3. Append UI elements in the correct order after both complete
const stage1Promise = researcher(
  currentSystemPrompt,
  uiStream,
  streamText,
  messages,
  useSpecificAPI
)

// Start stage2 but don't let it append to UI yet
const stage2Promise = querySuggestor(uiStream, messages, { skipUIUpdate: true })

const [result1, result2] = await Promise.allSettled([stage1Promise, stage2Promise])

// Now append in order
if (result1.status === 'fulfilled') {
  // response UI already appended by researcher
  answer = result1.value.fullResponse
}

if (result2.status === 'fulfilled' && !errorOccurred) {
  // Manually append related queries in correct order
  const relatedQueries = result2.value
  uiStream.append(
    <Section title="Related" separator={true}>
      <SearchRelated relatedQueries={relatedQueries} />
    </Section>
  )

  uiStream.append(
    <Section title="Follow-up">
      <FollowupPanel />
    </Section>
  )
}

Fix for Researcher Smooth Streaming

In lib/agents/researcher.tsx, restore the gate on first token:

for await (const delta of result.fullStream) {
  switch (delta.type) {
    case 'text-delta':
      if (delta.textDelta) {
        // Gate on first token for smooth appearance
        if (fullResponse.length === 0 && delta.textDelta.length > 0) {
          uiStream.update(answerSection)
        }
        fullResponse += delta.textDelta
        streamText.update(fullResponse)
      }
      break
    // ...
  }
}

Would you like me to create detailed code examples for implementing any of these fixes?

@ngoiyaeric
Copy link
Collaborator

@coderabbitai can you generalize the methodology to enhance performance through concurrency across the entire codebase.
