
Fix cursor disappearing and implement Ollama streaming in interactive mode #28

Merged
joone merged 5 commits into main from copilot/fix-cursor-disappearing-ollama-streaming
Feb 16, 2026
Conversation

Contributor

Copilot AI commented Feb 16, 2026

Interactive mode had two issues: the cursor disappeared after startup, and Ollama responses weren't streamed in real time.

Changes

Cursor visibility fix

  • Removed process.stdin.setRawMode(true) and ANSI cursor manipulation codes from CommandLinePrompt
  • Readline's default cursor handling is sufficient and doesn't cause the cursor to disappear
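A minimal sketch of the approach, assuming readline's default terminal handling (`createPrompt` and its parameters are illustrative names, not the actual `CommandLinePrompt` code):

```typescript
import * as readline from "node:readline";
import type { Readable, Writable } from "node:stream";

// Illustrative helper: build the interactive prompt without raw mode.
// Note what is absent: no process.stdin.setRawMode(true) and no ANSI
// show/hide-cursor writes ("\x1b[?25h" / "\x1b[?25l") -- readline's
// default terminal handling keeps the cursor visible on its own.
export function createPrompt(
  input: Readable = process.stdin,
  output: Writable = process.stdout
): readline.Interface {
  return readline.createInterface({ input, output, prompt: "> " });
}
```

Letting readline own the terminal avoids fighting its internal cursor state, which is what caused the disappearing cursor in the first place.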

Ollama streaming implementation

  • Implemented OllamaAPI.completionStream() using ollama-node's streamingGenerate callback API
  • Converts callback-based streaming to async iterator using queue pattern:
    • Chunks are queued immediately on arrival, and any pending consumer promise is resolved
    • Queuing before resolving prevents a race where a chunk arrives between the empty-queue check and the await
    • The queue is cleared on error to avoid yielding stale data
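The queue pattern above can be sketched generically (`streamFromCallback` and its signature are illustrative, not the actual `OllamaAPI.completionStream` code):

```typescript
// Convert a callback-based stream into an async iterator using a queue.
export async function* streamFromCallback<T>(
  start: (onChunk: (chunk: T) => void, onDone: () => void) => void
): AsyncGenerator<T> {
  const chunks: T[] = [];
  let done = false;
  let pendingResolve: (() => void) | null = null;

  const wake = () => {
    if (pendingResolve) {
      const resolve = pendingResolve;
      pendingResolve = null;
      resolve();
    }
  };

  start(
    (chunk) => {
      chunks.push(chunk); // queue first, then wake -- avoids the race
      wake();             // where a chunk arrives mid-handoff
    },
    () => {
      done = true;
      wake();
    }
  );

  while (true) {
    if (chunks.length > 0) {
      yield* chunks.splice(0); // drain everything queued so far
    } else if (done) {
      return;
    } else {
      await new Promise<void>((resolve) => (pendingResolve = resolve));
    }
  }
}
```

Pushing onto the queue before resolving the pending promise is the key ordering: the consumer always re-checks the queue after waking, so no chunk can be lost between the check and the await.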
```typescript
// Stream handler in loz.ts now includes ollama
if (api === "openai" || api === "github-copilot" || api === "ollama") {
  const stream = await this.llmAPI.completionStream(params);
  // ...
  if (api === "ollama") {
    for await (const data of stream) {
      const streamData = data.response || "";
      process.stdout.write(streamData);
    }
  }
}
```

The async iterator yields {response: chunk} objects compatible with existing streaming infrastructure used by OpenAI and GitHub Copilot handlers.
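As an illustration of that compatibility shape, raw text chunks can be wrapped in `{ response }` objects (`toResponseChunks` is a hypothetical name, not code from this PR):

```typescript
// Wrap raw text chunks in { response } objects so an Ollama stream
// matches the shape the OpenAI/Copilot handlers already consume.
type StreamChunk = { response: string };

export async function* toResponseChunks(
  raw: AsyncIterable<string>
): AsyncGenerator<StreamChunk> {
  for await (const text of raw) {
    yield { response: text };
  }
}
```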



Copilot AI and others added 4 commits February 16, 2026 09:59
- Remove process.stdin.setRawMode(true) to fix cursor disappearing
- Remove manual cursor manipulation with ANSI codes
- Implement Ollama completionStream using streamingGenerate API
- Add Ollama streaming handling in runCompletion method

Co-authored-by: joone <1979160+joone@users.noreply.github.com>
- Refactor completionStream to use async iterator pattern
- Yield chunks immediately as they arrive from callback
- Use queue and promise-based approach for true streaming
- Addresses code review feedback

Co-authored-by: joone <1979160+joone@users.noreply.github.com>
- Queue chunks before resolving to prevent race conditions
- Clear chunk queue on error to avoid stale data
- Use break instead of return to ensure newline is written
- Improve consistency in null handling across streaming implementations

Co-authored-by: joone <1979160+joone@users.noreply.github.com>
- Rename resolveNext to pendingResolve for clarity
- Use chunks.splice(0) instead of chunks.length = 0
- Address code review style suggestions

Co-authored-by: joone <1979160+joone@users.noreply.github.com>
Copilot AI changed the title [WIP] Fix cursor disappearing and add streaming for Ollama in interactive mode Fix cursor disappearing and implement Ollama streaming in interactive mode Feb 16, 2026
Copilot AI requested a review from joone February 16, 2026 10:04
@joone joone marked this pull request as ready for review February 16, 2026 10:04
@joone joone merged commit 69f9c57 into main Feb 16, 2026
0 of 3 checks passed
