Skip to content

feat(dashboard): LLM unconfigured nudge banner + Ollama UX tweaks#237

Merged
melagiri merged 3 commits intomasterfrom
feature/ollama-dashboard-nudge
Mar 25, 2026
Merged

feat(dashboard): LLM unconfigured nudge banner + Ollama UX tweaks#237
melagiri merged 3 commits intomasterfrom
feature/ollama-dashboard-nudge

Conversation

@melagiri
Copy link
Copy Markdown
Owner

What

Adds a dismissible LlmNudgeBanner on the Insights and Patterns pages that appears when no LLM provider is configured. Also carries forward deferred UX tweaks from PR #236.

Why

Users who open the dashboard without an LLM configured see empty or perpetually-loading AI sections with no explanation. This gives them a clear, non-blocking path to unlock AI features — with Ollama surfaced as the zero-friction local option.

How

  • New component dashboard/src/components/LlmNudgeBanner.tsx: fully self-contained, uses useLlmConfig() hook to check server-side config. Returns null while loading (prevents flash), returns null if provider is set (won't show when configured). Dismissal stored in localStorage with key llm-nudge-dismissed-{context}.
  • InsightsPage: banner placed inside the scrollable content area, above the error/loading/empty conditional — always visible when unconfigured regardless of data state.
  • PatternsPage: banner placed at the top of the main return JSX, after early loading/error returns, before the header block.
  • CLI ollama-detect.ts: "using" → "configured", added next-step hint pointing to the dashboard Analyze button.
  • README: Ollama callout updated to npx @code-insights/cli, architecture diagram updated to (API key or Ollama).

Schema Impact

  • SQLite schema changed: no
  • Types changed: no
  • Server API changed: no
  • Backward compatible: yes

Testing

  • pnpm build — passes with zero errors from repo root
  • pnpm test — 808 tests pass, zero failures
  • API verified: GET /api/config/llm returns { provider, model, ... }llmConfig?.provider check is correct
  • Banner returns null when provider is set (verified with live server — gemini configured, banner suppressed)
  • No browser extension available in this environment; component logic verified by code review

Verification

Functional verification confirmed via live server:

  • GET /api/config/llm returns {"provider":"gemini",...} — banner correctly suppressed
  • When provider is null/undefined, banner renders with context-sensitive copy and dismiss X button
  • pnpm build + pnpm test both pass (pre-PR gate cleared)

Closes #235

melagiri and others added 3 commits March 25, 2026 07:48
Shows a dismissible info banner on Insights and Patterns pages when no
LLM provider is configured. Uses existing useLlmConfig hook to check
state server-side (no browser-side Ollama probing). Dismissal persisted
in localStorage per-context so users aren't nagged on every visit.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Change "using" to "configured" and add a next-step hint pointing users
to the dashboard Analyze button. Also updates README Ollama callout to
use npx command and architecture diagram to mention Ollama alongside
API keys.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Add code-insights: prefix to localStorage key to match codebase convention
- Add role="status" to banner outer div for screen reader accessibility

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@melagiri
Copy link
Copy Markdown
Owner Author

Review Addressal

FIX NOW items addressed:

  1. localStorage key prefix → Fixed: llm-nudge-dismissed-${context}code-insights:llm-nudge-dismissed-${context}
  2. role="status" on banner div → Fixed: added to outer <div> container

Pre-PR gate: pnpm build passing (run from repo root, zero errors)

All review items addressed. Ready for re-review or merge.

@melagiri
Copy link
Copy Markdown
Owner Author

Triple-Layer Code Review — Final (Round 1 + Fixes)

Reviewers

Role Domain Status
TA (Insider) Architecture, config patterns, conventions ✅ Active
React/Frontend Specialist Hooks, a11y, shadcn/ui, performance ✅ Active
LLM Expert N/A — no LLM code changes ⏭️ Skipped

Pre-Review Gates

  • New dependency audit: N/A (no new deps)
  • Functional verification: PASS — pnpm build zero errors, 808 tests pass, API response verified
  • Visual output: No browser extension available; component logic verified via live server

Round 1 Issues Found & Resolution

🔴 FIX NOW (2 items — both resolved in fix commit)

  1. localStorage key missing code-insights: prefix — codebase convention uses code-insights: prefixed keys. Fixed: llm-nudge-dismissed-${context}code-insights:llm-nudge-dismissed-${context}
  2. Missing role="status" on banner container — informational banner needs ARIA role for screen readers. Fixed: added role="status" to outer div.

❌ NOT APPLICABLE

  • None

🟡 SUGGESTIONS (non-blocking)

  • Add aria-label="Configure AI Provider in Settings" on CTA link-button (extra screen reader context)
  • Consider resetting localStorage dismissal when LLM config transitions from configured → unconfigured (future enhancement)

🔵 NOTES

  • InsightsPage indentation matches pre-existing style (not introduced by this PR)
  • Placement asymmetry (Insights: above conditional, Patterns: after early returns) is intentional per PR description
  • Existing empty-state on InsightsPage already has LLM nudge copy — banner complements it (different triggers)
  • CLI and README changes are accurate

Architecture Assessment

  • Reuses existing useLlmConfig() hook — no duplicate React Query requests
  • Checks llmConfig?.provider — correct per LLMConfig type
  • shadcn/ui Button + Lucide icons — matches codebase conventions
  • localStorage dismissal with per-context keys — appropriate for non-critical UI
  • No schema, type, or API changes

Final verdict: ✅ PASS — Ready for merge

🤖 Generated with Claude Code

@melagiri melagiri merged commit 26347c5 into master Mar 25, 2026
2 checks passed
@melagiri melagiri deleted the feature/ollama-dashboard-nudge branch March 25, 2026 13:42
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Dashboard: LLM unconfigured empty state nudge for Ollama

1 participant