feat(dashboard): LLM unconfigured nudge banner + Ollama UX tweaks (#237)
Merged
Shows a dismissible info banner on Insights and Patterns pages when no LLM provider is configured. Uses existing useLlmConfig hook to check state server-side (no browser-side Ollama probing). Dismissal persisted in localStorage per-context so users aren't nagged on every visit. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Change "using" to "configured" and add a next-step hint pointing users to the dashboard Analyze button. Also updates README Ollama callout to use npx command and architecture diagram to mention Ollama alongside API keys. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Add `code-insights:` prefix to localStorage key to match codebase convention
- Add role="status" to banner outer div for screen reader accessibility

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Review Addressal

FIX NOW items addressed:

Pre-PR gate: All review items addressed. Ready for re-review or merge.
Triple-Layer Code Review — Final (Round 1 + Fixes)

Reviewers
Pre-Review Gates
Round 1 Issues Found & Resolution

🔴 FIX NOW (2 items — both resolved in fix commit)
❌ NOT APPLICABLE
🟡 SUGGESTIONS (non-blocking)
🔵 NOTES
Architecture Assessment
Final verdict: ✅ PASS — Ready for merge

🤖 Generated with Claude Code
What

Adds a dismissible `LlmNudgeBanner` on the Insights and Patterns pages that appears when no LLM provider is configured. Also carries forward deferred UX tweaks from PR #236.

Why
Users who open the dashboard without an LLM configured see empty or perpetually-loading AI sections with no explanation. This gives them a clear, non-blocking path to unlock AI features — with Ollama surfaced as the zero-friction local option.
How
- `dashboard/src/components/LlmNudgeBanner.tsx`: fully self-contained; uses the `useLlmConfig()` hook to check server-side config. Returns `null` while loading (prevents flash) and returns `null` if `provider` is set (won't show when configured). Dismissal stored in `localStorage` with key `llm-nudge-dismissed-{context}`.
- `ollama-detect.ts`: "using" → "configured"; added next-step hint pointing to the dashboard Analyze button.
- README: Ollama callout now uses `npx @code-insights/cli`; architecture diagram updated to `(API key or Ollama)`.

Schema Impact
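The banner's show/hide rules described above can be sketched as pure helpers. This is a minimal sketch, not the actual component: the function names are hypothetical, and the real `LlmNudgeBanner.tsx` wires this logic into React state and the `useLlmConfig()` hook. The key uses the `code-insights:` prefix added in the review fix commit.

```typescript
// Hypothetical pure helpers mirroring the banner's described behavior.

// localStorage key, namespaced per context (e.g. "insights" or "patterns").
// The "code-insights:" prefix matches the codebase convention noted in review.
function dismissalKey(context: string): string {
  return `code-insights:llm-nudge-dismissed-${context}`;
}

interface LlmConfig {
  provider?: string | null;
  model?: string | null;
}

// Render nothing while config is loading (prevents flash), when a provider
// is already configured, or when the user previously dismissed the banner.
function shouldShowBanner(
  loading: boolean,
  config: LlmConfig | undefined,
  dismissed: boolean
): boolean {
  if (loading) return false;
  if (config?.provider) return false;
  if (dismissed) return false;
  return true;
}
```

Keeping the decision logic separate from rendering like this is what makes the "returns `null` while loading" and "returns `null` if `provider` is set" guarantees easy to unit-test.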
Testing
- `pnpm build` — passes with zero errors from repo root
- `pnpm test` — 808 tests pass, zero failures
- `GET /api/config/llm` returns `{ provider, model, ... }` — the `llmConfig?.provider` check is correct
- Banner returns `null` when provider is set (verified with live server — gemini configured, banner suppressed)

Verification
Functional verification confirmed via live server:
- `GET /api/config/llm` returns `{"provider":"gemini",...}` — banner correctly suppressed
- When `provider` is null/undefined, banner renders with context-sensitive copy and a dismiss X button
- `pnpm build` + `pnpm test` both pass (pre-PR gate cleared)

Closes #235
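The "configured" check verified above boils down to a truthiness test on `provider` in the `/api/config/llm` response. A minimal sketch, assuming the response shape `{ provider, model, ... }` stated in the Testing section (the function name is hypothetical):

```typescript
// Assumed shape of the GET /api/config/llm response body.
interface LlmConfigResponse {
  provider?: string | null;
  model?: string | null;
}

// Mirrors the `llmConfig?.provider` optional-chaining check in the banner:
// the LLM counts as configured iff `provider` is a non-empty string.
function isLlmConfigured(res: LlmConfigResponse | undefined): boolean {
  return Boolean(res?.provider);
}
```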