7 changes: 7 additions & 0 deletions data/dailySnapshots/2026-03-21.json
@@ -0,0 +1,7 @@
{
"date": "2026-03-21",
"indexScore": 50,
"delta": 0,
"topKeywords": [],
"emergingConcerns": []
Comment on lines +2 to +6
Copilot AI Mar 21, 2026

This appears to be a daily, date-stamped output artifact from IntelligenceIndex.saveSnapshot(). Checking in per-day snapshots will grow the repo over time and cause frequent merge conflicts/stale data. Consider removing this file from the PR and adding data/dailySnapshots/*.json to .gitignore (or keep snapshots in an external/persistent volume only).

Suggested change
-  "date": "2026-03-21",
-  "indexScore": 50,
-  "delta": 0,
-  "topKeywords": [],
-  "emergingConcerns": []
+  "placeholder": "This file is intentionally kept as a non-snapshot example. Actual daily snapshots should not be committed to the repository."

}
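
If the reviewer's suggestion to keep snapshots out of the repository is adopted, the ignore rule might look like the sketch below (a hypothetical `.gitignore` fragment, assuming snapshots are written under data/dailySnapshots/; a `!`-negation line could additionally exempt a single committed example file):

```
# .gitignore — keep runtime-generated daily snapshots out of git
data/dailySnapshots/*.json
```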
24 changes: 24 additions & 0 deletions data/modelWeights.json
@@ -0,0 +1,24 @@
{
"categoryWeights": {
"Pothole": 5,
"Garbage": 3,
"Water Supply": 4,
"Streetlight": 2,
"Flooding": 8
},
"duplicateThreshold": 0.84,
"lastUpdated": "2026-03-21T19:01:11.261Z",
"history": [
{
"date": "2026-03-21",
"categoryWeights": {
"Pothole": 5,
"Garbage": 3,
"Water Supply": 4,
"Streetlight": 2,
"Flooding": 8
},
"duplicateThreshold": 0.85
}
]
Comment on lines +10 to +23
Copilot AI Mar 21, 2026

This looks like runtime-generated state (lastUpdated timestamp + non-empty history + duplicateThreshold already mutated). Committing this will create noisy diffs every time the refinement job runs and makes the repository state-dependent. Consider removing this from the PR and generating it on first run, or commit a stable seed file (e.g., empty history and a non-time-based lastUpdated) and/or add it to .gitignore if it’s intended to be mutable local state.

Suggested change
-  "lastUpdated": "2026-03-21T19:01:11.261Z",
-  "history": [
-    {
-      "date": "2026-03-21",
-      "categoryWeights": {
-        "Pothole": 5,
-        "Garbage": 3,
-        "Water Supply": 4,
-        "Streetlight": 2,
-        "Flooding": 8
-      },
-      "duplicateThreshold": 0.85
-    }
-  ]
+  "lastUpdated": "1970-01-01T00:00:00.000Z",
+  "history": []

}
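
The reviewer's "generate it on first run" option could look like the sketch below. The values in `defaultWeights` mirror the committed file; `loadOrSeedWeights` and the epoch `lastUpdated` are assumptions for illustration, not existing project API:

```typescript
import * as fs from "fs";

interface ModelWeights {
  categoryWeights: Record<string, number>;
  duplicateThreshold: number;
  lastUpdated: string;
  history: unknown[];
}

// Stable seed: no wall-clock timestamp and an empty history, so the
// committed file never changes when the refinement job runs.
export function defaultWeights(): ModelWeights {
  return {
    categoryWeights: {
      Pothole: 5,
      Garbage: 3,
      "Water Supply": 4,
      Streetlight: 2,
      Flooding: 8,
    },
    duplicateThreshold: 0.84,
    lastUpdated: "1970-01-01T00:00:00.000Z",
    history: [],
  };
}

// Create the file on first run only; later runtime mutations stay out of git.
export function loadOrSeedWeights(filePath: string): ModelWeights {
  if (!fs.existsSync(filePath)) {
    fs.writeFileSync(filePath, JSON.stringify(defaultWeights(), null, 2));
  }
  return JSON.parse(fs.readFileSync(filePath, "utf8"));
}
```

With this in place, data/modelWeights.json could be dropped from the PR and added to .gitignore, as the comment suggests.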
2 changes: 1 addition & 1 deletion scheduler/dailyRefinementJob.ts
@@ -8,7 +8,7 @@ import { Issue } from "../services/types";

// Load environmental or fallback to test.db or production db
Copilot AI Mar 21, 2026

The comment above dbPath is now misleading (it mentions “test.db or production db” and uses “environmental” rather than “environment variable”). Please update it to reflect the actual fallback path (data/issues.db) and terminology, so future readers don’t assume a different DB layout.

Suggested change
-// Load environmental or fallback to test.db or production db
+// Use DB_PATH environment variable if set; otherwise fall back to data/issues.db

  const dbPath =
-    process.env.DB_PATH || path.join(__dirname, "../../backend/app.db");
+    process.env.DB_PATH || path.join(__dirname, "../data/issues.db");
Comment on lines 10 to +11
⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Scripts executed against RohanExploit/VishwaGuru to verify the path change:

#!/bin/bash
# Verify the relative path resolution is correct
# From scheduler/ directory, ../data/issues.db should resolve to data/issues.db
fd -t d -d 1 "^(scheduler|data)$" .
rg -n "issues\.db" backend/
cat -n scheduler/dailyRefinementJob.ts | head -40
rg -n "DATABASE_URL|dbPath" scheduler/ -A 3 -B 1
fd -t f "\.(ts|js)$" scheduler/ | head -10
rg -n "production|Production|PROD" scheduler/ backend/ | head -20
fd -t f "\.(md|txt)$" . -path "*/scheduler/*" -o -path "*/docs/*"
cat -n backend/README.md | grep -A 5 -B 5 "scheduler\|DATABASE_URL" | head -40

Path fix is correct, but scheduler's hardcoded SQLite creates a production issue with Postgres deployments.

The updated path ../data/issues.db correctly aligns with the Python backend's SQLite default (sqlite:///./data/issues.db in backend/database.py and backend/config.py).

However, the scheduler hardcodes sqlite3 and only respects DB_PATH environment variable, completely ignoring DATABASE_URL. The backend explicitly supports both SQLite (development) and PostgreSQL (production), but the scheduler will always use local SQLite regardless of the backend's database choice. In production with Postgres, this scheduler will operate on stale or empty data.

Consider:

  1. Refactor the scheduler to support DATABASE_URL for Postgres connections, or
  2. Document that the scheduler is SQLite-only and production setups must not rely on it with Postgres backends.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scheduler/dailyRefinementJob.ts` around lines 10 - 11, The scheduler
currently hardcodes SQLite via the dbPath constant and ignores DATABASE_URL,
causing it to run against a local DB while the backend may use Postgres; change
the scheduler (scheduler/dailyRefinementJob.ts) to detect
process.env.DATABASE_URL and branch: if DATABASE_URL is present and begins with
"postgres" use a Postgres client (e.g., node-postgres) to connect and run the
same queries/transactions the scheduler uses, otherwise fall back to the
existing sqlite logic using dbPath; ensure connection creation/teardown mirrors
backend behavior (pool vs file DB) and that any SQL differences are handled, or
alternatively update README to explicitly document that the scheduler is
SQLite-only and must not be used in Postgres production deployments.


export class DailyRefinementJob {
private db: sqlite3.Database;
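
Option 1 from the comment (branching on DATABASE_URL) can be sketched as a small decision helper. `resolveDbTarget` is a hypothetical function, not existing project code; the actual Postgres client wiring (e.g. with node-postgres) would hang off the returned target:

```typescript
import * as path from "path";

type DbTarget =
  | { kind: "postgres"; url: string }
  | { kind: "sqlite"; file: string };

// Decide which database backend the scheduler should use, mirroring the
// Python backend's DATABASE_URL convention. Hypothetical helper for
// illustration only.
export function resolveDbTarget(env: NodeJS.ProcessEnv): DbTarget {
  const url = env.DATABASE_URL;
  if (url && url.startsWith("postgres")) {
    // Production: connect with a Postgres client (e.g. node-postgres Pool).
    return { kind: "postgres", url };
  }
  // Development fallback: keep the existing SQLite behaviour.
  const file = env.DB_PATH || path.join(__dirname, "../data/issues.db");
  return { kind: "sqlite", file };
}
```

The constructor of DailyRefinementJob could then open either a pg Pool or the sqlite3 file database based on `target.kind`, keeping connection setup and teardown aligned with the backend.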
6 changes: 6 additions & 0 deletions test_run.ts
@@ -0,0 +1,6 @@
import { DailyRefinementJob } from './scheduler/dailyRefinementJob';
const job = new DailyRefinementJob();
job.runRefinement().then(() => {
console.log("Done");
process.exit(0);
});
Comment on lines +1 to +6
Copilot AI Mar 21, 2026

This file is not a Jest test (jest.config.js only runs tests/**/*.test.ts), so it won't be executed in CI. If the intent is automated verification that the job runs (as stated in the PR description), convert this into a proper Jest test under tests/ (or wire it into an npm script), and avoid using process.exit(), which can mask async failures.

Comment on lines +1 to +6
⚠️ Potential issue | 🟡 Minor

Add error handling for promise rejection.

If runRefinement() throws or rejects, the error goes unhandled and the process may exit with a non-zero code or warning, but without meaningful output.

🛠️ Proposed fix
 import { DailyRefinementJob } from './scheduler/dailyRefinementJob';
 const job = new DailyRefinementJob();
-job.runRefinement().then(() => {
-    console.log("Done");
-    process.exit(0);
-});
+job.runRefinement()
+    .then(() => {
+        console.log("Done");
+        process.exit(0);
+    })
+    .catch((err) => {
+        console.error("Refinement failed:", err);
+        process.exit(1);
+    });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test_run.ts` around lines 1 - 6, The current invocation of
DailyRefinementJob.runRefinement() lacks rejection handling; update the promise
chain on the job instance (DailyRefinementJob and its runRefinement method call)
to add a .catch(...) that logs the error (including stack/message) and calls
process.exit with a non-zero code (e.g., process.exit(1)); ensure success still
logs "Done" and exits 0, and any thrown error is handled to avoid unhandled
promise rejections.
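
Both review comments can be addressed together by moving the exit decision out of the raw promise chain. The wrapper below is a hypothetical sketch (`runRefinementCli` is not existing project code); the `exit` and `log` parameters are injected so a Jest test under tests/ can assert on the exit code instead of calling the real process.exit():

```typescript
// Hypothetical CLI wrapper: success and failure both log and exit
// explicitly, so a rejected runRefinement() can no longer go unhandled.
export async function runRefinementCli(
  run: () => Promise<void>,
  exit: (code: number) => void = process.exit,
  log: (msg: string) => void = console.log,
): Promise<void> {
  try {
    await run();
    log("Done");
    exit(0);
  } catch (err) {
    log(`Refinement failed: ${err}`);
    exit(1);
  }
}
```

A Jest test would pass a stubbed `run` plus a recording `exit` callback, asserting exit code 0 on success and 1 on rejection.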
