Add initial Gitcord engine with GitHub/Discord ingestion and audit-first dry-run workflow #1
shubham5080 wants to merge 61 commits into AOSSIE-Org:main from
Conversation
…rst dry-run workflow
- Permission-aware GitHub and Discord readers
- Deterministic planning with explicit identity mapping
- Audit reports (JSON + Markdown)
- Strict mutation gating (dry-run default)
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings.
📝 Walkthrough
Adds a complete offline-first CLI automation engine: configuration models and loader, GitHub/Discord adapters and writers, SQLite storage, orchestrator with scoring/planning/reporting, identity-linking and Discord bot, JSON logging, plugin loader, packaging/entrypoint, extensive docs, and a large pytest suite.
Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant CLI as CLI
  participant Orch as Orchestrator
  participant GH as GitHubRestAdapter
  participant DB as SqliteStorage
  participant Score as WeightedScoreStrategy
  participant Plan as PlanningEngine
  participant DC as DiscordApiAdapter
  participant Report as Reporting
  CLI->>Orch: run_once()
  activate Orch
  Orch->>DB: init_schema()
  Orch->>GH: list_contributions(since)
  GH-->>Orch: contributions
  Orch->>DB: record_contributions(contributions)
  Orch->>DB: set_cursor(github)
  Orch->>Score: compute_scores(contributions, period_end)
  Score-->>Orch: scores
  Orch->>DB: upsert_scores(scores)
  Orch->>DC: list_member_roles()
  DC-->>Orch: member_roles
  Orch->>Plan: plan_discord_roles(member_roles, scores, ...)
  Plan-->>Orch: discord_plans
  Orch->>GH: list_open_issues()
  Orch->>GH: list_open_pull_requests()
  Orch->>Plan: plan_github_assignments(issues, prs, ...)
  Plan-->>Orch: github_plans
  Orch->>Report: write_reports(discord_plans, github_plans)
  Report-->>Orch: audit_paths
  Orch->>GH: apply_plans(github_plans, policy)
  Orch->>DC: apply_plans(discord_plans, policy)
  deactivate Orch
```

Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 6
🤖 Fix all issues with AI agents
In `@src/ghdcbot/__init__.py`:
- Line 1: Replace the EN DASH character in the module docstring with a standard
ASCII hyphen: edit the docstring in src/ghdcbot/__init__.py (the top-line
triple-quoted string) and change “Discord–GitHub automation engine.” to
“Discord-GitHub automation engine.” so the module-level string uses a normal
hyphen.
In `@src/ghdcbot/adapters/discord/writer.py`:
- Around line 36-44: In _apply_plan, do not default to DELETE for unknown
plan.action values; explicitly handle "add" and "remove" (or "delete") and treat
anything else as invalid: determine method only when plan.action == "add" (set
method="PUT") or plan.action == "remove" (set method="DELETE"), and for other
values call self._log_plan(plan, result="failed (unknown action:
{plan.action})") and return without making the request; ensure you reference
_apply_plan, plan.action, method, and _log_plan when implementing the validation
and logging change.
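A minimal sketch of that validation, assuming the plan fields used elsewhere in this review (plan.action, plan.discord_user_id, plan.role) and a direct httpx call; the endpoint path and log wording are illustrative, not the writer's actual code:

```python
def _apply_plan(self, plan) -> None:
    # plan is a DiscordRolePlan; map only the two known actions onto HTTP methods.
    if plan.action == "add":
        method = "PUT"
    elif plan.action == "remove":
        method = "DELETE"
    else:
        # Unknown action: record the failure and skip the request entirely,
        # instead of falling through to DELETE.
        self._log_plan(plan, result=f"failed (unknown action: {plan.action})")
        return

    path = f"/guilds/{self._guild_id}/members/{plan.discord_user_id}/roles/{plan.role}"
    response = self._client.request(method, path)
    self._log_plan(plan, result=f"applied (status {response.status_code})")
```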
In `@src/ghdcbot/adapters/storage/sqlite.py`:
- Around line 49-89: Normalize all created_at datetimes to UTC when writing and
when reading: in record_contributions convert event.created_at to an aware UTC
datetime before serializing (use astimezone(timezone.utc)) and store its ISO
string with timezone (or a trailing Z), and in list_contributions parse the
stored timestamp as UTC-aware datetime (or convert the parsed datetime to UTC)
before constructing ContributionEvent; apply the same normalization for any
other methods that persist or query cursor/created_at values (the other similar
block referenced around lines 132-149) so all comparisons use consistent
UTC-aware timestamps.
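A small sketch of the normalization, assuming timestamps are persisted as ISO 8601 strings; the helper names _to_utc_iso/_from_utc_iso are hypothetical:

```python
from datetime import datetime, timezone


def _to_utc_iso(dt: datetime) -> str:
    """Serialize a datetime as a UTC-aware ISO 8601 string for storage."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)  # assume naive values are already UTC
    return dt.astimezone(timezone.utc).isoformat()


def _from_utc_iso(raw: str) -> datetime:
    """Parse a stored timestamp back into a UTC-aware datetime."""
    parsed = datetime.fromisoformat(raw.replace("Z", "+00:00"))
    if parsed.tzinfo is None:
        parsed = parsed.replace(tzinfo=timezone.utc)
    return parsed.astimezone(timezone.utc)
```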
In `@src/ghdcbot/config/loader.py`:
- Around line 18-25: The load_config function currently only checks
Path.exists() and lets read_text OSError propagate; update load_config to first
ensure the path is a file (use config_path.is_file()) and then wrap the call to
config_path.read_text(encoding="utf-8") in a try/except catching OSError (and/or
IOError), raising ConfigError with a clear message that includes the original
exception (use "raise ConfigError(...) from exc") so unreadable files or
directories produce consistent ConfigError behavior; keep the existing
yaml.YAMLError handling for parsing errors.
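A sketch of the hardened loader under those assumptions; the ConfigError import path and the bare-dict return are guesses, since the real loader presumably validates the parsed data into BotConfig afterwards:

```python
from pathlib import Path

import yaml

from ghdcbot.core.errors import ConfigError  # assumed import path


def load_config(config_path: Path) -> dict:
    if not config_path.is_file():
        raise ConfigError(f"Config file not found or not a regular file: {config_path}")
    try:
        raw = config_path.read_text(encoding="utf-8")
    except OSError as exc:
        raise ConfigError(f"Unable to read config file {config_path}: {exc}") from exc
    try:
        return yaml.safe_load(raw)
    except yaml.YAMLError as exc:
        raise ConfigError(f"Invalid YAML in {config_path}: {exc}") from exc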
In `@src/ghdcbot/engine/orchestrator.py`:
- Around line 38-45: The cursor is being set to period_end which can miss events
arriving during the fetch and cause duplicates; instead, after calling
self.github_reader.list_contributions and before self.storage.set_cursor,
compute the latest ingested timestamp from the returned contributions (e.g.,
max(item.created_at) or equivalent), optionally add a tiny epsilon, and call
self.storage.set_cursor("github", latest_ingested_timestamp) so the cursor
reflects the actual last ingested event; update references around
list_contributions, record_contributions, and set_cursor to use that computed
timestamp rather than period_end.
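A sketch of the adjusted cursor update inside run_once(), assuming each contribution exposes a created_at datetime and that the stored cursor is read via a hypothetical get_cursor accessor:

```python
since = self.storage.get_cursor("github")  # hypothetical accessor for the stored cursor
contributions = self.github_reader.list_contributions(since)
self.storage.record_contributions(contributions)

if contributions:
    # Advance the cursor to the newest event actually ingested rather than to
    # period_end, so events that arrive during the fetch are picked up next run.
    latest_ingested = max(event.created_at for event in contributions)
    self.storage.set_cursor("github", latest_ingested)
```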
In `@tests/test_user_repo_fallback.py`:
- Around line 18-41: Replace the direct assignment to loader._ACTIVE_CONFIG in
the test with monkeypatch.setattr so the global is restored after the test;
specifically, use the test's monkeypatch fixture to set loader._ACTIVE_CONFIG to
the prepared config (monkeypatch.setattr(loader, "_ACTIVE_CONFIG", config))
instead of assigning loader._ACTIVE_CONFIG = config, ensuring no cross-test
leakage.
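A minimal sketch of the change; the rest of the test body is elided:

```python
from ghdcbot.config import loader  # assumed import path


def test_user_repo_fallback(monkeypatch) -> None:
    config = ...  # build the prepared config exactly as the test already does
    # monkeypatch restores loader._ACTIVE_CONFIG at teardown, so the override
    # cannot leak into other tests.
    monkeypatch.setattr(loader, "_ACTIVE_CONFIG", config)
    ...
```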
🧹 Nitpick comments (23)
pyproject.toml (1)
1-16: Add a [project.scripts] entry point to make the CLI directly invocable. Build system, metadata, and dependencies are well-structured with consistent Python version targeting (3.11). The CLI module at src/ghdcbot/cli.py exists with a proper main() entry point. Add the following to enable installation as an executable command:

Suggested configuration

```toml
[project.scripts]
ghdcbot = "ghdcbot.cli:main"
```

Also consider adding a license field for open-source clarity, though this can be deferred.

tests/test_repo_filtering.py (1)
21-31: Remove unused caplog parameter. The caplog fixture is declared but not used in this test function. Either remove it or add a log assertion for consistency with the other tests.

Suggested fix

```diff
-def test_repo_filter_deny_mode(caplog) -> None:
+def test_repo_filter_deny_mode() -> None:
     repos = [
         {"name": "repo-a"},
         {"name": "repo-b"},
         {"name": "repo-c"},
     ]
     repo_filter = RepoFilterConfig(mode="deny", names=["repo-b"])
     filtered = _apply_repo_filter(repos, repo_filter, logging.getLogger("test"))
     assert [repo["name"] for repo in filtered] == ["repo-a", "repo-c"]
```

src/ghdcbot/config/models.py (1)
95-104: Inconsistent default and validation for role_mappings. The field role_mappings has default_factory=list, suggesting it is optional with an empty-list default, but the validator immediately rejects empty lists. This creates a confusing API where the field appears optional but instantiation without it will fail validation.

Consider either:
- Remove the default to make it explicitly required, or
- Keep the default but document that it must be populated

Option 1: Make field explicitly required (preferred)

```diff
-    role_mappings: list[RoleMappingConfig] = Field(default_factory=list)
+    role_mappings: list[RoleMappingConfig]
```

tests/test_writer_safety.py (1)
6-11: Silence Ruff ARG002/TRY003 to avoid test lint failures. If Ruff is enforced in CI, _FailingClient will trip ARG002 (unused args) and TRY003. Prefer underscore-prefixed args or a local # noqa to keep the intent clear.

💡 Suggested tweak

```diff
 class _FailingClient:
-    def request(self, *args, **kwargs):
+    def request(self, *_args, **_kwargs):
         raise AssertionError("HTTP call should not occur with empty plans")

-    def post(self, *args, **kwargs):
+    def post(self, *_args, **_kwargs):
         raise AssertionError("HTTP call should not occur with empty plans")
```

tests/test_config.py (1)
1-26: Use pytest.raises(ValidationError) instead of catching all exceptions. Catching Exception can hide unrelated failures. A targeted ValidationError assertion is clearer and stricter (lines 21-26).

♻️ Suggested update

```diff
-from ghdcbot.config.models import BotConfig
+import pytest
+from pydantic import ValidationError
+from ghdcbot.config.models import BotConfig
 @@
-    try:
-        BotConfig.model_validate(payload)
-    except Exception as exc:  # noqa: BLE001
-        assert "role_mappings" in str(exc)
-    else:
-        raise AssertionError("Expected role_mappings validation error")
+    with pytest.raises(ValidationError) as excinfo:
+        BotConfig.model_validate(payload)
+    assert "role_mappings" in str(excinfo.value)
```

src/ghdcbot/core/errors.py (1)
5-6: Consider renaming to avoid shadowing Python's built-in PermissionError. The custom PermissionError class shadows the built-in exception, which is poor practice despite being unused in the codebase. If this class is intended for future use, rename it (e.g., GitcordPermissionError) to maintain clarity and follow naming conventions.

tests/test_empty_org_behavior.py (1)
43-46: Prefix the unused path parameter in the mock with an underscore. The mock fake_list_repos_from_path is assigned to the instance via monkeypatch.setattr(adapter, "_list_repos_from_path", ...). Because the replacement is set directly on the instance (not the class), Python won't pass self, so the signature is actually correct. However, the unused path parameter triggers a linter warning. Consider using _ to silence it:

Suggested tweak

```diff
-    def fake_list_repos_from_path(path: str):
+    def fake_list_repos_from_path(_path: str):
         return [], 200
```

README.md (1)
27-38: Add language identifiers to fenced code blocks.The markdownlint warnings about missing language specifiers are valid. Adding language identifiers improves syntax highlighting and accessibility.
Suggested fixes
````diff
 ## Repository Structure
-```
+```text
 src/ghdcbot/
   adapters/   # GitHub/Discord/storage adapters (IO)

 1. Create a virtual environment and install dependencies:
-```
+```bash
 python -m venv .venv

 2. Export required tokens as environment variables:
-```
+```bash
 export GITHUB_TOKEN="your_github_token"

 3. Copy and edit the example config:
-```
+```bash
 cp config/example.yaml /tmp/ghdcbot-config.yaml

 6. Run a dry-run cycle:
-```
+```bash
 python -m ghdcbot.cli --config /tmp/ghdcbot-config.yaml run-once

 7. Expected output files:
-```
+```text
 <data_dir>/reports/audit.json

 Run the safety-focused test suite:
-```
+```bash
 pytest
````

Also applies to: 52-56, 58-61, 63-65, 69-71, 73-76, 104-106
src/ghdcbot/plugins/registry.py (1)
11-18: Handle edge case where dotted path contains multiple colons.
dotted_path.split(":") will produce more than two elements if the path contains multiple colons (e.g., "module:Class:extra"), causing the unpacking to fail with a ValueError. While this is caught and wrapped in AdapterError, using split(":", 1) would be more explicit about the expected format.

Suggested fix

```diff
 def load_adapter(dotted_path: str) -> type[T]:
     try:
-        module_path, class_name = dotted_path.split(":")
+        module_path, class_name = dotted_path.split(":", 1)
         module = importlib.import_module(module_path)
         adapter_cls = getattr(module, class_name)
     except (ValueError, ImportError, AttributeError) as exc:
         raise AdapterError(f"Unable to load adapter: {dotted_path}") from exc
     return adapter_cls
```

src/ghdcbot/cli.py (2)
45-45: Replace EN DASH with HYPHEN-MINUS for consistency. The string contains an EN DASH (–) which may cause issues with character encoding or searching. Use a standard hyphen-minus (-) instead.

Suggested fix

```diff
-    parser = argparse.ArgumentParser(description="Discord–GitHub automation engine")
+    parser = argparse.ArgumentParser(description="Discord-GitHub automation engine")
```
44-54: Consider adding top-level error handling for better CLI UX. Currently, exceptions from build_orchestrator or run_once will propagate with full tracebacks. For production use, consider catching known exceptions (ConfigError, AdapterError) and providing user-friendly error messages with appropriate exit codes.

Suggested improvement

```diff
 def main() -> None:
     parser = argparse.ArgumentParser(description="Discord-GitHub automation engine")
     parser.add_argument("--config", required=True, help="Path to config YAML file")
     sub = parser.add_subparsers(dest="command", required=True)
     sub.add_parser("run-once", help="Run a single orchestration cycle")
     args = parser.parse_args()
-    orchestrator = build_orchestrator(args.config)
-
-    if args.command == "run-once":
-        orchestrator.run_once()
+    try:
+        orchestrator = build_orchestrator(args.config)
+        if args.command == "run-once":
+            orchestrator.run_once()
+    except Exception as exc:
+        logging.getLogger("CLI").error("Fatal error: %s", exc)
+        raise SystemExit(1) from exc
```

tests/test_mutation_policy_gating.py (2)
11-16: Consider extracting _FailingClient to a shared test utility. This class duplicates _FailingClient in tests/test_writer_safety.py. Extract to a shared conftest.py or test utilities module to reduce duplication.

Suggested approach

In tests/conftest.py:

```python
import pytest


class FailingHttpClient:
    """Test double that fails if any HTTP call is attempted."""

    def request(self, *_args, **_kwargs):
        raise AssertionError("HTTP call should not occur in gated modes")

    def post(self, *_args, **_kwargs):
        raise AssertionError("HTTP call should not occur in gated modes")


@pytest.fixture
def failing_client():
    return FailingHttpClient()
```

Also, remove the unused # noqa: D401 directives as the D401 rule isn't enabled.
55-59: Test assertion could be more robust. The assertion record.__dict__.get("result") relies on the internal structure of LogRecord. Consider using getattr(record, "result", None) or accessing record.__dict__ more explicitly. However, the test correctly validates that:
- No HTTP calls occur (via _FailingClient)
- Appropriate skip reasons are logged for each gating mode

Minor improvement

```diff
-    assert any(record.__dict__.get("result") == expected for record in caplog.records)
+    results = [getattr(r, "result", None) for r in caplog.records]
+    assert expected in results, f"Expected '{expected}' in log results, got {results}"
```

src/ghdcbot/engine/scoring.py (1)
21-24: Add upper bound check for period_end to match the intended period boundaries. The current code filters events before period_start but ignores events after period_end, even though period_end is stored in the Score result. Since the Score model explicitly tracks both period boundaries, the filtering should enforce them consistently.

Suggested fix

```diff
     for event in contributions:
-        if event.created_at < period_start:
+        if event.created_at < period_start or event.created_at > period_end:
             continue
         totals[event.github_user] += self._weights.get(event.event_type, 0)
```

While GitHub API data in practice contains only real event timestamps and won't include future-dated events, this defensive check removes the asymmetry between boundary checks and ensures the period semantics are enforced correctly throughout the codebase.
src/ghdcbot/engine/assignment.py (1)
28-30: Silence the unused scores parameter. If scores is required by the interface, rename it to _scores (or add _ = scores) to document intentional non-use and avoid lint noise.

♻️ Suggested tweak

```diff
-    def plan_issue_assignments(
-        self, issues: Iterable[dict], scores: Sequence[Score]
-    ) -> Sequence[AssignmentPlan]:
+    def plan_issue_assignments(
+        self, issues: Iterable[dict], _scores: Sequence[Score]
+    ) -> Sequence[AssignmentPlan]:
 @@
-    def plan_review_requests(
-        self, pull_requests: Iterable[dict], scores: Sequence[Score]
-    ) -> Sequence[ReviewPlan]:
+    def plan_review_requests(
+        self, pull_requests: Iterable[dict], _scores: Sequence[Score]
+    ) -> Sequence[ReviewPlan]:
```

Also applies to: 48-50
src/ghdcbot/engine/orchestrator.py (1)
92-100: Tidy audit report handling (unused md_path + exception logging). Rename md_path to _md_path and use logger.exception to keep tracebacks.

♻️ Suggested tweak

```diff
-        json_path, md_path = write_reports(
+        json_path, _md_path = write_reports(
             discord_plans, github_plans, self.config, repo_count=repo_count
         )
 @@
-        except Exception as exc:  # noqa: BLE001
-            logger.error("Failed to write audit reports", extra={"error": str(exc)})
+        except Exception as exc:  # noqa: BLE001
+            logger.exception("Failed to write audit reports", extra={"error": str(exc)})
```

src/ghdcbot/core/models.py (1)
1-62: Well-structured domain models with good immutability intent. The frozen dataclasses provide a clean, immutable interface for the domain models. A few optional improvements to consider:

Mutable dict fields in frozen dataclasses: Fields like payload and source (lines 14, 51, 62) are dicts, which remain mutable even in frozen dataclasses. This is a known Python limitation where the field reference is frozen but its contents can still be modified. Consider using MappingProxyType or documenting this behavior if true immutability is needed.

Type safety for constrained strings: The action and target_type fields use string comments to indicate valid values. Using Literal types would provide compile-time safety:

```python
from typing import Literal

action: Literal["add", "remove"]
target_type: Literal["issue", "pull_request"]
```

src/ghdcbot/adapters/github/rest.py (3)
21-33: Consider adding resource cleanup for the httpx.Client. The
httpx.Client is created but never explicitly closed. While Python's garbage collector will eventually clean it up, explicit resource management is preferred for HTTP clients to avoid connection leaks, especially in long-running processes.

♻️ Suggested approaches
Option 1: Add a close method and use it explicitly:
def close(self) -> None: self._client.close()Option 2: Implement context manager protocol:
def __enter__(self) -> GitHubRestAdapter: return self def __exit__(self, *args) -> None: self._client.close()Option 3: Use
atexitfor cleanup in long-running scenarios.
277-289: Aggressive rate limit cutoff may cause incomplete ingestion.When
remaining <= 1, the method returnsNoneand stops all subsequent requests. This is a safe default but may lead to incomplete data ingestion if the rate limit is low but not fully exhausted.Consider:
- Allowing continuation when
remaining == 1(stop at 0)- Adding a configurable threshold
- Implementing a sleep-until-reset strategy for non-urgent ingestion
The current behavior is safe but worth documenting for operators who may be surprised by partial results.
185-194: Assignee timestamp is approximated fromupdated_at.Using
issue["updated_at"]asassigned_at(line 186) is an approximation—the actual assignment timestamp would require fetching issue timeline events. This trade-off avoids additional API calls but may attribute assignments to incorrect time periods.Consider adding a code comment to document this approximation for future maintainers.
src/ghdcbot/adapters/discord/api.py (2)
16-24: Same resource cleanup consideration as the GitHub adapter.The
httpx.Clientshould be explicitly closed when the adapter is no longer needed. Consider adding aclose()method or implementing the context manager protocol for consistent resource management.
146-151: Consider implementing retry withretry_afterinstead of immediate abort.The
retry_aftervalue from Discord's 429 response is logged but not used. For improved resilience, consider:
- Sleeping for
retry_afterseconds and retrying the request- Or exposing
retry_after to the caller for upstream handling

Current behavior is safe but may cause unnecessary incomplete operations.
♻️ Suggested retry implementation
```python
if response.status_code == 429:
    retry_after = response.json().get("retry_after", 1)
    self._logger.warning(
        "Discord rate limit reached; retrying after delay",
        extra={"path": path, "retry_after": retry_after},
    )
    time.sleep(retry_after)
    return self._request(method, path, params)  # Retry once
```

Note: Add recursion limit or use a loop to prevent infinite retries.
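Per that note, a loop-based variant bounds the retries instead of recursing; max_attempts and the helper name are hypothetical, not the adapter's actual API:

```python
import time


def _request_with_backoff(self, method, path, params=None, max_attempts: int = 3):
    """Retry a Discord API call a bounded number of times on 429 responses."""
    for attempt in range(max_attempts):
        response = self._client.request(method, path, params=params)
        if response.status_code != 429:
            return response
        retry_after = response.json().get("retry_after", 1)
        self._logger.warning(
            "Discord rate limit reached; retrying after delay",
            extra={"path": path, "retry_after": retry_after, "attempt": attempt + 1},
        )
        time.sleep(retry_after)
    return response  # give up after max_attempts; caller handles the 429
```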
src/ghdcbot/engine/planning.py (1)
43-47: Potential StopIteration if role not found in role_thresholds. The next() call at lines 43-47 (and similarly at lines 61-65) has no default value. If role is not found in role_thresholds, this will raise StopIteration, which in a generator context could silently terminate iteration. However, since desired_roles is derived from role_thresholds (lines 27-31), the role should always exist. Consider adding an assertion or default for defensive coding:

♻️ Defensive alternative

```diff
                 "threshold": next(
-                    mapping_cfg.min_score
-                    for mapping_cfg in role_thresholds
-                    if mapping_cfg.discord_role == role
+                    (mapping_cfg.min_score
+                     for mapping_cfg in role_thresholds
+                     if mapping_cfg.discord_role == role),
+                    0,  # Should never happen, but safe default
                 ),
```
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@src/ghdcbot/adapters/discord/writer.py`:
- Around line 19-26: The httpx.Client created in __init__ (self._client) is
never closed; add a proper teardown by implementing an explicit close() method
that calls self._client.close() and/or implement context manager methods
__enter__ and __exit__ on the same class so callers can use "with" to auto-close
the client; update any callers to call close() or use the context manager as
needed to release connection pool resources.
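A minimal sketch of that teardown; the class name and constructor arguments here mirror snippets quoted later in this review but are placeholders for the real writer:

```python
import httpx


class DiscordRoleWriter:
    """Sketch of the writer with explicit client teardown."""

    def __init__(self, token: str, guild_id: str) -> None:
        self._guild_id = guild_id
        self._client = httpx.Client(
            base_url="https://discord.com/api/v10",
            headers={"Authorization": f"Bot {token}"},
            timeout=30.0,
        )

    def close(self) -> None:
        # Release the underlying connection pool.
        self._client.close()

    def __enter__(self) -> "DiscordRoleWriter":
        return self

    def __exit__(self, *exc_info) -> None:
        self.close()
```

Callers can then either call close() explicitly or use the writer in a with block so the client is always released.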
🧹 Nitpick comments (3)
src/ghdcbot/adapters/discord/writer.py (3)
28-34: Consider adding deduplication logic like the GitHub writer. The GitHub writer uses a seen set to skip duplicate plans. Without this, redundant API calls may occur if duplicate plans are generated, wasting rate limit quota.

♻️ Proposed deduplication

```diff
 def apply_plans(self, plans: Iterable[DiscordRolePlan], policy: MutationPolicy) -> None:
+    seen: set[tuple[str, str, str]] = set()
     for plan in plans:
         skip_reason = _skip_reason(policy, policy.allow_discord_mutations)
         if skip_reason:
             self._log_plan(plan, result=skip_reason)
             continue
+        dedupe_key = (plan.discord_user_id, plan.role, plan.action)
+        if dedupe_key in seen:
+            self._log_plan(plan, result="skipped (duplicate)")
+            continue
+        seen.add(dedupe_key)
         self._apply_plan(plan)
```
65-88: Role list is fetched on every plan execution, causing redundant API calls. For N plans, this makes N identical requests to /guilds/{guild_id}/roles. Cache the role mapping to avoid hitting Discord rate limits.

♻️ Proposed caching approach

```diff
     def __init__(self, token: str, guild_id: str) -> None:
         self._logger = logging.getLogger(self.__class__.__name__)
         self._guild_id = guild_id
         self._client = httpx.Client(
             base_url="https://discord.com/api/v10",
             headers={"Authorization": f"Bot {token}"},
             timeout=30.0,
         )
+        self._role_cache: dict[str, str] | None = None

+    def _fetch_roles(self) -> dict[str, str] | None:
+        """Fetch and cache role name -> ID mapping."""
+        if self._role_cache is not None:
+            return self._role_cache
+        try:
+            response = self._client.request("GET", f"/guilds/{self._guild_id}/roles")
+        except httpx.HTTPError as exc:
+            self._logger.warning("Role lookup failed", extra={"guild_id": self._guild_id, "error": str(exc)})
+            return None
+        if response.status_code != 200:
+            self._logger.warning("Role lookup failed", extra={"guild_id": self._guild_id, "status": response.status_code})
+            return None
+        self._role_cache = {r["name"]: r["id"] for r in response.json()}
+        return self._role_cache

     def _resolve_role_id(self, role_name: str) -> str | None:
         """Resolve a role name to a role ID. Returns None if not found."""
-        try:
-            response = self._client.request(
-                "GET", f"/guilds/{self._guild_id}/roles"
-            )
-        except httpx.HTTPError as exc:
-            self._logger.warning(
-                "Role lookup failed",
-                extra={"guild_id": self._guild_id, "error": str(exc)},
-            )
-            return None
-        if response.status_code != 200:
-            self._logger.warning(
-                "Role lookup failed",
-                extra={"guild_id": self._guild_id, "status": response.status_code},
-            )
+        roles = self._fetch_roles()
+        if roles is None:
             return None
-
-        roles = response.json()
-        for role in roles:
-            if role.get("name") == role_name:
-                return role.get("id")
-        return None
+        return roles.get(role_name)
```
111-118: Duplicate of _skip_reason in github/writer.py. This helper is identical to the one in the GitHub writer. Consider extracting it to a shared location (e.g., core/modes.py) to avoid duplication.
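A minimal sketch of the shared helper, assuming a mutation policy that carries a dry-run flag plus a per-platform allow flag; the field names and skip messages are assumptions, not the project's actual strings:

```python
def skip_reason(policy, allow_platform_mutations: bool) -> str | None:
    """Return why plans must be skipped, or None when mutations may proceed."""
    if getattr(policy, "dry_run", True):
        return "skipped (dry-run)"
    if not allow_platform_mutations:
        return "skipped (mutations disabled)"
    return None
```

Both writers would then import this from the shared module instead of keeping private copies.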
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
…yping
- _ingest_helpful_comments: accept pre-fetched comments to avoid duplicate API pagination
- _ingest_issue_comments/_ingest_pr_comments: return (events, comments_by_number) for reuse
- _issue_assignment_events: log exceptions with debug (owner, repo, issue_number)
- _detect_reverted_pr/_check_pr_ci_status: log exceptions with context instead of bare pass
- sqlite: add Any to typing imports for mark_notification_sent
- orchestrator: preserve merge-based roles on removal (score_desired | merge_desired)
- _send_role_congratulation: generic message 'You have earned the role'
- pr_context: return 'Waiting on contributor' when approved but not mergeable

Co-authored-by: Cursor <cursoragent@cursor.com>
- Notifications: improved DM messages (issue assign, changes requested, PR approved/merged); dedupe key includes review_id for pr_reviewed; logging for notification flow
- Issue assignment: skip already-assigned issues in sync (assignment.py, planning.py); rest.py list_open_issues includes assignees
- Bot: mentor-only /assign-issue, /issue-requests, /sync with mentor_check; friendly permission error; issue request approve/replace DM messages with issue title
- Add AUDIT_TESTING_POINTS.txt (1060 manual audit test points) and mentor_features_summary.txt

Co-authored-by: Cursor <cursoragent@cursor.com>
- Add comprehensive Discord bot commands section with all 12 slash commands
- Add instructions for running the Discord bot
- Add Gitcord logo alongside AOSSIE logo in README header
- Organize commands into Identity Linking, Contribution & Metrics, and Issue Management categories

Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
- Remove all .md files from root and docs/ directory
- Keep only README.md as requested
- Remove CONTRIBUTING.md, all docs/*.md files, and other documentation files

Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Cursor <cursoragent@cursor.com>
Summary
src/ghdcbot

Testing
Summary by CodeRabbit
- New Features
- Documentation
- Storage & Packaging
- Integrations
- Tests
- Chores