
feat: publish curated marketplace artifacts#35

Merged
kantorcodes merged 1 commit into main from feat/marketplace-json-curated-list
Apr 3, 2026

Conversation

@kantorcodes
Member

Purpose

Publish a real Codex repo marketplace for the curated awesome list so private consumers can ingest .agents/plugins/marketplace.json instead of scraping the legacy compatibility JSON.

Changes

  • generate .agents/plugins/marketplace.json from the README list
  • mirror each listed public plugin bundle under plugins/<owner>/<repo>/
  • keep plugins.json as a compatibility export during migration
  • update the sync workflow to keep marketplace artifacts aligned

Verification

  • python3 scripts/generate_plugins_json.py
  • python3 -m py_compile scripts/generate_plugins_json.py
  • verified generated marketplace has 31 entries and 31 mirrored plugin manifests

Signed-off-by: Michael Kantor <6068672+kantorcodes@users.noreply.github.com>
@kantorcodes kantorcodes merged commit 40685f9 into main Apr 3, 2026
4 of 6 checks passed
@kilo-code-bot

kilo-code-bot bot commented Apr 3, 2026

Code Review Summary

Status: 2 Issues Found | Recommendation: Address before merge

Overview

Severity Count
CRITICAL 0
WARNING 2
SUGGESTION 0
Issue Details

WARNING

  • scripts/generate_plugins_json.py, lines 105-111 — Unhandled network exceptions in fetch_repo_archive(): HTTP errors, timeouts, and connection failures will crash the script with an uncaught exception. Consider wrapping in try/except with a user-friendly error message.
  • scripts/generate_plugins_json.py, lines 121-123 — Unhandled JSON parsing exceptions in load_manifest(): a malformed plugin.json in a mirrored plugin will cause the entire sync to fail. Consider adding validation and error handling.
Other Observations (not in diff)

Issues found in unchanged code that cannot receive inline comments:

  • scripts/generate_plugins_json.py, lines 217-219 — write_json() does not handle filesystem errors (permission denied, disk full) gracefully.
Files Reviewed (4 files)
  • .agents/plugins/marketplace.json - generated output
  • .github/workflows/validate-plugins.yml - workflow changes
  • README.md - documentation changes
  • scripts/generate_plugins_json.py - core sync script

The PR introduces a solid foundation for publishing curated marketplace artifacts. The main concerns are around error handling in the sync script — network and parsing failures should be handled gracefully to prevent the workflow from failing completely when a single plugin has issues.
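The graceful-degradation pattern the review asks for might look like the following sketch. The function names fetch_repo_archive and load_manifest mirror those cited above, but their signatures here are assumptions for illustration; the point is that per-plugin try/except collects failures instead of letting one bad plugin abort the whole sync.

```python
import json
import urllib.error


def sync_plugins(plugins, fetch_repo_archive, load_manifest):
    """Return (synced, failed) lists instead of raising on the first error."""
    synced, failed = [], []
    for plugin in plugins:
        try:
            archive = fetch_repo_archive(plugin["repo"])
            manifest = load_manifest(archive)
        except (urllib.error.URLError, TimeoutError) as exc:
            failed.append((plugin["repo"], f"network error: {exc}"))
        except (json.JSONDecodeError, KeyError) as exc:
            failed.append((plugin["repo"], f"bad manifest: {exc}"))
        else:
            synced.append(manifest)
    return synced, failed
```

With this shape, the workflow can report failed plugins in the job summary and still publish the entries that synced cleanly.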


Reviewed by minimax-m2.5-20260211 · 848,800 tokens

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a plugin marketplace for Codex and adds several new plugins for persistent memory, task scheduling, and electronics design. The review feedback identifies several technical issues in the KiCad design scripts, specifically pointing out incomplete token expiry handling and redundant authentication logic in the DigiKey synchronization process. Additionally, the feedback highlights fragile CLI argument parsing for backwards compatibility and provides a code suggestion to handle potential errors when reading from standard input.

Comment on lines +659 to +661
if result.get("status") == "not_found" and "No DigiKey results" in result.get("error", ""):
    pass


high

The token expiry handling logic is incomplete and currently does nothing. Since the DigiKey OAuth token has a short lifetime (10 minutes) and the sync process can take much longer due to the 1-second delay between calls, the script will fail for projects with many components. The "refresh once and retry" logic should be implemented, and it should also be applied to the parallel execution path (line 617).
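One minimal shape for "refresh once and retry" is a wrapper that retries a single time on HTTP 401. The names make_request and get_oauth_token are stand-ins based on the code quoted in this review, not the actual call sites:

```python
import urllib.error


def call_with_refresh(make_request, get_oauth_token):
    """Retry a DigiKey API call once with a fresh token if it returns 401."""
    token = get_oauth_token()
    try:
        return make_request(token)
    except urllib.error.HTTPError as exc:
        if exc.code != 401:
            raise
        token = get_oauth_token()   # refresh once
        return make_request(token)  # a second failure propagates
```

Routing both the sequential and the parallel execution paths through a helper like this keeps the retry policy in one place.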

# Read updates
if args.updates:
    updates = json.loads(args.updates.read_text())
elif not sys.stdin.isatty():

medium

json.load(sys.stdin) will raise a JSONDecodeError if the redirected input is empty. It's safer to check if the input is empty or wrap the call in a try-except block to provide a cleaner error message.

Suggested change

elif not sys.stdin.isatty():
    try:
        updates = json.load(sys.stdin)
    except json.JSONDecodeError:
        print("Error: stdin is empty or not valid JSON", file=sys.stderr)
        sys.exit(1)

Comment on lines +84 to +115
def get_oauth_token() -> tuple[str, str] | None:
    """Get DigiKey OAuth token. Returns (token, client_id) or None."""
    client_id = os.environ.get("DIGIKEY_CLIENT_ID", "")
    client_secret = os.environ.get("DIGIKEY_CLIENT_SECRET", "")
    if not client_id or not client_secret:
        print("Error: DIGIKEY_CLIENT_ID and DIGIKEY_CLIENT_SECRET environment variables required.",
              file=sys.stderr)
        print("       Get credentials at developer.digikey.com → My Apps → Create App",
              file=sys.stderr)
        return None

    try:
        token_data = urllib.parse.urlencode({
            "client_id": client_id,
            "client_secret": client_secret,
            "grant_type": "client_credentials",
        }).encode()
        req = urllib.request.Request(
            "https://api.digikey.com/v1/oauth2/token",
            data=token_data,
            headers={"Content-Type": "application/x-www-form-urlencoded"},
        )
        with urllib.request.urlopen(req, timeout=15) as resp:
            token_resp = json.loads(resp.read())
        token = token_resp.get("access_token", "")
        if not token:
            print("Error: Failed to get OAuth token", file=sys.stderr)
            return None
        return token, client_id
    except Exception as e:
        print(f"Error: OAuth failed: {e}", file=sys.stderr)
        return None

medium

This file defines its own get_oauth_token which does not implement caching, while the sibling script fetch_datasheet_digikey.py (which is already imported) has a more robust implementation with temp-file caching. Reusing the cached version would avoid redundant authentication calls and help manage the 10-minute token lifetime more effectively across multiple script invocations.
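A temp-file cache along the lines the review describes could look like this sketch. The cache filename, the 10-minute lifetime, and the fetch_token callable are all assumptions for illustration, not the actual implementation in fetch_datasheet_digikey.py:

```python
import json
import tempfile
import time
from pathlib import Path

# Assumed cache location and token lifetime; not taken from the real script.
CACHE = Path(tempfile.gettempdir()) / "digikey_token_cache.json"
LIFETIME = 10 * 60  # seconds


def get_cached_token(fetch_token):
    """Reuse a recently fetched token; refetch when near expiry or corrupt."""
    if CACHE.exists():
        try:
            data = json.loads(CACHE.read_text())
            # Refresh 60 s early to avoid using a token about to expire.
            if time.time() - data["fetched_at"] < LIFETIME - 60:
                return data["token"]
        except (json.JSONDecodeError, KeyError):
            pass  # corrupt cache: fall through and refetch
    token = fetch_token()
    CACHE.write_text(json.dumps({"token": token, "fetched_at": time.time()}))
    return token
```

Sharing one cache file across script invocations is what lets sibling tools avoid redundant OAuth round-trips within the token's lifetime.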

Comment on lines +1161 to +1162
if len(sys.argv) > 1 and sys.argv[1].endswith(".kicad_sch"):
    sys.argv.insert(1, "analyze")

medium

This backwards compatibility logic is fragile because it only works if the .kicad_sch file is the first argument. If a user provides global flags before the filename (e.g., python3 bom_manager.py --recursive schematic.kicad_sch), the check will fail and argparse will raise an error because --recursive is not a valid subcommand.
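A more robust shim could scan the whole argument list rather than only position 1. The SUBCOMMANDS set and subcommand names below are hypothetical; only the ".kicad_sch" check comes from the code under review:

```python
SUBCOMMANDS = {"analyze", "update", "export"}  # assumed subcommand names


def inject_default_subcommand(argv):
    """Insert 'analyze' if a .kicad_sch path appears but no subcommand does."""
    args = argv[1:]
    if args and not any(a in SUBCOMMANDS for a in args):
        if any(a.endswith(".kicad_sch") for a in args):
            return [argv[0], "analyze", *args]
    return argv
```

This handles invocations like `python3 bom_manager.py --recursive schematic.kicad_sch`, where a global flag precedes the filename.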
