
Add package-manager manifests and release automation#26

Merged
sharno merged 3 commits into main from opencode/proud-eagle
Feb 25, 2026

Conversation


@sharno sharno commented Feb 25, 2026

Summary

  • add a packaging/ source-of-truth directory with generated manifests for Homebrew, Scoop, Winget, Chocolatey, and AUR using the latest release metadata
  • add scripts/release_packaging.py to regenerate those manifests from gh release view --json tagName,assets output
  • extend release automation by generating and uploading SHA256SUMS via scripts/release_checksums.py, and add packaging-update.yml to open a PR with refreshed manifests after each published release
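For context, the SHA256SUMS generation described above can be sketched minimally like this (write_checksums is a hypothetical helper, not the actual contents of scripts/release_checksums.py):

```python
import hashlib
from pathlib import Path


def write_checksums(dist_dir: Path, output_name: str = "SHA256SUMS") -> Path:
    """Hash every release artifact in dist_dir and write a SHA256SUMS file."""
    output_path = dist_dir / output_name
    lines = []
    for artifact in sorted(
        p for p in dist_dir.iterdir() if p.is_file() and p.name != output_name
    ):
        digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
        # sha256sum-compatible line format: "<hex digest>  <file name>"
        lines.append(f"{digest}  {artifact.name}")
    output_path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return output_path
```

The two-space separator matches the format expected by `sha256sum --check`, so users can verify downloaded artifacts directly.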

Verification

  • python -m py_compile scripts/release_checksums.py scripts/release_packaging.py
  • cargo clippy --all-targets --all-features -- -D warnings

Summary by CodeRabbit

  • New Features

    • Added multi-platform package manager support: Homebrew, Scoop, Winget, Chocolatey, and AUR.
    • Automated release packaging workflow for streamlined updates across all platforms.
    • Release artifact checksum generation for enhanced security verification.
  • Chores

    • Added packaging configuration and automation infrastructure.


coderabbitai bot commented Feb 25, 2026

Warning

Rate limit exceeded

@sharno has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 0 minutes and 49 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between fbfa6cd and e8b7faf.

📒 Files selected for processing (5)
  • .github/workflows/packaging-update.yml
  • packaging/aur/PKGBUILD
  • packaging/homebrew/zagel.rb
  • scripts/release_checksums.py
  • scripts/release_packaging.py

Walkthrough

This pull request introduces an automated packaging distribution system for releasing software across multiple package managers. The changes add GitHub Actions workflows that trigger on release publication to automatically generate and update package manifests for Homebrew, Scoop, Winget, Chocolatey, and AUR. Python scripts extract release metadata, generate platform-specific packaging files, and compute SHA256 checksums for artifacts. The workflow creates a pull request with the generated manifests, automating the distribution pipeline.

Sequence Diagram

sequenceDiagram
    participant User as User
    participant GitHub as GitHub Release
    participant Workflow as packaging-update Workflow
    participant Script as Python Script
    participant PkgRepo as Package Repos

    User->>GitHub: Publish Release (vX.Y.Z)
    GitHub->>Workflow: Trigger workflow
    Workflow->>GitHub: Checkout repository
    Workflow->>GitHub: Fetch release metadata & assets
    Workflow->>Script: Execute release_packaging.py
    Script->>Script: Extract Cargo metadata
    Script->>Script: Parse release assets
    Script->>Script: Generate Homebrew formula
    Script->>Script: Generate Scoop manifest
    Script->>Script: Generate Winget manifests
    Script->>Script: Generate Chocolatey manifests
    Script->>Script: Generate AUR PKGBUILD
    Workflow->>GitHub: Create PR with manifests<br/>(automation/packaging-vX.Y.Z)
    PkgRepo->>GitHub: Pull latest manifests
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.
✅ Passed checks (2 passed)
  • Description Check — ✅ Passed: check skipped because CodeRabbit’s high-level summary is enabled.
  • Title check — ✅ Passed: the title clearly and concisely summarizes the main additions, package-manager manifests and release automation infrastructure.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🧹 Nitpick comments (3)
scripts/release_packaging.py (2)

44-44: Cargo.toml is resolved relative to the current working directory.

If the script is invoked from outside the repository root, the lookup fails with a FileNotFoundError. Consider anchoring the path to the script file:

♻️ Proposed fix
-    cargo_data = tomllib.loads(Path("Cargo.toml").read_text(encoding="utf-8"))
+    repo_root = Path(__file__).resolve().parent.parent
+    cargo_data = tomllib.loads((repo_root / "Cargo.toml").read_text(encoding="utf-8"))
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/release_packaging.py` at line 44, The lookup for Cargo.toml uses a
relative Path("Cargo.toml") which fails if the script is run from a different
CWD; update the code that builds cargo_data (the tomllib.loads call reading
Path("Cargo.toml")) to resolve the file relative to the script location (e.g.
compute script_dir = Path(__file__).resolve().parent and read script_dir /
"Cargo.toml") so the file is found regardless of current working directory and
raise a clear error if the file is still missing.

11-11: tomllib is a Python 3.11+ standard-library module.

tomllib was added in Python 3.11. If this script is ever run under Python 3.10 or earlier it will raise ModuleNotFoundError. The CI workflow uses ubuntu-latest (Python 3.12+) so it passes today, but the minimum requirement is undocumented. Consider adding a version guard or falling back to tomli:

♻️ Compatibility shim
-import tomllib
+try:
+    import tomllib
+except ImportError:
+    import tomli as tomllib  # type: ignore[no-redef]  # pip install tomli
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/release_packaging.py` at line 11, The script imports the stdlib
module tomllib which exists only on Python 3.11+, so add a compatibility shim to
fall back to the third‑party tomli when tomllib isn't available: wrap the import
in a try/except ImportError and on exception import tomli as tomllib (or assign
tomli.load / tomli.loads to the same names you use), so subsequent code that
calls tomllib.load/loads continues to work on older Pythons; update any
documentation or CI to note tomli as a runtime dependency if used.
scripts/release_checksums.py (1)

40-52: Consider wrapping exceptions to satisfy Ruff TRY003.

Ruff flags both raise FileNotFoundError(...) (Line 41) and raise RuntimeError(...) (Line 52) for carrying long messages outside a custom exception class.

♻️ Proposed refactor
+class ArtifactError(RuntimeError):
+    pass
+
+
 def main() -> int:
     args = parse_args()
     dist_dir = args.dist_dir
     output_path = args.output

     if not dist_dir.exists() or not dist_dir.is_dir():
-        raise FileNotFoundError(f"dist directory does not exist: {dist_dir}")
+        raise FileNotFoundError(dist_dir)

     files = sorted(
         [
             file
             for file in dist_dir.iterdir()
             if file.is_file() and file.name != output_path.name
         ],
         key=lambda file: file.name,
     )
     if not files:
-        raise RuntimeError(f"no release artifacts found in {dist_dir}")
+        raise ArtifactError(f"no release artifacts found in {dist_dir}")
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/release_checksums.py` around lines 40 - 52, Replace the direct raises
of FileNotFoundError and RuntimeError with a custom exception type to satisfy
Ruff TRY003: define a local exception class (e.g., ReleaseArtifactsError) and
raise ReleaseArtifactsError(f"dist directory does not exist: {dist_dir}")
instead of FileNotFoundError, and raise ReleaseArtifactsError(f"no release
artifacts found in {dist_dir}") instead of RuntimeError; keep the same
conditions that check dist_dir and files and reuse the same identifying names
(dist_dir, output_path, files) so the control flow and messages remain unchanged
but the exceptions are wrapped in the new ReleaseArtifactsError type.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/packaging-update.yml:
- Around line 33-46: Avoid interpolating `${{ inputs.tag }}` directly inside the
run shell block to prevent script injection; instead expose the input as an
environment variable and reference that env var in the script. Concretely, add
an env entry (e.g., TAG: ${{ inputs.tag }}) on the step, then in the run block
replace occurrences of `${{ inputs.tag }}` with "$TAG" (keep existing logic
using github.event_name, github.event.release.tag_name and the fallback `gh
release view`), ensure you quote "$TAG" when assigning to `tag` and when writing
to `GITHUB_OUTPUT`, and preserve the null/empty checks and error exit behavior
for `tag`.

In `@packaging/aur/PKGBUILD`:
- Line 7: Replace the non‑SPDX identifier in the PKGBUILD license field and the
hardcoded string in the release-packaging generator: change license=('MIT'
'Apache') to an SPDX expression like license=('MIT OR Apache-2.0') (or a single
string "MIT OR Apache-2.0" if your PKGBUILD generator expects one element) and
update the hardcoded 'Apache' value in scripts/release_packaging.py to
'Apache-2.0' (or emit the full SPDX expression "MIT OR Apache-2.0") so the
PKGBUILD and generator produce the correct SPDX identifier and RFC16
dual-license expression.

In `@packaging/homebrew/zagel.rb`:
- Around line 19-22: The Linux stanza in the Homebrew formula (the on_linux
block) currently serves the x86_64 tarball unconditionally, which will install
an incompatible binary on ARM Linux hosts; update the on_linux block to guard by
CPU architecture (similar to the on_macos ARM/Intel guards) so that: for
CPU.intel (or CPU.arch == :x86_64) it uses the existing x86_64 URL/sha256 and
for ARM (e.g., CPU.arm? or CPU.arch == :arm64) either provide the appropriate
arm64 URL/sha256 or raise a clear install-time error/skip installation; locate
the on_linux block in zagel.rb and add the CPU-based conditional branches
matching the pattern used in the on_macos stanza.

In `@scripts/release_packaging.py`:
- Around line 269-289: In build_aur_pkgbuild, don’t hardcode the license tuple;
read metadata["license"], normalize it to valid SPDX identifiers (e.g., replace
"Apache" with "Apache-2.0") and split/combine license expressions into the
PKGBUILD license tuple or entries so they are valid SPDX tokens; update the
generated license= line to use the normalized identifiers derived from
metadata["license"] (refer to build_aur_pkgbuild, metadata["license"] and the
license= line in the template) so the AUR package always emits correct SPDX
identifiers rather than the hardcoded ('MIT' 'Apache').
- Around line 35-39: The --repository CLI argument added with
parser.add_argument (args.repository) is dead code because main() and the
builder functions read repository from metadata["repository"]; either remove the
parser.add_argument call and the workflow flag entirely, or thread
args.repository through main() into the functions that build manifests (pass the
repository value into the builder functions that currently use
metadata["repository"] and update those functions to prefer the passed-in
repository when non-empty). Ensure you update any callers and documentation
accordingly so args.repository is either eliminated or actually overrides
metadata["repository"] in the manifest generation.

---

Nitpick comments:
In `@scripts/release_checksums.py`:
- Around line 40-52: Replace the direct raises of FileNotFoundError and
RuntimeError with a custom exception type to satisfy Ruff TRY003: define a local
exception class (e.g., ReleaseArtifactsError) and raise
ReleaseArtifactsError(f"dist directory does not exist: {dist_dir}") instead of
FileNotFoundError, and raise ReleaseArtifactsError(f"no release artifacts found
in {dist_dir}") instead of RuntimeError; keep the same conditions that check
dist_dir and files and reuse the same identifying names (dist_dir, output_path,
files) so the control flow and messages remain unchanged but the exceptions are
wrapped in the new ReleaseArtifactsError type.

In `@scripts/release_packaging.py`:
- Line 44: The lookup for Cargo.toml uses a relative Path("Cargo.toml") which
fails if the script is run from a different CWD; update the code that builds
cargo_data (the tomllib.loads call reading Path("Cargo.toml")) to resolve the
file relative to the script location (e.g. compute script_dir =
Path(__file__).resolve().parent and read script_dir / "Cargo.toml") so the file
is found regardless of current working directory and raise a clear error if the
file is still missing.
- Line 11: The script imports the stdlib module tomllib which exists only on
Python 3.11+, so add a compatibility shim to fall back to the third‑party tomli
when tomllib isn't available: wrap the import in a try/except ImportError and on
exception import tomli as tomllib (or assign tomli.load / tomli.loads to the
same names you use), so subsequent code that calls tomllib.load/loads continues
to work on older Pythons; update any documentation or CI to note tomli as a
runtime dependency if used.
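
The script-injection fix described in the packaging-update.yml comment follows the standard pattern of routing untrusted input through an environment variable; a rough sketch (step name and surrounding logic are illustrative, not the workflow's actual contents):

```yaml
- name: Resolve release tag
  env:
    TAG: ${{ inputs.tag }}  # untrusted input enters the script only via env
  run: |
    tag="$TAG"
    if [ -z "$tag" ]; then
      tag="$(gh release view --json tagName --jq .tagName)"
    fi
    if [ -z "$tag" ]; then
      echo "unable to determine release tag" >&2
      exit 1
    fi
    echo "tag=$tag" >> "$GITHUB_OUTPUT"
```

Because the shell only ever sees `$TAG`, a malicious tag value cannot break out of the script the way a direct `${{ inputs.tag }}` interpolation could.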

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ec73551 and fbfa6cd.

📒 Files selected for processing (15)
  • .github/workflows/packaging-update.yml
  • .github/workflows/release.yml
  • .gitignore
  • packaging/README.md
  • packaging/aur/PKGBUILD
  • packaging/chocolatey/tools/chocolateyinstall.ps1
  • packaging/chocolatey/tools/chocolateyuninstall.ps1
  • packaging/chocolatey/zagel.nuspec
  • packaging/homebrew/zagel.rb
  • packaging/scoop/zagel.json
  • packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.installer.yaml
  • packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.locale.en-US.yaml
  • packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.yaml
  • scripts/release_checksums.py
  • scripts/release_packaging.py

Comment on lines 269 to 289
def build_aur_pkgbuild(
    version: str, metadata: dict[str, str], linux_asset: ReleaseAsset
) -> str:
    return textwrap.dedent(
        f"""\
        pkgname=zagel-bin
        pkgver={version}
        pkgrel=1
        pkgdesc='{metadata["description"]}'
        arch=('x86_64')
        url='{metadata["repository"]}'
        license=('MIT' 'Apache')
        depends=('glibc')
        source=('zagel-v${{pkgver}}-x86_64-unknown-linux-gnu.tar.gz::{linux_asset.url}')
        sha256sums=('{linux_asset.sha256}')

        package() {{
            install -Dm755 "${{srcdir}}/zagel" "${{pkgdir}}/usr/bin/zagel"
        }}
        """
    )

⚠️ Potential issue | 🟡 Minor

license is hardcoded as ('MIT' 'Apache') — non-SPDX and not derived from metadata.

Two problems:

  1. Apache is not a valid SPDX identifier; Apache-2.0 is (see also packaging/aur/PKGBUILD Line 7).
  2. The value is hardcoded rather than being derived from metadata["license"], meaning it will silently go stale if the project's license changes.

Arch Linux uses SPDX license identifiers, and packages should refer to the license using its SPDX license identifier from the SPDX identifiers list.

🐛 Proposed fix
-        license=('MIT' 'Apache')
+        license=('{metadata["license"].replace(" OR ", "' '")}')

Or more correctly, format the Cargo MIT OR Apache-2.0 expression directly as a single SPDX string:

-        license=('MIT' 'Apache')
+        license=('{metadata["license"]}')
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
def build_aur_pkgbuild(
    version: str, metadata: dict[str, str], linux_asset: ReleaseAsset
) -> str:
    return textwrap.dedent(
        f"""\
        pkgname=zagel-bin
        pkgver={version}
        pkgrel=1
        pkgdesc='{metadata["description"]}'
        arch=('x86_64')
        url='{metadata["repository"]}'
        license=('{metadata["license"]}')
        depends=('glibc')
        source=('zagel-v${{pkgver}}-x86_64-unknown-linux-gnu.tar.gz::{linux_asset.url}')
        sha256sums=('{linux_asset.sha256}')

        package() {{
            install -Dm755 "${{srcdir}}/zagel" "${{pkgdir}}/usr/bin/zagel"
        }}
        """
    )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/release_packaging.py` around lines 269 - 289, In build_aur_pkgbuild,
don’t hardcode the license tuple; read metadata["license"], normalize it to
valid SPDX identifiers (e.g., replace "Apache" with "Apache-2.0") and
split/combine license expressions into the PKGBUILD license tuple or entries so
they are valid SPDX tokens; update the generated license= line to use the
normalized identifiers derived from metadata["license"] (refer to
build_aur_pkgbuild, metadata["license"] and the license= line in the template)
so the AUR package always emits correct SPDX identifiers rather than the
hardcoded ('MIT' 'Apache').
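
The normalization this prompt asks for can be sketched as a small helper (normalize_license is hypothetical, not code from this PR; it assumes Cargo-style `OR` expressions):

```python
def normalize_license(cargo_license: str) -> str:
    """Turn a Cargo license expression such as 'MIT OR Apache-2.0' into a
    PKGBUILD license array of SPDX identifiers."""
    # Map known non-SPDX tokens to their SPDX equivalents.
    spdx_fixes = {"Apache": "Apache-2.0"}
    tokens = [spdx_fixes.get(tok, tok) for tok in cargo_license.split(" OR ")]
    # Arch packaging takes one array entry per alternative license.
    return "license=(" + " ".join(f"'{tok}'" for tok in tokens) + ")"
```

With metadata["license"] set to Cargo's `MIT OR Apache-2.0`, this emits `license=('MIT' 'Apache-2.0')`, which is what the generated PKGBUILD should carry.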

@sharno sharno merged commit fe7ae72 into main Feb 25, 2026
7 checks passed
@sharno sharno deleted the opencode/proud-eagle branch February 25, 2026 06:57