Add package-manager manifests and release automation #26
Conversation
Walkthrough

This pull request introduces an automated packaging distribution system for releasing software across multiple package managers. The changes add GitHub Actions workflows that trigger on release publication to automatically generate and update package manifests for Homebrew, Scoop, Winget, Chocolatey, and AUR. Python scripts extract release metadata, generate platform-specific packaging files, and compute SHA256 checksums for artifacts. The workflow creates a pull request with the generated manifests, automating the distribution pipeline.

Sequence Diagram

```mermaid
sequenceDiagram
    participant User as User
    participant GitHub as GitHub Release
    participant Workflow as packaging-update Workflow
    participant Script as Python Script
    participant PkgRepo as Package Repos
    User->>GitHub: Publish Release (vX.Y.Z)
    GitHub->>Workflow: Trigger workflow
    Workflow->>GitHub: Checkout repository
    Workflow->>GitHub: Fetch release metadata & assets
    Workflow->>Script: Execute release_packaging.py
    Script->>Script: Extract Cargo metadata
    Script->>Script: Parse release assets
    Script->>Script: Generate Homebrew formula
    Script->>Script: Generate Scoop manifest
    Script->>Script: Generate Winget manifests
    Script->>Script: Generate Chocolatey manifests
    Script->>Script: Generate AUR PKGBUILD
    Workflow->>GitHub: Create PR with manifests<br/>(automation/packaging-vX.Y.Z)
    PkgRepo->>GitHub: Pull latest manifests
```
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 5
🧹 Nitpick comments (3)
scripts/release_packaging.py (2)
**44-44:** `Cargo.toml` is resolved relative to the current working directory.

If the script is invoked from outside the repository root the lookup silently fails with a `FileNotFoundError`. Consider anchoring the path to the script file:

♻️ Proposed fix

```diff
-cargo_data = tomllib.loads(Path("Cargo.toml").read_text(encoding="utf-8"))
+repo_root = Path(__file__).resolve().parent.parent
+cargo_data = tomllib.loads((repo_root / "Cargo.toml").read_text(encoding="utf-8"))
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `scripts/release_packaging.py` at line 44: the lookup for Cargo.toml uses a relative `Path("Cargo.toml")`, which fails if the script is run from a different CWD. Update the code that builds `cargo_data` (the `tomllib.loads` call reading `Path("Cargo.toml")`) to resolve the file relative to the script location (e.g. compute `script_dir = Path(__file__).resolve().parent` and read `script_dir / "Cargo.toml"`) so the file is found regardless of current working directory, and raise a clear error if the file is still missing.
**11-11:** `tomllib` is a Python 3.11+ standard-library module.

`tomllib` was added in Python 3.11. If this script is ever run under Python 3.10 or earlier it will raise `ModuleNotFoundError`. The CI workflow uses `ubuntu-latest` (Python 3.12+) so it passes today, but the minimum requirement is undocumented. Consider adding a version guard or falling back to `tomli`:

♻️ Compatibility shim

```diff
-import tomllib
+try:
+    import tomllib
+except ImportError:
+    import tomli as tomllib  # type: ignore[no-redef]  # pip install tomli
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `scripts/release_packaging.py` at line 11: the script imports the stdlib module `tomllib`, which exists only on Python 3.11+. Add a compatibility shim to fall back to the third-party `tomli` when `tomllib` isn't available: wrap the import in a try/except ImportError and on exception import `tomli as tomllib` (or assign `tomli.load`/`tomli.loads` to the same names you use), so subsequent code that calls `tomllib.load`/`loads` continues to work on older Pythons. Update any documentation or CI to note `tomli` as a runtime dependency if used.

scripts/release_checksums.py (1)
**40-52:** Consider wrapping exceptions to satisfy Ruff TRY003.

Ruff flags both `raise FileNotFoundError(...)` (line 41) and `raise RuntimeError(...)` (line 52) for carrying long messages outside a custom exception class.

♻️ Proposed refactor

```diff
+class ArtifactError(RuntimeError):
+    pass
+
+
 def main() -> int:
     args = parse_args()
     dist_dir = args.dist_dir
     output_path = args.output
     if not dist_dir.exists() or not dist_dir.is_dir():
-        raise FileNotFoundError(f"dist directory does not exist: {dist_dir}")
+        raise FileNotFoundError(dist_dir)
     files = sorted(
         [
             file
             for file in dist_dir.iterdir()
             if file.is_file() and file.name != output_path.name
         ],
         key=lambda file: file.name,
     )
     if not files:
-        raise RuntimeError(f"no release artifacts found in {dist_dir}")
+        raise ArtifactError(f"no release artifacts found in {dist_dir}")
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `scripts/release_checksums.py` around lines 40-52: replace the direct raises of FileNotFoundError and RuntimeError with a custom exception type to satisfy Ruff TRY003. Define a local exception class (e.g. `ReleaseArtifactsError`) and raise `ReleaseArtifactsError(f"dist directory does not exist: {dist_dir}")` instead of FileNotFoundError, and `ReleaseArtifactsError(f"no release artifacts found in {dist_dir}")` instead of RuntimeError. Keep the same conditions that check `dist_dir` and `files`, and reuse the same names (`dist_dir`, `output_path`, `files`) so the control flow and messages remain unchanged but the exceptions are wrapped in the new `ReleaseArtifactsError` type.
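For context, the TRY003 pattern moves the long message into the exception class itself so call sites stay short. A minimal sketch, with the class name taken from the review's suggestion:

```python
class ReleaseArtifactsError(RuntimeError):
    """Raised when the dist directory contains no release artifacts."""

    def __init__(self, dist_dir: object) -> None:
        # The message template lives here, in one place, which is the
        # shape Ruff TRY003 nudges toward.
        super().__init__(f"no release artifacts found in {dist_dir}")
```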
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/packaging-update.yml:
- Around line 33-46: Avoid interpolating `${{ inputs.tag }}` directly inside the
run shell block to prevent script injection; instead expose the input as an
environment variable and reference that env var in the script. Concretely, add
an env entry (e.g., TAG: ${{ inputs.tag }}) on the step, then in the run block
replace occurrences of `${{ inputs.tag }}` with "$TAG" (keep existing logic
using github.event_name, github.event.release.tag_name and the fallback `gh
release view`), ensure you quote "$TAG" when assigning to `tag` and when writing
to `GITHUB_OUTPUT`, and preserve the null/empty checks and error exit behavior
for `tag`.
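A sketch of that env-var pattern. The step name is illustrative and the `gh release view` fallback is simplified away; this is not the workflow's actual step:

```yaml
- name: Resolve release tag
  env:
    TAG: ${{ inputs.tag }}
    RELEASE_TAG: ${{ github.event.release.tag_name }}
  run: |
    tag="${TAG:-$RELEASE_TAG}"
    if [ -z "$tag" ]; then
      echo "unable to resolve release tag" >&2
      exit 1
    fi
    echo "tag=$tag" >> "$GITHUB_OUTPUT"
```

Because the untrusted value only ever reaches the script through an environment variable, a tag like `"; rm -rf ."` is treated as data rather than shell syntax.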
In `@packaging/aur/PKGBUILD`:
- Line 7: Replace the non‑SPDX identifier in the PKGBUILD license field and the
hardcoded string in the release-packaging generator: change license=('MIT'
'Apache') to an SPDX expression like license=('MIT OR Apache-2.0') (or a single
string "MIT OR Apache-2.0" if your PKGBUILD generator expects one element) and
update the hardcoded 'Apache' value in scripts/release_packaging.py to
'Apache-2.0' (or emit the full SPDX expression "MIT OR Apache-2.0") so the
PKGBUILD and generator produce the correct SPDX identifiers and dual-license expression.
In `@packaging/homebrew/zagel.rb`:
- Around line 19-22: The Linux stanza in the Homebrew formula (the on_linux
block) currently serves the x86_64 tarball unconditionally, which will install
an incompatible binary on ARM Linux hosts; update the on_linux block to guard by
CPU architecture (similar to the on_macos ARM/Intel guards) so that: for
CPU.intel (or CPU.arch == :x86_64) it uses the existing x86_64 URL/sha256 and
for ARM (e.g., CPU.arm? or CPU.arch == :arm64) either provide the appropriate
arm64 URL/sha256 or raise a clear install-time error/skip installation; locate
the on_linux block in zagel.rb and add the CPU-based conditional branches
matching the pattern used in the on_macos stanza.
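The guarded stanza could look roughly like this sketch. The checksum is a placeholder, the URL follows the asset naming seen elsewhere in the PR, and whether to fail loudly or simply omit the ARM branch is a project decision:

```ruby
on_linux do
  on_intel do
    url "https://github.com/Sharno/Zagel/releases/download/v#{version}/zagel-v#{version}-x86_64-unknown-linux-gnu.tar.gz"
    sha256 "REPLACE_WITH_X86_64_SHA256"
  end
  on_arm do
    # No arm64 Linux artifact is published yet; fail with a clear message
    # rather than installing an incompatible binary.
    odie "zagel does not provide a Linux arm64 binary yet"
  end
end
```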
In `@scripts/release_packaging.py`:
- Around line 269-289: In build_aur_pkgbuild, don’t hardcode the license tuple;
read metadata["license"], normalize it to valid SPDX identifiers (e.g., replace
"Apache" with "Apache-2.0") and split/combine license expressions into the
PKGBUILD license tuple or entries so they are valid SPDX tokens; update the
generated license= line to use the normalized identifiers derived from
metadata["license"] (refer to build_aur_pkgbuild, metadata["license"] and the
license= line in the template) so the AUR package always emits correct SPDX
identifiers rather than the hardcoded ('MIT' 'Apache').
- Around line 35-39: The --repository CLI argument added with
parser.add_argument (args.repository) is dead code because main() and the
builder functions read repository from metadata["repository"]; either remove the
parser.add_argument call and the workflow flag entirely, or thread
args.repository through main() into the functions that build manifests (pass the
repository value into the builder functions that currently use
metadata["repository"] and update those functions to prefer the passed-in
repository when non-empty). Ensure you update any callers and documentation
accordingly so args.repository is either eliminated or actually overrides
metadata["repository"] in the manifest generation.
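The license normalization described for `build_aur_pkgbuild` could look like this sketch. The `_SPDX_FIXES` mapping and function name are assumptions, not code from the PR:

```python
# Map non-SPDX tokens that appear in Cargo metadata to valid identifiers.
_SPDX_FIXES = {"Apache": "Apache-2.0"}


def pkgbuild_license(expression: str) -> str:
    """Turn a Cargo license expression like 'MIT OR Apache-2.0' into a
    PKGBUILD license array of valid SPDX identifiers."""
    tokens = [
        _SPDX_FIXES.get(token, token)
        # Cargo accepts both "A OR B" and the legacy "A/B" separator.
        for token in expression.replace("/", " OR ").split(" OR ")
    ]
    return "license=(" + " ".join(f"'{t}'" for t in tokens) + ")"
```

Reading the expression from `metadata["license"]` and passing it through a helper like this keeps the PKGBUILD in sync if the project's license ever changes.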
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (15)
- .github/workflows/packaging-update.yml
- .github/workflows/release.yml
- .gitignore
- packaging/README.md
- packaging/aur/PKGBUILD
- packaging/chocolatey/tools/chocolateyinstall.ps1
- packaging/chocolatey/tools/chocolateyuninstall.ps1
- packaging/chocolatey/zagel.nuspec
- packaging/homebrew/zagel.rb
- packaging/scoop/zagel.json
- packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.installer.yaml
- packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.locale.en-US.yaml
- packaging/winget/manifests/s/Sharno/Zagel/0.3.0/Sharno.Zagel.yaml
- scripts/release_checksums.py
- scripts/release_packaging.py
```python
def build_aur_pkgbuild(
    version: str, metadata: dict[str, str], linux_asset: ReleaseAsset
) -> str:
    return textwrap.dedent(
        f"""\
        pkgname=zagel-bin
        pkgver={version}
        pkgrel=1
        pkgdesc='{metadata["description"]}'
        arch=('x86_64')
        url='{metadata["repository"]}'
        license=('MIT' 'Apache')
        depends=('glibc')
        source=('zagel-v${{pkgver}}-x86_64-unknown-linux-gnu.tar.gz::{linux_asset.url}')
        sha256sums=('{linux_asset.sha256}')

        package() {{
            install -Dm755 "${{srcdir}}/zagel" "${{pkgdir}}/usr/bin/zagel"
        }}
        """
    )
```
license is hardcoded as `('MIT' 'Apache')`: non-SPDX and not derived from metadata.

Two problems:

1. `Apache` is not a valid SPDX identifier; `Apache-2.0` is (see also `packaging/aur/PKGBUILD` line 7).
2. The value is hardcoded rather than derived from `metadata["license"]`, meaning it will silently go stale if the project's license changes.

Arch Linux uses SPDX license identifiers, and packages should refer to the license using its SPDX identifier from the SPDX license list.
🐛 Proposed fix

```diff
-        license=('MIT' 'Apache')
+        license=('{metadata["license"].replace(" OR ", "' '")}')
```

Or more correctly, format the Cargo `MIT OR Apache-2.0` expression directly as a single SPDX string:

```diff
-        license=('MIT' 'Apache')
+        license=('{metadata["license"]}')
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def build_aur_pkgbuild(
    version: str, metadata: dict[str, str], linux_asset: ReleaseAsset
) -> str:
    return textwrap.dedent(
        f"""\
        pkgname=zagel-bin
        pkgver={version}
        pkgrel=1
        pkgdesc='{metadata["description"]}'
        arch=('x86_64')
        url='{metadata["repository"]}'
        license=('{metadata["license"]}')
        depends=('glibc')
        source=('zagel-v${{pkgver}}-x86_64-unknown-linux-gnu.tar.gz::{linux_asset.url}')
        sha256sums=('{linux_asset.sha256}')

        package() {{
            install -Dm755 "${{srcdir}}/zagel" "${{pkgdir}}/usr/bin/zagel"
        }}
        """
    )
```
Summary

- Add a `packaging/` source-of-truth directory with generated manifests for Homebrew, Scoop, Winget, Chocolatey, and AUR using the latest release metadata
- Add `scripts/release_packaging.py` to regenerate those manifests from `gh release view --json tagName,assets` output
- Output `SHA256SUMS` via `scripts/release_checksums.py`, and add `packaging-update.yml` to open a PR with refreshed manifests after each published release

Verification

- `python -m py_compile scripts/release_checksums.py scripts/release_packaging.py`
- `cargo clippy --all-targets --all-features -- -D warnings`

Summary by CodeRabbit
New Features
Chores