# ai-audit

Here are 12 public repositories matching this topic...

PolicyBind is an AI Policy-as-Code platform that enables organizations to govern AI usage through a unified model registry, real-time token-based access control, and automated compliance reporting for frameworks like the EU AI Act and NIST AI RMF.

  • Updated Jan 15, 2026
  • Python
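
PolicyBind's actual API isn't shown on this page, but the idea its description names, real-time token-based access control against a model registry, can be sketched in a few lines of Python. Everything below (the registry contents, `check_access`, the budget numbers) is an illustrative assumption, not PolicyBind's real interface.

```python
# Illustrative sketch only: not PolicyBind's real API.
# A registry maps model IDs to a policy with an allowed-purpose list
# and a daily token budget; each request is checked and metered.
from dataclasses import dataclass


@dataclass
class ModelPolicy:
    allowed_purposes: set[str]   # e.g. {"internal-analytics"}
    daily_token_budget: int      # hard cap on tokens per day
    tokens_used_today: int = 0   # running usage counter


REGISTRY = {
    "gpt-4o": ModelPolicy({"customer-support", "internal-analytics"}, 500_000),
    "local-llama": ModelPolicy({"internal-analytics"}, 2_000_000),
}


def check_access(model_id: str, purpose: str, requested_tokens: int) -> bool:
    """Allow the call only if the model is registered, the purpose is
    permitted, and the request fits inside the remaining token budget."""
    policy = REGISTRY.get(model_id)
    if policy is None or purpose not in policy.allowed_purposes:
        return False
    if policy.tokens_used_today + requested_tokens > policy.daily_token_budget:
        return False
    policy.tokens_used_today += requested_tokens
    return True


print(check_access("gpt-4o", "customer-support", 1_200))  # True
print(check_access("gpt-4o", "marketing-copy", 1_200))    # False: purpose not allowed
```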

AURUM is a security-first Rust ledger framework that combines cryptographic integrity, information-theoretic auditing, and verifiable AI output tracking. It integrates AEON entropy/compressibility scoring with domain-separated BLAKE3 hashing, panic-safe canonical encoding, dual-root block validation, and a minimal HTTP node for verifiable model or …

  • Updated Nov 2, 2025
  • Rust
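
AURUM itself is written in Rust and its BLAKE3/AEON internals aren't reproduced here; the sketch below only illustrates two of the ideas its description names, domain-separated hashing and entropy/compressibility scoring, using Python's standard library. BLAKE2's personalization parameter stands in for BLAKE3 domain separation, and a zlib compression ratio stands in for the entropy score; all names are hypothetical.

```python
# Illustrative sketch only, not AURUM's implementation.
# Domain separation: the same bytes hashed under different "domains"
# yield unrelated digests, so an AI-output hash can never be confused
# with a block-header hash. hashlib.blake2b's `person` parameter
# (<= 16 bytes) stands in for BLAKE3's domain-separation mechanism.
import hashlib
import zlib


def domain_hash(domain: bytes, payload: bytes) -> str:
    return hashlib.blake2b(payload, person=domain).hexdigest()


def compressibility_score(payload: bytes) -> float:
    """Crude stand-in for entropy scoring: ratio of compressed to raw size.
    Values near 1.0 suggest high-entropy (random-looking) data; values
    near 0.0 suggest highly redundant data."""
    if not payload:
        return 0.0
    return min(1.0, len(zlib.compress(payload, 9)) / len(payload))


output = b'{"model": "demo", "answer": "42"}'
print(domain_hash(b"ai-output-v1", output))   # digest in the "AI output" domain
print(domain_hash(b"block-hdr-v1", output))   # different digest, same bytes
print(round(compressibility_score(output), 2))
```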

Hardened public release of the KAIROS invocation governance framework. Includes invocation terms, ethical compliance clauses, regulatory mapping, and sample outputs. Licensed under CC BY-NC-ND 4.0. License note: do not auto-generate a license via GitHub; use the hardened License.txt.

  • Updated Jul 10, 2025
  • Python

ChainWatch is a flight data recorder for multi-step AI systems. It's a CLI-based tool that records every step in an AI decision chain, links them together in order, prevents tampering, and allows you to verify the chain's integrity and replay the full decision flow.

  • Updated Jan 22, 2026
  • Python
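
ChainWatch's CLI and record format aren't documented on this page, but the general pattern its description points at, a hash-chained, tamper-evident log of decision steps, can be sketched as follows. The record layout and function names are assumptions for illustration, not ChainWatch's actual format.

```python
# Illustrative sketch only: a hash-chained decision log, not ChainWatch's format.
# Each record commits to the previous record's hash, so editing or
# reordering any earlier step breaks verification of every later one.
import hashlib
import json


def _digest(prev_hash: str, payload: dict) -> str:
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256((prev_hash + canonical).encode()).hexdigest()


def append_step(chain: list, payload: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": _digest(prev_hash, payload)})


def verify(chain: list) -> bool:
    prev_hash = "0" * 64
    for record in chain:
        if record["prev"] != prev_hash:
            return False
        if record["hash"] != _digest(prev_hash, record["payload"]):
            return False
        prev_hash = record["hash"]
    return True


chain = []
append_step(chain, {"step": "retrieve", "docs": 3})
append_step(chain, {"step": "generate", "model": "demo-llm"})
print(verify(chain))                    # True
chain[0]["payload"]["docs"] = 30        # tamper with an earlier step
print(verify(chain))                    # False
```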
