AI-powered child safety moderation for Roblox — detects grooming, bullying, and unsafe content in real-time chat
Updated Mar 5, 2026
Claude Code skill for integrating the Tuteliq child safety API — SDK setup, endpoint selection, code examples, error handling, and best practices
Homebrew tap for installing the Tuteliq CLI — AI-powered child safety tools for macOS and Linux
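Installation from a Homebrew tap typically follows the standard `brew tap` / `brew install` pattern; the tap path and formula name below are assumptions — check the tap repository's README for the actual names.

```shell
# ASSUMED tap path and formula name; verify against the tap's README.
brew tap tuteliq/tap
brew install tuteliq
tuteliq --version   # confirm the CLI is on PATH
```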