"He started playing on Roblox… When he was 12, he was targeted by an adult sex predator… the man encouraged him to move their conversations to Discord… On Discord, the man increasingly demanded explicit photographs and videos and threatened the child…"
— NBC New York, September 2025, describing a lawsuit against Roblox and Discord following a teen's suicide.
There are many other stories like this. It is heartbreaking to see some of the most popular platforms for children do so little to combat this, and worse, to see young people left without the tools or knowledge to protect themselves.
We decided we couldn't watch this happen. So we built Guardiobot — a security service for both Discord and Roblox designed to detect and prevent abusive behaviour automatically, with a human review layer for everything that needs a second opinion.
This exact scenario shaped specific features in Guardiobot:
- Cross-server enforcement — when a bad actor is banned in one server, that ban propagates automatically across every server in our network. Joining any protected server triggers an immediate check for active punishments and alternate accounts.
- Grooming pattern detection — detection rules target escalation tactics: requests to move to DMs or private channels, requests for personal information, isolation attempts, and gradual boundary testing.
- Platform bridging — Guardiobot monitors both Discord and Roblox. Predators frequently use Roblox to identify targets before moving to Discord for exploitation — exactly the pattern described in the NBC New York case.
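To make the grooming-pattern detection concrete, here is a minimal, illustrative sketch of a rule-based escalation scorer. The rule names, regex patterns, and `needs_review` threshold are all hypothetical examples for this post; they are not Guardiobot's actual detection rules, which combine far more signals and always route matches to human review.

```python
import re
from dataclasses import dataclass, field

# Hypothetical escalation signals, for illustration only.
# Each maps a signal name to a pattern that suggests a known grooming tactic.
ESCALATION_RULES = {
    "move_to_private": re.compile(r"\b(dm me|add me on|let'?s talk private)\b", re.I),
    "personal_info":   re.compile(r"\b(how old are you|where do you live|what school)\b", re.I),
    "isolation":       re.compile(r"\b(don'?t tell (your )?(parents|anyone)|our (little )?secret)\b", re.I),
}

@dataclass
class ConversationScore:
    """Accumulates escalation signals across a conversation."""
    signals: list = field(default_factory=list)

    def scan(self, message: str) -> None:
        # Record every rule that matches this message.
        for name, pattern in ESCALATION_RULES.items():
            if pattern.search(message):
                self.signals.append(name)

    def needs_review(self, threshold: int = 2) -> bool:
        # Flag for human review once enough *distinct* signal types appear,
        # so a single keyword never triggers enforcement on its own.
        return len(set(self.signals)) >= threshold
```

The key design choice, which any real system shares, is that no single message is damning on its own; it is the accumulation of distinct escalation tactics over time that raises a conversation for a human second opinion.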
Our goal is to reach communities and developers globally. Every person online has the right to a safe experience, and we are building the infrastructure to make that possible.