Coming Soon!
Children deserve better protection online.
Guardiobot is a volunteer-run safety and moderation platform for Discord and Roblox communities, built to create a safer environment on two of the most popular platforms for young users.
Roblox employs 3,000 moderators for a platform with over 100 million daily active users. Discord's native tools are reactive — they respond after harm has already occurred. Neither platform can watch every server, every message, every join.
The most common pattern is well documented: contact made on Roblox, conversation moved to Discord, exploitation follows. Predators rely on the gap between platforms. Guardiobot is built to close it.
Why we built this
Roblox and Discord are separate platforms with separate moderation systems. Predators exploit that and move children off-platform to private, unmonitored conversations.
A child is playing a Roblox game. A stranger joins, starts a conversation. It begins with in-game chat — compliments, shared interests, offers to trade items or help.
The conversation shifts. "Let's talk on Discord where it's easier." Roblox's moderation ends. The child is now in a private channel, a server, or a DM — watched by no one.
Without cross-platform awareness, Discord servers have no way to know a user arrived from Roblox, what their behaviour there looked like, or whether they've done this before.
Guardiobot's response
We draw on data from partner projects like Rotector, which maintains records of known offenders, and give communities the tools they need to take action and protect themselves.
Guardiobot combines automated detection, human review, and cross-platform intelligence into a single service so server owners don't have to choose between them or wire them together themselves.
01
Every message is scored for severity. Every join is checked against our databases of known offenders. Threats are acted on before a human has to see them.
02
Flagged reports go to trained Trust & Safety volunteers who review and make a decision on each case.
03
Safety history persists across servers and platforms. A bad actor can't start fresh by hopping to a new server, switching from Roblox to Discord, or creating a new account.
Updated every 15 minutes from our live systems.
Discord communities actively running Guardiobot.
Cases reviewed by volunteers in the last 30 days.
Harmful matches automatically stopped in the last 30 days.
The people behind it
Guardiobot isn't 100% automated, and that's intentional. Using AI for tasks like report handling would be faster and more efficient, but AI struggles to understand context and can be costly for a small project like ours to implement. We believe decisions should be made based on understanding, not speed.
Volunteers aren't paid for what they do. They show up every week because they share the same passion we do: creating a safer internet.
Purpose-built detection for the threats, both common and rare, found in communities where younger audiences are present.
From the server owners and moderators using Guardiobot every day.
Coming Soon!
Coming Soon!
Coming Soon!
Join Guardiobot and help us build a future with safer platforms for everyone.