Google, OpenAI, Roblox, and Discord Launch ROOST for Online Child Safety
![A digital illustration of children using laptops and tablets in a secure online environment. Floating around them are protective icons like shields, locks, and AI symbols, representing digital safety measures. In the background, the logos of Google, OpenAI, Roblox, and Discord are subtly displayed, highlighting their collaboration on the ROOST initiative. The scene is bright and friendly, emphasizing a safe, tech-enabled space for young internet users.](https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/70f5650f-6cf7-47ee-bfed-0e3e659e2134/Google__OpenAI__Roblox__and_Discord_Launch_ROOST_for_Online_Child_Safety.jpg?t=1739299648)
Image Source: ChatGPT-4o
Tech giants Google, OpenAI, Roblox, and Discord have joined forces to create a new non-profit organization aimed at enhancing online child safety. The Robust Open Online Safety Tools (ROOST) initiative will provide companies with free, open-source AI tools designed to identify, review, and report child sexual abuse material (CSAM).
The initiative responds to the growing challenges posed by generative AI advancements, which have transformed online environments and heightened the need for robust child protection mechanisms. According to founding ROOST partner and former Google CEO Eric Schmidt, ROOST aims to address “a critical need to accelerate innovation in online child safety.”
Building a Safer Internet Through Collaboration
ROOST’s mission is to make core safety technologies more accessible and transparent. By leveraging AI tools from its founding companies, the initiative seeks to create a unified, open-source solution that other companies can easily adopt.
"Starting with a platform focused on child protection, ROOST’s collaborative, open-source approach will foster innovation and make essential infrastructure more transparent, accessible, and inclusive, with the goal of creating a safer internet for everyone," Schmidt said.
The announcement comes amid increasing regulatory pressure on tech companies to improve child safety on their platforms. Many companies are adopting self-regulation strategies to address these concerns proactively, hoping to avoid stricter government intervention.
Addressing a Growing Problem
The urgency of ROOST’s mission is underscored by alarming statistics from the National Center for Missing and Exploited Children (NCMEC), which reported a 12% increase in suspected child exploitation cases between 2022 and 2023. Platforms like Roblox and Discord have faced repeated criticism for failing to adequately protect children from exploitation and inappropriate content.
Roblox, whose platform as of 2020 reached over half of U.S. children, has been scrutinized for inadequate safeguards against child sexual exploitation and inappropriate content.
Discord and Roblox were both named in a 2022 lawsuit alleging that they failed to prevent adults from messaging children without proper supervision.
What ROOST Will Offer
Founding members are contributing funding, tools, and expertise to the project. ROOST will focus on combining existing detection and reporting technologies from its member organizations into a streamlined, unified solution. This could include:
API-based AI Moderation Systems: Tools that companies can integrate into their platforms to detect harmful content.
Cross-Platform Information Sharing: Building on projects like Lantern, which Discord joined in 2023 alongside Meta and Google.
Open-Sourced AI Models: Updates to Roblox’s AI for detecting profanity, bullying, sexting, and other inappropriate content in audio clips, set to be open-sourced later this year.
While specific details on how ROOST’s tools will interact with existing CSAM detection systems like Microsoft’s PhotoDNA are still unclear, the initiative promises to bridge gaps in current safety infrastructure.
Discord’s Chief Legal Officer Clint Smith said, “We’re committed to making the entire internet, not just Discord, a better and safer place, especially for young people.”
Industry-Wide Impact and Funding
ROOST has raised over $27 million to support its operations for the first four years, backed by philanthropic organizations like the McGovern Foundation, the Future of Online Trust and Safety Fund, the Knight Foundation, and the AI Collaborative. The organization will also draw on expertise from specialists in child safety, AI, open-source technology, and countering violent extremism.
What This Means
The formation of ROOST represents a significant step in the tech industry’s effort to self-regulate and improve online child safety. By providing open-source AI tools and fostering collaboration among major tech companies, the initiative aims to set new standards for protecting children in digital spaces.
However, the success of ROOST will depend on its ability to integrate these tools effectively across diverse platforms and ensure that they are adaptable to rapidly evolving online threats. While the initiative signals a proactive approach to child safety, it also reflects the increasing pressure tech companies face from both the public and regulators to prioritize user protection over rapid growth.
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.