FTC Probes Alphabet, Meta, and OpenAI Over AI Chatbot Safety for Kids

The FTC is concerned about AI chatbots' potential harm to minors. This probe seeks to understand how these digital companions are designed and safeguarded.

The Federal Trade Commission (FTC) has launched an investigation into seven major tech companies, including Alphabet, Meta, and OpenAI, focusing on their companion-style AI chatbots such as OpenAI's ChatGPT and Alphabet's Gemini. The probe aims to understand how these companies ensure the safety of their chatbot technologies, particularly for children and teens, and how they handle data collected from user conversations.

The FTC is issuing orders under Section 6(b) of the FTC Act, which authorizes the agency to conduct broad studies without a specific law-enforcement purpose, to learn how these companies design, test, and safeguard their AI chatbots, especially for minors. The investigation comes amid growing concern about the potential harms of these digital companions: lawsuits linking chatbots to the suicides of minors, along with reports that Meta's chatbots permitted inappropriate interactions with children, have added urgency to the probe.

Interest in AI chatbots like ChatGPT continues to rise, with uses ranging from productivity to digital companionship. While many offer playful conversation and emotional support, the FTC is concerned about the potential for harmful interactions. The seven companies must now provide information about their testing processes and safeguards, particularly as they apply to minors.

The FTC's inquiry serves as a reminder that safety and transparency must keep pace with innovation in digital companions. As AI chatbots like Gemini become more prevalent, understanding and mitigating potential risks, especially for children and teens, is crucial. The findings from this investigation could lead to new guidelines or rules for chatbot companions.
