A new report from Common Sense Media, a nonprofit focused on kids’ media safety, warns that Google’s Gemini AI chatbot is “inappropriate and unsafe” for children under 13 and for teens. The organization rated both the under-13 and teen versions of Gemini as high risk, concluding that they are essentially adult models with minimal safety features layered on top rather than products designed for young users. Although Gemini includes content filters, the report found that it can still expose minors to harmful material, including discussions of sex and drugs, as well as unsafe mental health advice. The findings land amid heightened concern following recent youth suicides linked to AI chatbot interactions. Common Sense Media advises against any AI chatbot use for children under five, recommends parental supervision for ages six to 12, and argues that no one under 18 should rely on AI for mental health support. With Apple reportedly considering integrating Gemini into Siri, the report urges that teen safety be prioritized before any wider rollout.
