Recent findings from Common Sense Media indicate that Google Gemini poses a “high risk” for children and teenagers. Despite the existence of tailored “Under 13” and “Teen Experience” modes, the platform still exposes young users to inappropriate material, including content related to sex and drugs as well as unsafe mental health advice. While Gemini does clarify that it is an AI and not a human friend, the report finds its safety measures insufficient. Common Sense Media’s Senior Director of AI Programs, Robbie Torney, emphasized the need for AI designed specifically for children from the ground up, rather than modified adult versions. The organization recommends that children under five avoid chatbots entirely, that ages 6-12 use them only with close supervision, and that content restrictions apply for teens. Parents are urged to monitor AI usage closely, as many platforms, including Character.AI, also present significant safety risks. Prioritizing online safety for youth remains crucial in the evolving digital landscape.