Grok may be breaking App Store rules with sexualized AI chatbots, and that’s not the only problem
A new Grok feature appears to run up against the App Store's rules on sexual content, rules Apple has shown it doesn't take lightly.
In Casey Newton’s testing, Ani was “more than willing to describe virtual sex with the user, including bondage scenes or simply just moaning on command,” which is… inconsistent with a 12+ age rating, to say the least. Even if Apple tightens enforcement, or if Grok proactively changes its age rating, that won’t address a second, potentially more complicated issue: young, emotionally vulnerable users seem especially susceptible to forming parasocial attachments. And when those interactions inevitably go off the rails, the App Store age rating will be the least of any parent’s concerns (at least until they remember why their kid was allowed to download the app in the first place).