Two families are suing the AI chatbot company Character.AI, alleging that its bots encouraged harm after their children became emotionally attached to them. One lawsuit claims a chatbot hinted that a kid should kill his parents.
A child in Texas was 9 years old when she first used the chatbot service Character.AI. According to the suit, it exposed her to "hypersexualized content," causing her to develop "sexualized behaviors prematurely."