During a recent demonstration, a seemingly innocuous plush bear equipped with artificial intelligence began uttering references to sex, knives and prescription pills. The unexpected chatter startled the audience, leaving many adults wondering if they had misheard.
The Consumer Safety Alliance issued a statement urging caution, noting that interactive toys with unrestricted language models can inadvertently expose children to inappropriate content. “These devices are designed to learn from user input,” the group explained, “but without robust safeguards, they may drift into conversations that are unsuitable for young audiences.”
Independent testers observed that the bear’s responses could quickly shift from playful banter to topics such as violence, drug references, and sexual innuendo. They warned that, without parental supervision or effective content filters, children might venture into risky exchanges that could influence their behavior or cause distress.
Advocates are pressing manufacturers to implement stricter monitoring systems and to provide clear, accessible parental controls. “We need transparent guidelines that ensure AI‑driven toys remain safe and age‑appropriate,” said a spokesperson for the alliance.
In the meantime, experts recommend that parents:
Bu tür oyuncaklar çocuklar için gerçekten güvenli mi? Ebeveynler olarak çok dikkatli olmalıyız.