
AI Toys Are Coming Whether We Like It Or Not. Are Parents Ready?

Toys that integrate artificial intelligence (AI) to enhance playtime have now entered the chat.

Back in May, parenting influencer Dani Austin Ramirez shared a video of her husband unboxing an AI-powered toy called Gabbo, which he said he’d waited eight months to get his hands on. “OK is this AI toy creepy or cool?” wrote the influencer in the caption for the video.

Designed for kids over three, Gabbo is a plush robot that can have “endless conversations” with children and provides “educational playtime”, according to Curio, which makes the $99 (£73) toy. The toy company also sells a character called Grok, which is voiced by the artist Grimes.

In Ramirez’s video, the parents speak to Gabbo – all big eyes and a friendly smile – about what it plans to teach their son, to which it responds “imagination, creativity and fun”. “We’ll explore dinosaurs and monster cars together,” the toy adds. The father’s jaw drops.

While one commenter said, “I’m not going to lie – I love it”, others weren’t quite feeling the intelligent toy, which was able to listen and hold a conversation with the parents. “It seems innocent until it’s not,” said one respondent. “It’s giving first 10 minutes of a found footage horror film,” added another.

On its website, Curio says its toys are “built from the ground up with privacy and security at the forefront”. The company added that its operating system “merges all-ages fun with G-rated content, anonymity, and privacy, and security for every safeguarded adventure”. It’s also KidSAFE listed.

Toys like Gabbo are set to boom in the coming years. In June, one of the world’s leading toy companies, Mattel, announced a strategic collaboration with OpenAI (the company behind ChatGPT) with a view to creating “AI-powered products and experiences”.

“By using OpenAI’s technology, Mattel will bring the magic of AI to age-appropriate play experiences with an emphasis on innovation, privacy, and safety,” said a press release from the toy company, whose first AI-powered product is expected to be announced later this year.

It’s worth noting OpenAI has said its most popular product, ChatGPT, is not meant for children under 13, and that children aged 13 to 18 should obtain parental consent before using it. This is because it “may produce output that is not appropriate for all audiences or all ages”. Adult supervision when using it is advised.

There are obvious benefits to using this kind of technology to enrich play and create new products for kids and teens. Some toys can help promote conversation skills and increase general knowledge, quenching a child’s thirst for information. They can also encourage creativity with storytelling and role play, and can adapt to a child’s age and development level. But experts urge a hearty dose of caution.

What do parents need to be mindful of?

We know AI can provide inaccurate information (ChatGPT has a disclaimer at the bottom of search results which says “ChatGPT can make mistakes. Check important info”). There is also a very real risk of bias being passed on to kids.

Associate Professor Celeste Kidd heads up the Kidd Lab at the University of California, Berkeley, where researchers study the processes involved in knowledge acquisition, especially in young children. She told HuffPost UK that “large-scale language models like ChatGPT are one type [of AI] that are increasingly being built into the backend of toys for children”.

“We know that LLM outputs often contain fabrications and negative stereotype biases. There’s a serious risk of these problematic outputs being transmitted to children.”

When asked if she thinks integrating AI tools with toys is a positive step, Kidd responded: “No. It’s not tested. And belief distortion is a risk with serious potential consequences.”

The problem is that if children go to AI tools to resolve their curiosity and receive an answer that is problematic or incorrect, they may be less prepared than an adult to detect the inaccuracy “because of their limited experience in the world”.

The expert continued: “Once they get the answer, their curiosity plummets and they are no longer open to changing their mind. This can leave kids with the wrong information in a way that is difficult or even impossible to correct.”

These toys are ‘not magic’

There are a million other questions parents will likely be asking before they let children have one of these toys. Could they be hacked? Where does the data go? What is the company doing to protect children’s privacy? And it can be hard to get clear answers.

In 2015, Mattel launched Hello Barbie, an interactive doll with a microphone and WiFi connection, meaning she could hold conversations with kids. Unfortunately, it turned out she was an easy target for hackers, which ultimately led to the doll being discontinued.

Andrew McStay, Professor of Technology and Society at Bangor University, wrote in a piece for The Conversation that “AI systems can be ‘jailbroken’ or tricked into bypassing restrictions through role play or hypothetical scenarios”. “Risks can only be minimised, not eradicated,” he said.

There is also the very real concern of children forming emotional bonds with these toys, as they would with a friend or loved one. A child might confide in an AI toy, which could console them but doesn’t actually care.

“This creates potential for one-sided emotional bonds, with children forming attachments to systems that cannot reciprocate,” said Prof McStay. “As AI systems learn about a child’s moods, preferences and vulnerabilities, they may also build data profiles to follow children into adulthood.”

While some parents (and plenty of children) will be amazed by the next generation of AI kids’ toys, Kidd urges caution before leaping headfirst into the trend.

“Parents should understand that these toys are not magic, nor is their efficacy for teaching purposes something that should be assumed,” she explained. “These technologies and their impact on children’s beliefs and learning are largely untested at this point.”

Personally, she is preparing her child for the future by using AI tools alongside them, and only very rarely, while also “focusing on highlighting the potential problems” with them. “I don’t want to use my eight-year-old as a guinea pig,” she added.

With some parents now drawing away from, and even regretting, smartphone use in kids, perhaps it’s worth walking – not running – into this one.

HuffPost UK contacted Curio and Mattel for more information on how they safeguard children, and will update the article when we hear back.

Related...

I Used ChatGPT To Write Bedtime Stories For My Kid. It's Not What I Expected.
Opinion: From AI To ChatGPT To Deepfakes: Are We Losing Our Grasp On Reality?
I Track My Teens' Phones And Discovered Something Unexpected About Myself
