From what I can tell as somebody without kids, parenting in the twenty-first century is extremely difficult. It requires adapting to new technology that arrives at a blistering pace, forcing you to take risks no matter what choices you make. Do you risk raising a Luddite who can't survive in the modern world, or do you risk the unknown effects of adopting new technology? Agora hopes you'll choose the latter when they launch ChooChoo the Dragon.
This story hit my desk in the form of a press release proudly announcing a partnership between Agora and RiseLink. It "marks the public debut of a real-time, conversational AI product designed for children." I think the best way to describe the product concept is as Teddy Ruxpin, but with ChatGPT.
One other AI-generated render of ChooChoo (📷: Agora)
All of the ChooChoo the Dragon images are themselves AI-generated and wildly inconsistent, so there isn't much to say about its theoretical physical design. But ChooChoo appears to be a host body for Agora's Convo AI Device Kit R2, which is a conversational AI hardware stack that any manufacturer will be able to stuff into a stuffie.
That hardware appears to be based on an SoC (system-on-chip) designed by RiseLink. But the hardware itself is irrelevant, because the entire concept is fraught with peril.
The press release notes that ChooChoo the Dragon is for children ages three through six and that it can act as a learning companion. Critically, it will listen to the child and generate responses in real time, rather than relying on canned replies and pre-programmed conversation trees.
That is a serious cause for concern, because even the titans of the AI industry admit that they don't understand how LLMs (large language models) decide what to say, and that they, the creators of those LLMs, have limited control over the output. In 2023, Google CEO Sundar Pichai famously said, "You know, you don't fully understand. And you can't quite tell why it said this, or why it got it wrong."
Scott Pelley and Sundar Pichai (📷: 60 Minutes)
In other words, nobody has a complete understanding of how LLMs think or how to control what they say, which is a well-known problem in the industry. Is that the kind of technology you want your impressionable young child using?
There are numerous examples of LLMs providing inaccurate information and advice, ranging from the absurd to the downright dangerous. That isn't limited to small experimental models; it is also true of the most popular and well-funded models in existence.
An article published in Science by Melanie Mitchell last year goes into great detail on this subject, relaying examples of deceit and misinformation produced by LLMs created by OpenAI and Anthropic, leading companies in this field. In some test scenarios, LLMs even advised users to commit crimes.
Melanie Mitchell (📷: Wikipedia)
Adults may be able to identify and ignore these kinds of blunders, but can children? And that's before considering the implications of AI psychosis, which is a growing concern for adults and would likely be even more dangerous to kids.
If OpenAI and Anthropic can't stop that from happening, it seems doubtful that Agora can. As far as I can tell, Agora doesn't even develop its own LLMs. Its products are interfaces that rely on models from, you guessed it, OpenAI, Anthropic, and others.
Not only does that mean Agora lacks control over what its products will say to your kids, it also means that it doesn't have to bear any responsibility for those conversations. It can pass the blame along to OpenAI or Anthropic. Those companies will then, in turn, tell you that LLMs are experimental and that by choosing to use them, you accepted the risks.
This is all very troubling to me, and it reminds me of another situation in which a potentially dangerous cash grab targeted children: the Pigzbe scheme. Pigzbe was a crowdfunded "cryptocurrency for kids" platform, and I raised the alarm about it in an article here on Hackster back in 2018. Years later, I made an in-depth video on YouTube that explored the aftermath.
Like those behind Pigzbe, Agora appears to be selling new and unproven technology to parents who are desperate for a break and who want to give their kids an edge in the modern world.
I don't blame parents for that; parenting in the twenty-first century is extremely difficult, after all. But I do strongly urge caution regarding products like ChooChoo the Dragon. LLMs, and AI in general, are not yet trustworthy enough to interact with children on a meaningful level.
Teddy Ruxpin (📷: Wikipedia)
For now, consider buying an old-fashioned Teddy Ruxpin as a learning companion for your toddler.
