Connected toys with Bluetooth, wi-fi and mobile apps may seem like the perfect gift for your child this Christmas. But we’ve found that, without appropriate safety features, they can also pose a big risk to your child’s safety.
Watch our video below to see just how easy it is for anyone to take over the voice control of a popular connected toy, and speak directly to your child through it. And we’re not talking professional hackers. It’s easy enough for almost anyone to do.
But the robot in our video below is not the only connected toy parents need to be wary of this Christmas. Read on for the security breaches discovered in the popular Furby toy and see how easy it was for us to hack into a cute CloudPets cat.
With toys like these and other connected toys expected to be popular around Black Friday and Christmas, we’re calling for smart toys to be made secure, or taken off sale entirely.
Connected toys safety
Over the past 12 months, Which?, in collaboration with consumer organisations and security research experts, has conducted investigations into popular Bluetooth or wi-fi toys on sale at major retailers. This has revealed concerning vulnerabilities in several devices that could enable anyone to effectively talk to a child through their toy. Here, we present findings on just four – the Furby Connect, I-Que Intelligent Robot, Toy-fi Teddy, and CloudPets cuddly toy:
- In all cases, it was found to be far too easy for someone to use the toy to talk to a child.
- Each time, the Bluetooth connection had not been secured, meaning an attacker didn’t need a password, PIN code or any other authentication to get access, and would need hardly any technical know-how to ‘hack’ your child’s toy.
- Bluetooth has a range limit, usually 10 metres, so the immediate concern would be someone with malicious intentions nearby. However, there are methods for extending Bluetooth range, and it’s possible someone could set up a mobile system in a vehicle to trawl the streets hunting for unsecured toys.
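To illustrate how little skill this ‘trawling’ requires, here is a minimal sketch of the matching step a drive-by scanner would perform: checking advertised Bluetooth device names against a watchlist of known toy names. In a real scan the names would come from a Bluetooth LE scanner; here the list of nearby device names is hard-coded for illustration, and all names and the watchlist are assumptions, not taken from our investigation.

```python
# Illustrative sketch only: flag known connected-toy names from a list of
# Bluetooth advertisement names. In a real scan these names would come from
# a BLE scanning tool; here they are hard-coded so the example is
# self-contained. The toy-name watchlist is an assumption for illustration.

KNOWN_TOYS = {"furby", "i-que", "cloudpets", "toy-fi"}

def flag_unsecured_toys(advertised_names):
    """Return the advertised names that match the toy watchlist."""
    flagged = []
    for name in advertised_names:
        # Case-insensitive substring match against each watchlist entry
        if any(toy in name.lower() for toy in KNOWN_TOYS):
            flagged.append(name)
    return flagged

# Example: device names as they might appear in BLE advertisements
nearby = ["Furby-A1B2", "JBL Speaker", "i-QUE Robot", "Fitbit Charge"]
print(flag_unsecured_toys(nearby))  # ['Furby-A1B2', 'i-QUE Robot']
```

The point is not the code itself but how short it is: with no pairing or authentication required, spotting a vulnerable toy is a trivial name match, and connecting to it is no harder.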
Read our safety advice on how to keep your child safe if you’re buying a connected toy this Christmas
Connected toys that are easily hacked
I-Que Intelligent Robot
Available from: Argos, Hamleys, online
Made by Genesis Toys, this brightly coloured robot talks back to you, spits sound effects and can even tell some (pretty dire) jokes.
The German consumer organisation, Stiftung Warentest, found that it uses Bluetooth to pair with a phone or tablet, but the connection is unsecured. In fact, anyone can download the app, find an i-Que within Bluetooth range and start chatting by typing into a text field (see more in the video report above).
Worse still, the robot speaks in its own voice and so, if the child has played with it for a while, they could be more willing to trust it.
Vivid Toys, UK distributor of i-Que, told us that it takes reports of security issues with the i-Que ‘very seriously’, although it said that ‘there have been no reports of these products being used in a malicious way’. Vivid said that it will take our recommendation about adding Bluetooth authentication to Genesis Toys and ‘actively pursue this matter with them directly’. It added: ‘The connected toys distributed by Vivid fully comply with essential requirements of the Toy Safety Directive and harmonised European standards, and (we) consider these products to be safe for consumers to use when following the user instructions.’
Furby Connect
Available from: Argos, Amazon, Toys R Us, Smyths
We asked information security experts, Context IS, to assess the security of the popular Furby Connect talking toy – and the news wasn’t good. Just like the i-Que, anyone within Bluetooth range can connect to the toy when it’s switched on, with no physical interaction required. This is because it does not use any security features when pairing. Plus, you can make the connection via a laptop, opening up more opportunities to control the toy.
Context IS was able to build upon some previous work by Florian Euchner (see https://github.com/Jeija/bluefluff) to upload and play a custom audio file on the Furby. This audio file could be anything, including inappropriate material. While we could not turn the Furby into a listening device in the time we had, Context IS believes this would be possible if someone were able to re-engineer its firmware, exploiting another vulnerability found in the toy’s design (which we will not be publishing).
Context IS feels it is possible to add more security to the toy via the standard Bluetooth bonding procedure, which exchanges an encryption key (known as a long-term key, or LTK) with the phone or tablet during initial set-up. It is possible to remove the firmware vulnerability, too.
Furby-maker Hasbro told us that while it takes our report ‘very seriously’, it feels that the vulnerabilities we’ve exposed would require someone to be in close proximity to the toy and to possess the technical knowledge to re-engineer the firmware.
‘We feel confident in the way we have designed both the toy and the app to deliver a secure play experience,’ the firm added. ‘The Furby Connect toy and Furby Connect World app were not designed to collect users’ name, address, online contact information (eg, user name, email address, etc.) or to permit users to create profiles to allow Hasbro to personally identify them, and the experience does not record your voice or otherwise use your device’s microphone.’
CloudPets
Available from: Amazon, online
CloudPets is a stuffed toy that enables family and friends to send messages to a child, played back on a built-in speaker. It comes in dog, bunny, cat and bear varieties. With some knowledge, someone can hack the toy and make it play their own voice messages.
In a previous investigation, we hacked the kitten version and made it order itself some cat food from a nearby Amazon Echo (see more in the video below). We were able to connect to the toy’s unsecured Bluetooth connection from even outside in the street.
CloudPets maker, Spiral Toys, has not yet made a public comment on CloudPets’ Bluetooth vulnerabilities. However, it did respond about a separate data breach earlier in 2017, stating: ‘Protecting our user’s privacy is very important to us, particularly when children are involved. We’re taking several steps to make sure that your account and recordings are safe.’
With regard to the Echo, Amazon told us: ‘To shop with Alexa, customers must ask Alexa to order a product and then confirm the purchase with a “yes” response to purchase via voice. If you asked Alexa to order something by accident, simply say “no” when asked to confirm. You can also manage your shopping settings in the Alexa app, such as turning off voice purchasing or requiring a confirmation code before every order. Additionally, orders placed with Alexa for physical products are eligible for free returns.’
Toy-fi Teddy
Available from: Amazon, online
This cuddly, cute-looking teddy with a red heart on its chest enables the child to send and receive personal recorded messages over Bluetooth via a smartphone or tablet app. However, again, Stiftung Warentest found that the Bluetooth connection lacks any authentication protections, meaning strangers can also send their voice messages to the child, and receive answers back.
Toy-fi is also made by Spiral Toys, which has not commented on the vulnerability.
Stiftung Warentest has also tested the WowWee Chip, which has the same Bluetooth vulnerabilities, although hackers can only take remote control of the toy, not speak to the child. It also looked at the Fisher-Price Smart Toy Bear and Mattel Hello Barbie to test for security issues. The findings weren’t as concerning as those above, but both toys have previously hit the media over alleged hacking risks.
Should connected toys be banned?
These connected toys all have security issues, but this is just the tip of a very worrying iceberg. Other countries have started to act to ensure children are kept safe, and we’d like the UK to follow suit.
My Friend Cayla
Last year, Germany’s telecoms watchdog ordered parents with the My Friend Cayla talking doll to destroy it, as it could be used to ‘illegally spy’ on children. This followed researchers and consumer groups expressing concern that access to the doll was completely unsecured, in a similar way to the findings above.
The German Federal Network Agency classified Cayla as an ‘illegal espionage apparatus’, meaning that retailers in Germany could be fined if they continued to sell it, or failed to disable its wireless connection before sale.
Investigation work on the Cayla doll was done by Tim Medin and Ken Munro from Pen Test Partners. Like the I-Que, My Friend Cayla is manufactured by Genesis Toys and distributed in Europe by the Vivid Toy Group.
In 2016, the Norwegian Consumer Council (Forbrukerrådet) – which recently exposed issues with kids’ smartwatches – filed complaints in Norway against both the i-Que and Cayla after running its own investigation. Our US colleagues at Consumer Reports have also previously filed complaints in America about both toys.
In July 2017, the FBI took the important step of issuing a warning about connected toys in general, stating that: ‘Security safeguards for these toys can be overlooked in the rush to market them and to make them easy to use.’
In the cases featured above, the security could have been increased with proper authentication on the Bluetooth connection. With toys like the Furby, this is possible via a firmware update, but it would be better if this was incorporated into the design process before the toys were released.
Connected toys: What we’re calling for
In 1967, Which? successfully campaigned to promote the use of lead-free paint in toys. Some 50 years on, we feel unsecured connected toys pose a different, but equally important, risk to children.
We’re calling for all connected toys with proven security or privacy issues to be taken off sale.
Alex Neill, Which? Managing Director of Home Products and Services, said: “Connected toys are becoming increasingly popular, but as our investigation shows, anyone considering buying one should apply a level of caution.
“Safety and security should be the absolute priority with any toy. If that can’t be guaranteed, then the products should not be sold.”