Image: Mattel’s website

This week, Mattel released Hello Barbie, a WiFi-enabled doll that records and saves its conversations with children. The doll’s capacity to collect data raises questions about security, privacy, and potential misuse of information. TechRepublic spoke to Andrew Browne, Director of Lavasoft’s malware lab, about his concerns over Hello Barbie’s privacy settings.

Hello Barbie sends communications between children and Mattel’s control center. What are your thoughts on potential security issues here?

I’m less concerned about hacking and more concerned about privacy. The privacy policy says it captures voice data that can be used to improve speech recognition or AI algorithms. But it can also be used for other research, development, and data purposes. That could be anything. It’s one of those catch-all phrases that pops up in license agreements and absolves the company of responsibility for whatever it’s really trying to do. The policy also says the recordings can be shared with third parties, which is worrying, and that the recordings will be sent to Microsoft. Straight away, some other party we may not be aware of has access to the recording. So the first concern is what Mattel will do with the data. Even though Mattel says it won’t try to monetize the data, it’s inevitable that somebody will.

How exactly can the information gathered by Hello Barbie be monetized?

It’s a pretty blunt example, but there’s the scenario where a kid tells the doll their birthday is coming up, and Barbie starts talking about new, cool toys coming out. It’s not a stretch of the imagination that this will happen. The doll itself, and I find this quite creepy, is actually a recording device. It can also capture background conversations that aren’t between the child and the doll.

The Cybersecurity Information Sharing Act was recently passed, which says that companies can share the data with the Department of Homeland Security—and, by extension, the FBI and NSA and so on. So I laughed when I read about all this…is it a Barbie, or a CIA agent? George Orwell lacked imagination when he wrote 1984—he couldn’t even conceive of this.

How does this compare with toys of the past? Or other devices?

This is a bit like baby monitors and IP devices. It’s strange to think of Barbie and a copy machine and a garage door as all computers, but all of these are computers running complex software. Software itself is never bug-free. There’s always a possibility that it could be shipped with exploitable bugs. With enough time, any computing device can be hacked.

Another problem with IoT devices is patching. If there’s an exploitable vulnerability on the device, the vendor can patch it, but the patch must then be delivered to the device. If there’s no fast and safe way to get the patch from the vendor to the device quickly, the window of opportunity for an attacker on that vulnerable device is quite large.

I have a Raspberry Pi, a tiny computer, in my greenhouse. I hacked it together to monitor the moisture of my tomato plants. When the moisture drops below a certain level, I get an alert that the tomato plants need to be watered. I’m totally aware that this system is a computer. I installed the operating system, connected the sensor, wrote the code to route the readings, all of that. And I periodically check for security updates. But most people won’t do this. With the proliferation of IoT devices, this is something that has fallen by the wayside. There seems to be a rush to get the newest, cool thing out there.
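
Browne didn’t share his code, but a minimal sketch of that kind of monitor might look like the Python below. The GPIO pin number, the polling interval, the email addresses, and the assumption that the sensor’s digital output goes HIGH when the soil is dry are all illustrative details, not from the interview.

```python
# Minimal sketch: poll a digital soil-moisture sensor on a Raspberry Pi and
# send an email alert when the soil reads dry. All pin numbers, intervals,
# and addresses are placeholders.
import smtplib
import time
from email.message import EmailMessage

import RPi.GPIO as GPIO  # standard Raspberry Pi GPIO library

SENSOR_PIN = 17        # assumed BCM pin wired to the sensor's digital output
CHECK_INTERVAL = 600   # seconds between readings

def send_alert(subject: str, body: str) -> None:
    """Send a plain-text email alert via a local SMTP server (placeholder)."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "greenhouse@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

def main() -> None:
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(SENSOR_PIN, GPIO.IN)
    try:
        while True:
            # Many cheap sensors drive the digital pin HIGH when the soil is dry.
            if GPIO.input(SENSOR_PIN):
                send_alert("Greenhouse alert", "The tomato plants need watering.")
            time.sleep(CHECK_INTERVAL)
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    main()
```

The interesting part isn’t the plumbing; it’s that the person who builds something like this knows it’s a computer and keeps it patched, which is exactly what most IoT buyers never do.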

We saw a Fitbit recently where a Bluetooth attack takes about ten seconds. It was possible because the vendor left the Bluetooth channel wide open. This is easily solvable; vendors are just not securing these devices as much as they should.

Could Hello Barbie serve its purpose without collecting private data?

Mattel would say no. It’s not possible, because they want to improve speech recognition and AI algorithms. They’d argue that they need the data to make the toy better. But it’s not strictly necessary. They could just take the information, process it, send the response back to the child, and immediately delete it. No company would ever do that, though; there’s too much data there that can be searched, analyzed, and used for some kind of profit.
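
As a purely hypothetical illustration of that process-and-discard flow, not a description of how Mattel’s service actually works, a server endpoint could handle each recording entirely in memory. The endpoint name and the transcribe/choose_reply placeholders below are assumptions.

```python
# Hypothetical sketch of a "process it, respond, and immediately discard it"
# speech endpoint. Nothing here reflects Mattel's actual architecture; the
# transcribe() and choose_reply() functions are stand-ins for whatever
# speech-recognition and dialogue logic a vendor would really use.
from flask import Flask, jsonify, request

app = Flask(__name__)

def transcribe(audio_bytes: bytes) -> str:
    """Stand-in for a real speech-to-text step, done entirely in memory."""
    return "(transcription placeholder)"

def choose_reply(text: str) -> str:
    """Stand-in for the dialogue logic that picks the doll's response."""
    return "That sounds fun! Tell me more."

@app.route("/speech", methods=["POST"])
def speech():
    audio = request.get_data()   # the raw recording uploaded by the toy
    text = transcribe(audio)     # processed in memory only
    reply = choose_reply(text)
    # Nothing is written to disk or a database; when this function returns,
    # the recording and the transcript simply go out of scope.
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run()
```

The technical obstacle is small; the business incentive to keep the data is the real reason it doesn’t happen, which is Browne’s point.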

What does this AI capability mean for the future of toys?

As long as something is network-connected, somebody will try to exploit it, whether for vandalism, theft, or espionage. I’m particularly concerned about any IoT device that can capture audio or video. Baby monitors, for example: there are already reports of these being hacked, which is quite an uncomfortable thought. A burglar alarm is also a good target for someone who wants to exploit it. In the next five years, we’ll see anywhere from 25 to 50 billion new IoT devices. From a malware-analysis point of view, that’s an immense attack surface that will mushroom over the next few years. IoT is a relatively new thing, and in the rush to release the next new, cool device, vendors are paying less attention to device security than they ought to. Devices that are secure today might not be tomorrow.

Image: Andrew Browne

What concerns you most about all of this?

What I find most problematic is when people think an application is one thing, when really it siphons off private information that is used to profile the user. People are not really aware of what’s happening; they’re signing up for something that, if they knew the consequences, they wouldn’t.

I’m not really sure that any new problems will occur; hacking will be motivated by the same things we see today. The big difference between now and the future is that these 25 to 50 billion devices will store an immense amount of personal data. My biggest concerns are: What data is being collected? How is it being protected? Who is it shared with? How much control does the consumer have over the data? High-profile hacks are becoming much more common. I imagine that the data generated by these devices will be of great value to people other than the organizations that have legal access to it, and they will make more concerted efforts to acquire that data.

What steps can adults take to protect the information collected by Hello Barbie?

Once the information is on the recipient’s server, there’s not a whole lot that can be done. I’m not a psychologist, but I imagine that kids will talk to the toy like a real person, confide in her, and so on. The kid might give up information that is confidential. They say they’ll attempt to monitor for this and wipe it from the system, but I don’t know that there’s anything to prevent it other than training the kid not to say anything private. But a five-year-old can’t comprehend that.

Browne offered a few final tips for anyone considering buying an IoT-enabled device:

  • Vote with your wallet. If you’re an IoT fanatic, and a lot of people are because it’s extremely useful, the first thing is to demand security. If you feel that a device isn’t secure, vote with your wallet. Don’t buy it. Lobby the manufacturer to make it secure.
  • Read the fine print. Prior to buying a device, read as much online documentation about the device as you can. If you’re going to buy Hello Barbie, read the privacy policy (subject to change). Do you agree with it? Do you not agree? Make your decision based on that. If you’re uncomfortable, don’t buy this device.
  • Check for an admin account. From a technical standpoint, a lot of IoT devices are not being created with security at the forefront of the developers’ minds. Check whether the device has an admin account and change the password straight away. Devices can ship with open administration accounts or easily guessed default passwords, which makes them trivial to take over; even without any real hacking, an attacker can get complete control. (A minimal way to audit your own device for default credentials is sketched after this list.)
  • Build your own network. Consider creating a separate network for your IoT devices that allows internet access but limits their connectivity to the other LAN-connected devices in your house.
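
As a follow-up to the admin-account tip above, here is a minimal Python sketch for checking whether a device you own still accepts common default credentials over HTTP Basic Auth. The device address and the credential list are assumptions, many devices use form-based or digest logins instead, and it should only ever be run against your own hardware.

```python
# Minimal sketch: probe your own device's admin page to see whether it still
# accepts well-known default credentials. The address and credential list are
# illustrative; adjust them for the device you actually own.
import requests

DEVICE_URL = "http://192.168.1.50/"   # assumed address of the device's admin page
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("admin", ""),
]

def still_uses_defaults(url: str) -> bool:
    """Return True if any default username/password pair is accepted."""
    for user, password in DEFAULT_CREDENTIALS:
        try:
            response = requests.get(url, auth=(user, password), timeout=5)
        except requests.RequestException:
            continue  # device unreachable or not speaking HTTP; try the next pair
        if response.status_code == 200:
            print(f"Device accepted default login {user!r}/{password!r}. Change it.")
            return True
    return False

if __name__ == "__main__":
    if not still_uses_defaults(DEVICE_URL):
        print("None of the default credentials in the list were accepted.")
```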

Mattel did not respond to repeated requests for comment.
