
A guy I work with did a presentation on this product; he's big into reverse-engineering Bluetooth devices. I can assure you the toys themselves are just as insecure as their infrastructure apparently is.

Seeing it light up and say "destroy all humans" was pretty funny, more so because there's pretty much zero authentication on them, so you could do it from your mobile anywhere in range, and the mic can be turned on to record without any authentication at all.

sigh internet of things




The "S" in IoT stands for Security.


I took a grad course last semester where one of the groups analyzed a Nest cam and the other analyzed the Mother sensor device. Both were surprisingly secure, especially the Mother, which had security features all the way down the stack.


One of my professors worked on the key exchange protocol used in Nest [0]. When discussing that particular point, he was very complimentary of Google's security practices, especially when it comes to Nest.

[0] https://blogs.ncl.ac.uk/security/2015/07/28/j-pake-built-int...


There are security features, and then there's assurance. Features without assurance of correctness are often bypassed. I'm curious what was in the ones you mention, down the stack.


Some of us do give a shit about security. It's just a shame that it feels like we are the exception to the rule.


We do because we realize what the lack of it entails. So will the general public, eventually. And the only way to get there is if more cases like this start happening. It's a shame they have to learn the hard way but there's no other way. That, or we as an industry act up (in ways I can't even fathom).


The fact that you allude to it suggests you can fathom it in some way. Maybe you don't want to, but clearly bad actors can exploit insecure systems, and that's especially easy from the inside.


Sure I can, but they're all unrealistic so no point mentioning them. We could for example start boycotting companies that don't take security seriously. But I'm afraid we'd end up with a very, very long list.

Publicity can work wonders. You end up with sensitive data about kids in the wild, possibly in the hands of perverts; nothing could work better than that at raising awareness among the general public. It's harsh, but it fucking works. So we'll stick with it for now, unless someone comes up with a better idea.


Lawsuits for criminal negligence against the CEOs of the companies themselves would be a damn good start. Their business practices are why these problems happen. They cut every corner they can find, put business-school grads in charge of deciding schedules and resource allocation for engineering, and make sure that if an engineer says 'we need more time and testing', any low-level manager can tell them business goals come first.


The problem is that it generates awareness based on raw emotion, and of course that always leads to rational, measured decision making. /s

On the other hand, I don't know any better idea either.


Computer/network security will never be important until governments start regulating this stuff through specialized agencies. It's the opposite of profitable to care, so businesses who do care are disadvantaged.


> Computer/network security will never be important until governments start regulating this stuff through specialized agencies.

Well, I wouldn't say never; it just needs some people on the core of the team who are determined to have it. Certs are free to low-cost depending on the type you want. The compute needed for "security" is minimal (heck, we can even do root CA validation on the ESP8266 these days).

But this isn't directly connected to the internet anyway; it goes over a Bluetooth connection, and issues like this are down to lax security practices.

> It's the opposite of profitable to care.

How much profit does it cut to not put your MongoDB instance internet-facing? Firewalling off 27017 and enabling auth shouldn't cut into their profits too much.
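
To make that concrete, here's roughly the check that the people who find these open databases run, sketched in Python with pymongo (the hostname is a placeholder). If listing databases works with no credentials, the instance is wide open:

    # Rough sketch: probe whether a MongoDB instance accepts
    # unauthenticated connections. The hostname is a placeholder.
    from pymongo import MongoClient
    from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

    def is_wide_open(host, port=27017):
        client = MongoClient(host, port, serverSelectionTimeoutMS=3000)
        try:
            client.list_database_names()  # needs credentials once access control is on
            return True                   # no credentials required: wide open
        except OperationFailure:
            return False                  # auth is enforced
        except ServerSelectionTimeoutError:
            return False                  # unreachable (e.g. firewalled), which is the goal

    print(is_wide_open("db.example.com"))

Firewalling 27017 and enabling auth just means the two failure branches are the only possible outcomes, which is the whole point.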

EDIT: Slapping signature creation/checking on the content URLs shouldn't eat into profits either. This breach had nothing to do with the toy itself; it was server-side.
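
For the sig on the content URLs, an HMAC over the path plus an expiry is about all it takes. A minimal sketch in Python (the secret and path are placeholders, not whatever their actual scheme would be):

    # Minimal sketch of signed content URLs: the server signs path + expiry
    # and refuses to serve anything whose signature doesn't verify.
    import hashlib
    import hmac
    import time
    from urllib.parse import urlencode

    SECRET = b"server-side-secret"  # placeholder; lives only on the server

    def sign_url(path, ttl=3600):
        expires = int(time.time()) + ttl
        sig = hmac.new(SECRET, f"{path}|{expires}".encode(), hashlib.sha256).hexdigest()
        return f"{path}?{urlencode({'expires': expires, 'sig': sig})}"

    def verify(path, expires, sig):
        if int(expires) < time.time():
            return False  # link has expired
        expected = hmac.new(SECRET, f"{path}|{int(expires)}".encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, sig)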


>How much profit does it cut to not put your MongoDB instance internet-facing?

Wrong question. The question is: how much profit does it cut to hire an engineer who knows not to make your MongoDB instance internet-facing, and to empower that engineer enough that they can tell the CEO the product is not ready to launch and that you can't just open public access to the dev/test environment? And it's not even a matter of profit. It's a matter of pride. Engineers are seen as typists and nerds, low-level functionaries. They're the ones who don't understand the divine wisdom of "don't let the perfect be the enemy of the good enough."

You wonder why companies are so stupendously desperate for H-1B visas and why so many job listings are looking for 2 years' experience and no more? It's because they don't WANT knowledgeable staff. Those tend to be expensive, and problematic.


And they talk to their kids in real life, not through a toy.


I can see a novelty use that would quickly die off. If you want to talk to your kids via a teddy bear instead of a phone or Skype, then that's up to you. The demo for the toy shows distant relatives using it to talk to their kids/grandkids.

But the purpose of the device has nothing to do with the security of the device.


I like Apple's approach, where HomeKit certification requires that the device use some form of secure transport to communicate with iOS.


Which totally would not have helped in this case: using https would still have left the DB exposed.


It doesn't help with the server-side data leak, but at least you can't connect to it and make it say 'destroy all humans'.


That's not necessarily the case. TLS protects the connection, but by default does not provide authentication. I also see a lot of instances where certificate checking has been disabled, so that the client just ignores a MitM attack. So with TLS it would seem more secure at first glance, but given the implementation blunders here I wouldn't expect any real improvement.


Yeah that's true, I'm totally assuming 'competently implemented TLS' when I say it would protect the connection.
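
'Competently implemented' mostly just means not turning the checks off. A minimal client-side sketch with Python's stdlib (hostname is illustrative); the commented-out lines are exactly the shortcuts that make MitM trivial:

    import socket
    import ssl

    context = ssl.create_default_context()  # verifies the cert chain and hostname by default
    # context.check_hostname = False         # the shortcut criticised above
    # context.verify_mode = ssl.CERT_NONE    # ditto: silently accepts a MitM

    with socket.create_connection(("api.example.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="api.example.com") as tls:
            print(tls.version())

Even done properly, that only authenticates the server to the client; deciding which clients are allowed to send the toy commands still has to be handled on top of it.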


Hahaha, I spent a good 3 minutes looking for where the S was, until the joke hit me.


Shouldn't that be 'SH'?


'Security Hardening'.


"Security Hamperred Internet of Things"


Meanwhile police in a murder case are preparing to take Amazon to court for Echo records. On the privacy front, there's just no saving people, but the IoT brings the magic of invading privacy together with furnishing botnets with millions of new bots!

We're screwed coming and going, and the vast majority still look at you like a woodland hermit if you suggest that you shouldn't have anything listening to you in your home.


I wonder how much infrastructure is really required to properly support Alexa like capabilities for an individual. Does Amazon really need all of our recordings on their hardware in their data centers? Is it conceivable that we could own that hardware as well?

I realize that training data is important and I assume the recorded data gets used for that purpose, but does Amazon need to keep it forever? How long do they need it? Can I own and possess the hardware and pass off the learning alone?


But even if they say that they're not storing or sending it, how feasible is it to verify that fact?


I don't mean to suggest this is something Amazon would bring to market using an Echo. I mean do you really need Amazon/Apple/Microsoft/Google levels of hardware to run a voice interface for an individual's digital assistant?

Is it feasible to install hardware with the capability of Alexa in the standard user's home?

I think the answer to that is probably "yes", more or less. Some things may be a bit harder to do but I don't see why I need a huge black box in the sky to parse my voice or do some geolocation.


Very feasible -- don't plug it into the Internet.


At which point you have a brick with a microphone, and a pretty blue light, right?


I think the blue lights only show when it's connected. You would get pretty red lights.


"I'm having trouble understanding you right now."


I have some ideas for a stricter but more user-friendly household firewall device and corresponding UI, if something really, really needs the Internet.

But your lights, TV, etc. don't live in an Amazon datacenter.


You're right of course, but interesting side note: in the murder case I mentioned, the police ended up looking at the suspect's smart water meter logs... which showed the use of about 140 gallons of water in the middle of the night. So... no, it doesn't all live in an Amazon datacenter, but these days it might live somewhere.


Not if it is configured to connect to a server in your home with all the data it needs to function.


Oh of course, but I wonder if by the time you'd created that system, you'd feel the result was worth it? Is what Echo offers really so valuable you'd go through the trouble? I wouldn't.


I don't own an Echo because I don't feel the convenience of such a device outweighs the obvious privacy implications. I would be more likely to use it if it were located within my home, but at that point the limiting factor would probably be cost. This is why such a device would have to provide additional functionality.

In a world where it's just expected that you have a personal cloud server in your home, Alexa becomes an "app" on that server and the Echo continues life as a very nice speaker/microphone device you can place in a visible area in order to interact with that server.


Although I would only consider this developer-ready at the moment, open efforts like this have promise; at least you have a choice of what back end, if any: https://www.kickstarter.com/projects/seeed/respeaker-an-open...


It requires almost nothing. The training process requires large amounts of training data, but once the system is trained, the actual set of weights needed is tiny and running the recognition itself is cake. The whole reason all voice recognition is server-based is purely for business reasons: to lock people into their service, to provide new sources of consumer data to mine and sell, etc.


What's wrong with the police requesting Echo records? Surely the Echo records requests you make to it. No different than Google recording your search history. And it's pretty reasonable for the police to want to look at that in a murder case. And they got a warrant. I don't see anything sinister in this case at all.


I'll be putting out our blog post about this first thing tomorrow (we had it ready to go for next week, but I think now's a good time to add some fuel to the fire). Essentially the toy uses Bluetooth LE very insecurely and it has a speaker and a microphone. Guess what happens next?

Edit: Demo of the CloudPets functionality using Web Bluetooth https://github.com/pdjstone/cloudpets-web-bluetooth/
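
For a sense of how low the bar is: with no pairing or bonding required, connecting and writing to a characteristic is only a few lines. A hedged sketch using Python's bleak library, where the device name, characteristic UUID and payload are all placeholders rather than the toy's real ones:

    import asyncio
    from bleak import BleakClient, BleakScanner

    TOY_NAME = "CloudPet"                                  # placeholder name
    SOME_CHAR = "0000ffff-0000-1000-8000-00805f9b34fb"     # placeholder UUID, not the real one

    async def main():
        devices = await BleakScanner.discover()
        toy = next((d for d in devices if d.name and TOY_NAME in d.name), None)
        if toy is None:
            return
        async with BleakClient(toy) as client:                # no PIN, no bonding, no auth
            await client.write_gatt_char(SOME_CHAR, b"\x01")  # illustrative write

    asyncio.run(main())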


Reading and fully comprehending the contents and implications of https://twitter.com/internetofshit should be required for anyone who is thinking about making an IoT-type device.


I do agree that lots of IoT products have terrible security, but is having insecure Bluetooth or the like really a terrible thing for most of these types of products?

I understand that this leak is related to MongoDB... and that is terrible, but I'm mostly referring to your Bluetooth example.

I mean, take Bluetooth headphones: they are notoriously insecure, but the range in which eavesdropping could take place is pretty small, and for most of us you would just be eavesdropping on our annoying music. Seems reasonable that they save bandwidth on secure transmission of data for higher audio quality. That said, I could see an argument the other way, but I'm sure there are more examples where it doesn't seem like a big deal. It would be interesting to hear from someone who thinks I'm dead wrong.


> Seems reasonable that they save bandwidth on secure transmission of data for higher audio quality.

Encrypting a compressed audio stream does not add to the bandwidth, aside from the initial key negotiation.
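
That's easy to sanity-check: with a stream-cipher mode, the ciphertext is byte-for-byte the same size as the plaintext frame. A quick sketch with the Python cryptography package (the frame size is an arbitrary stand-in for one compressed audio frame):

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, nonce = os.urandom(16), os.urandom(16)
    frame = os.urandom(417)                # stand-in for one compressed audio frame

    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = enc.update(frame) + enc.finalize()

    assert len(ciphertext) == len(frame)   # identical size: zero per-frame bandwidth cost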

Furthermore, the bandwidth required for audio of a quality that's indiscernible from the original is negligible when compared to the bandwidth of Bluetooth radios. Ridiculously good audio is 320 kbps, and Bluetooth is easily good for 25 Mbps.

I suppose you could argue that the battery power used to perform this computation is the limiting factor, but a good embedded DSP used to perform the recording and transmission typically has tiny power requirements and hardware encryption routines that don't significantly change the power draw of the device, compared to keeping a blue LED blinking or powering an earbud speaker.

No, let's be honest here. The actual limiting factor is the engineering time and money that go into developing these devices as quickly and cheaply as possible.


Yeah, I was thinking the bandwidth limitations would be on the CPU because of data decryption... your points are valid even with that, though. It's not a great example. I still think that varying degrees of security are fine with these types of things.


If your threat model for your Bluetooth keyboard doesn't involve, say, an abusive spouse sniffing traffic to see if you're reaching out for help, your threat model is probably biased in favour of wankery like the NSA and not real threats ordinary people face.



