paddlesteamer's comments

> In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage. These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software.

I wish I could see those files in action...


I wonder if the intention here is to deter Cellebrite from parsing Signal files? Or to pressure them into fixing their security vulnerabilities?


Files will only be returned for accounts that have been active installs for some time already, and only probabilistically in low percentages based on phone number sharding. We have a few different versions of files that we think are aesthetically pleasing, and will iterate through those slowly over time.

Pretty sure it's the former, since the above is a way to ensure that Cellebrite can't just gather all the implied exploit files and make sure those specific problems are all patched. This is, quite literally, an informational attempt at guerrilla/asymmetric warfare, where Signal is trying to make engaging with them too costly while also landing a few blows well above its weight class. Cellebrite now has to decide whether to keep after an adversary that is hard to pin down, ambushes them, and has shown it can hit them where it really matters (credibility, and thus their pocketbook).


This indeed looks like a FUD statement, implying that they could have an infinite number of potential vulnerabilities. Realistically though, writing parsers that don't yield control of your whole device is not that complex. The people exploiting iOS zero-days can certainly do it.
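To make that concrete, here's a minimal sketch (my own illustration, not anything from Signal or Cellebrite; parser_worker.py is a hypothetical stand-alone parser script) of one common mitigation: hand untrusted files to a separate, resource-limited child process so a parser bug can't compromise the main application.

    import resource
    import subprocess
    import sys

    def parse_untrusted(path):
        """Hand an untrusted file to a sandboxed worker process."""
        def limit_resources():
            # Cap memory and CPU time so a malicious file can't exhaust the host.
            resource.setrlimit(resource.RLIMIT_AS, (256 * 1024 * 1024, 256 * 1024 * 1024))
            resource.setrlimit(resource.RLIMIT_CPU, (5, 5))

        proc = subprocess.run(
            [sys.executable, "parser_worker.py", path],  # hypothetical worker script
            capture_output=True,
            preexec_fn=limit_resources,
            timeout=10,
        )
        if proc.returncode != 0:
            raise ValueError("untrusted file rejected")
        return proc.stdout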


You're not wrong at all, but if they're shipping these garbage ancient versions of ffmpeg, there are likely oodles of other bugs lurking around. And, if Cellebrite acts like most other companies who've had their awful security exposed, they will fix only this bug and leave everything else.


It's not that hard but neither is shipping patched versions of ffmpeg. This company will have some catching up to do.


But it might be easier for Cellebrite to just stop exfiltrating data from Signal. Of course, other apps could discover similar vulnerabilities.


That's not enough. With filesystem permissions, Signal could place files anywhere (like prepared GIFs in the Pictures folder).

I think this taints any phone with Signal installed.


Signal is capable of finding more exploits given more time. The important piece is that there now exists reasonable doubt about data coming from Cellebrite, so it's not so good as evidence.


Nah, Cellebrite will panic for a bit at the possibility of facing repercussions but ultimately not commit enough effort to change anything. Cellebrite's counterparties, however, might not be so complacent.


Signal should generalise this into a library so that other app vendors can include these perfectly cromulent files.


That would reveal all the exploits to Cellebrite, which Signal is trying to avoid.


I imagine many fellow app vendors, who may or may not maintain good relationships with Signal, might possibly have found a USB drive containing the relevant data on the street. (Pure speculation, I don't know anything about Moxie, but judging by his tone, I wouldn't be shocked.)


hehe.

Now imagine if Hack Back laws actually passed... companies like Whisper Systems would have had impunity for even more shenanigans :)


Or just flipping them off, which seems OK too.


I don't get it, can anyone elaborate on what they are talking about there?


They are implying that future versions of Signal will drop random files on your phone that "may or may not" cause damage to Cellebrite systems.

They are basically putting the threat out that if you use Cellebrite on Signal in the future, you might not get the data you expect, and at worst, it may corrupt the report/evidence.

This also brings into question the chain of custody, as an untrusted device being imaged can alter reports of unrelated devices.


Damn, a chain of custody where the thing in evidence is also part of not only its own chain but also those of other evidence acquired afterwards? I can't imagine what kind of case law exists around that, but I'm sure it's hilarious!


> also those of other evidence acquired afterwards

And prior extracts on the device.


Which is what I don't really understand - it seems like Cellebrite could spin this in their favor so law enforcement would need to purchase a new kit for each device?


Signal is going to start attacking third-party tools once it's installed on your phone.

It's as though Theo decided that OpenSSH should respond to portscanners by trying to pwn the source systems.


No, because that would be active retaliation.

More realistically it is like dropping a file on your private file server DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe. You never run it, but maybe somebody exploits your file server, gets all your files, and automatically runs them?

Oh well.


It really is like dropping a file named DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe on your private file server - but contrary to your expectations, it's not "oh well". If you placed it there with the intent to trap someone who you expect to be looking at your computer, you may well be liable if their computer blows up. There's no significant difference from active retaliation: the consequences are there, the intent is there, the act is there. It's pretty much the same.

Of course, if some criminal exploits your file server, they are not likely to press charges, but if it triggers on law enforcement who have a warrant to scan your fileserver, that's a different issue.

You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.


The beauty, though, is that law enforcement now can't even know before plugging in and scanning a device whether they'll actually be pwned.

They have to use the exploit to figure out whether the phone can nuke that hardware's future usability or the integrity of any locally stored, non-offsited data.

UNLESS Cellebrite can publicly produce, for a court of law, proof that any potential exploit isn't a valid concern, which means spilling implementation details about how the device works.

Nobody can continue to shut up AND maintain the status quo. Either everyone clams up, and Signal can sow reasonable doubt without challenge, crippling Cellebrite's value as a forensic tool. Or someone has to open up about the details of their tool, which, like it or not, will speak very loudly about the ways and methods behind these exploits.

The Checkmate is implied, and oh my, is it deafening.


> if you placed it there with the intent to trap someone who you expect to be looking at your computer, you may well be liable if their computer blows up

Liable for what? You haven’t promised that the code is safe, and they chose to run it.

> there's no significant difference from active retaliation

There is a significant difference: in active retaliation you choose to attack someone else's computer, while with a trap file the attacker chooses to run files they have stolen from you. Big difference.

> You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.

The reasoning is different: lethal or injurious man traps are prohibited because you don't respond to trespassing with lethal force, and you don't know who or what may trigger the trap. Man traps that lock the intruder in a room without injuring them are fine, and are used in high-security installations.


And why shouldn’t OpenSSH do that?


Because I have zero interest in running attack software.


Signal wants to pick a fight with a grey company that gets money for cracking apps? Not a good idea.


They're already picking a fight with Cellebrite simply by existing, as Signal is antithetical to everything that Cellebrite stands for.


Buying a safe != killing the guy that's invading your house.


I think this would be more like including exploding dye packs in your bags of money.


One could view making an E2E encrypted app that causes problems for the police as "not a good idea" too, but there has to be someone to do it.


I wish there were a way to see which root site set those cookies. For example, I wish we could see that twtracker.com supercookies are set in some iframe on twitter.com.


With this, devices that use nRF52 chips are now open to investigators. I think we'll learn of more vulnerabilities in BLE devices whose shitty implementations are hidden in those SoCs. I'm more than excited about the next post, about the Logitech Pro G mouse.

Making things open is a good thing for society's security.


That is, if they have been locked in the first place.

Also, with a lot of devices being firmware-upgradable, there is little point in enabling read-out protection if you can just download the firmware off the internet. (Unless you want to go through all the hassle of encrypting the firmware image, but most devices won't be doing anything special enough to make that worthwhile.)


Nordic provides some easy-to-use tools and examples for encrypting and signing firmware images when using a bootloader for in-field updates. I would expect that most products based on the nRF52 that support firmware updates encrypt the image.


Nordic's off-the-shelf firmware upgrade process has signed image verification only. The image itself sent over BLE is not encrypted. So anyone using that right off the bat is in for a nice surprise.


Why would anyone be surprised? I'd be very surprised if my firmware was encrypted without setting any encryption key.


Partially because they call their firmware upgrade process "secure Device Firmware Update (DFU) functionality" (lifted from their documentation). Obviously, an engineer needs to go see the source to see what is actually happening under the hood.


Why do you need Encryption for security? A signature should be enough.

(Don’t conflate security with confidentiality)
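For what it's worth, here's a rough sketch of signature-only verification using Python's cryptography package and Ed25519 (my own illustration, not Nordic's actual DFU code; the file name is made up): the device only holds the public key, and a tampered image fails verification even though nothing is encrypted.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Vendor side: sign the firmware image with the private key.
    private_key = Ed25519PrivateKey.generate()
    firmware = open("app.bin", "rb").read()   # hypothetical firmware image
    signature = private_key.sign(firmware)

    # Device side: verify against the baked-in public key before flashing.
    public_key = private_key.public_key()
    public_key.verify(signature, firmware)    # raises InvalidSignature if tampered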


Not in the context of ensuring that trusted binaries are used for updates, but it does matter for your original point about reverse engineering unencrypted firmware.


It's not common for firmware to be encrypted, just as it's not common for executables on your PC to be encrypted.


Other than Signal, I also recommend Threema. It doesn't rely on mobile numbers, it's possible to configure it to run on your private server, etc. It's just not free (as in beer). Also, it's from Switzerland, a country that respects your privacy more than the USA does[0].

[0]: https://www.reddit.com/r/privacy/comments/gukg5z/threema_win...


> Also, it's from Switzerland, a country respects your privacy more than the USA

Well, gouv.ch might, but Crypto AG was an NSA front for decades, so I wouldn't be so certain about the companies.

If I wanted to lure people in on the pretence of security and privacy, being Swiss would be good bait.


CIA front, I believe.


It doesn't look like they are open source, does it? https://threema.ch/en/faq/source_code


I like how it ignores WEP. Don't use WEP.


Since Antifa will be designated as a terrorist organization[0], I don't suggest you guys trust GitHub Pages, Google Photos, Drive, etc. Tomorrow there may be a subpoena for the IP addresses that use this tool. It may not be enough proof, but it'll cost you a lot of money and time. I'd be using local tools like exiftool or gimp.

[0]: https://twitter.com/realDonaldTrump/status/12671296442282475...
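If you want to stay local, one quick option besides exiftool or gimp is a few lines of Pillow (a sketch; assumes pip install Pillow, and the file names are just placeholders): copy the pixel data into a fresh image so the EXIF block gets dropped on save.

    from PIL import Image

    img = Image.open("photo.jpg")              # hypothetical input file
    clean = Image.new(img.mode, img.size)      # new image carries no metadata
    clean.putdata(list(img.getdata()))
    clean.save("photo_clean.jpg")              # written without EXIF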


I hate bloated OSes, and unfortunately Mac OS is one of them. I know everyone wants everything to work out of the box, and I know it's very natural to want that, but I cringe if I find out my OS is doing something behind my back. That's why I'd never use Windows, Mac OS, Ubuntu, etc. They all violate my privacy and slow my system down to do so.

I use Debian, I like Debian. When I run Wireshark, I don't see unknown requests destined for debian.com. That is the definition of simplicity for me. And yes, it doesn't always work out of the box; you have to install some drivers and change configurations, but it's getting better and easier. Then again, I'm a software developer, so I understand and like that stuff.

> Linux was always a disaster in terms of user experience and isn't improving.

No, you can't define it as a disaster; it's not. If you're an end user who understands nothing about computers, maybe you can, but otherwise it's not a disaster. It's just harder, and it's getting easier by the day.


Actually, I'm pleased eBay is doing this. It wasn't a new issue, but with eBay doing it, it got a lot of attention. It's like disclosing a security issue in the WebSocket protocol. Now I'm sure the next releases of most browsers will fix it.


Very nice! I only have one question: I see there are pull-up resistors connected to push buttons. Aren't there built-in pullups on RPi GPIOs? Maybe you can enable them and use fewer elements on the breadboard.


Can confirm that the internal pull-up/down resistors are there and most libraries can enable them (alternatively, this can be done manually via the terminal after each restart).
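For example, a small sketch with the common RPi.GPIO library (the pin number is just an example): enabling the internal pull-up means the external resistor on the breadboard isn't needed.

    import RPi.GPIO as GPIO

    BUTTON_PIN = 17                            # hypothetical BCM pin number

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

    if GPIO.input(BUTTON_PIN) == GPIO.LOW:     # pulled to ground when pressed
        print("button pressed")
    GPIO.cleanup()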


Every time I run across these boards, I always think of the "Apollo Guidance Computer"[1], which was used on the Apollo spacecraft.

It had 16KB RAM and a 2000MHz CPU frequency, so I feel like I could build a spaceship with a couple of Teensies :D

[1] https://en.m.wikipedia.org/wiki/Apollo_Guidance_Computer


One note: they had a 2.048 MHz CPU, not a 2000 MHz CPU. 2000 MHz is only slightly below modern clock speeds.

