> In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage. These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software.
> Files will only be returned for accounts that have been active installs for some time already, and only probabilistically in low percentages based on phone number sharding. We have a few different versions of files that we think are aesthetically pleasing, and will iterate through those slowly over time.
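Purely as an illustration, "probabilistically in low percentages based on phone number sharding" could look something like the sketch below. The hash, salt, bucket count, and percentage are all assumptions for the sake of the example, not anything Signal has published.

    import hashlib

    def in_rollout(phone_number: str, percent: float, salt: str = "aesthetic-files") -> bool:
        # Deterministically map a phone number to one of 10,000 buckets and
        # check whether it falls inside the rollout percentage.
        digest = hashlib.sha256((salt + phone_number).encode()).hexdigest()
        bucket = int(digest, 16) % 10_000      # 0.01% granularity
        return bucket < percent * 100          # e.g. percent=5 -> buckets 0..499

    # Example: fetch the "aesthetic" files for roughly 5% of eligible installs.
    print(in_rollout("+15551234567", percent=5))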
Pretty sure it's the former, since the above is a way to ensure that Cellebrite can't just gather all the implied exploit files and make sure those specific problems are all patched. This is, quite literally, an informational attempt at guerrilla/asymmetric warfare: Signal is trying to make engaging with them too costly, while also landing a few blows quite a bit above its weight class. Cellebrite now has to decide whether to keep after an adversary that is hard to pin down, ambushes them, and has shown it can hit them hard where it matters (credibility, and thus their pocketbook).
This indeed looks like a FUD statement, implying that they could have an infinite number of potential vulnerabilities. Realistically, though, writing parsers that do not yield control of your whole device is not that complex. The people exploiting iOS zero-days can certainly do it.
You're not wrong at all, but if they're shipping these garbage ancient versions of ffmpeg, there are likely oodles of other bugs lurking around. And if Cellebrite acts like most other companies that have had their awful security exposed, they will fix only this bug and leave everything else as is.
Signal is capable of finding more exploits given more time. The important piece is that there is now reasonable doubt about data coming from Cellebrite, so it is not so good as evidence.
Nah, Cellebrite will panic for a bit at the possibility of facing repercussions but ultimately not commit enough effort to change anything. Cellebrite's counterparties, however, might not be so complacent.
I imagine many fellow app vendors, who may or may not maintain good relationships with Signal, might also have found a USB drive containing the relevant data on the street. (Pure speculation; I don't know anything about Moxie, but judging by his tone, I wouldn't be shocked.)
They are implying that future versions of Signal will drop random files on your phone that "may or may not" cause damage to Cellebrite systems.
They are basically putting out the threat that if you use Cellebrite on Signal in the future, you might not get the data you expect, and at worst it may corrupt the report/evidence.
This also brings into question the chain of custody, as an untrusted device being imaged can alter reports of unrelated devices.
Damn, a chain of custody where the thing in evidence is also part of not only its own chain but also those of other evidence acquired afterwards? I can't imagine what kind of case law exists around that, but I'm sure it's hilarious!
Which is what I don't really understand - it seems like Cellebrite could spin this in their favor so law enforcement would need to purchase a new kit for each device?
More realistically, it is like dropping a file named DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe on your private file server. You never run it, but maybe somebody exploits your file server, gets all your files, and automatically runs them?
It really is like dropping a file named DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe on your private file server - but contrary to your expectations, it's not just "oh well". If you placed it there with the intent to trap someone you expect to be looking at your computer, you may well be liable if their computer blows up. There's no significant difference from active retaliation: the consequences are there, the intent is there, the act is there; it's pretty much the same.
Of course, if some criminal exploits your file server, they are not likely to press charges, but if it triggers on law enforcement who have a warrant to scan your fileserver, that's a different issue.
You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.
The beauty, though, is that law enforcement now can't even know, before plugging in and scanning a device, whether they'll actually be pwned.
They have to run the tool, and risk the exploit, just to figure out whether the phone can nuke that hardware's future usability, or the integrity of any locally stored, non-offsited data.
UNLESS Cellebrite can publicly produce, for a court of law, proof that any potential exploit isn't a valid concern, which means spilling implementation details about how their device works.
Nobody can continue to shut up AND maintain the status quo. Either everyone clams up, and Signal can sow reasonable doubt without challenge, crippling Cellebrite's value as a forensic tool; or someone has to open up about the details of their tool, which, like it or not, will speak very loudly about the ways and methods behind these exploits.
The Checkmate is implied, and oh my, is it deafening.
> if you placed it there with the intent to trap someone who you expect to be looking at your computer, you may well be liable if their computer blows up
Liable for what? You haven’t promised that the code is safe, and they chose to run it.
> there's no significant difference from active retaliation
There is a significant difference: in active retaliation you choose to attack someone else's computer; with a trap file, the attacker chooses to run files they have stolen from you. Big difference.
> You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.
The reasoning is different: lethal or injurious mantraps are prohibited because you don't respond to trespassing with lethal force, and because you don't know who or what may trigger the trap. Mantraps that lock the intruder in a room without injuring them are fine, and are used in high-security installations.
I wish there were a way to see which root site set those cookies. For example, I wish we could see that twtracker.com supercookies are set in some iframe on twitter.com.
With this, devices that use nRF52 chips are now open to investigators. I think we'll learn of more vulnerabilities in BLE devices whose shitty implementations are hidden inside those SoCs. I'm more than excited about the next post, about the Logitech Pro G mouse.
Making things open is a good thing for society's security.
That is, if they were locked in the first place.
Also, with a lot of devices being firmware-upgradable, there is little point in enabling read-out protection if you can just download the firmware off the internet. (Unless you want to go through all the hassle of encrypting the firmware image, but most devices won't be doing anything special enough to make that worthwhile.)
Nordic provides some easy-to-use tools and examples for encrypting and signing firmware images when using a bootloader for in-field updates. I would expect that most products based on the nRF52 that support firmware updates encrypt the image.
Nordic's off-the-shelf firmware upgrade process has signed image verification only. The image itself sent over BLE is not encrypted. So anyone using that right off the bat is in for a nice surprise.
Partially because they call their firmware upgrade process "secure Device Firmware Update (DFU) functionality" (lifted from their documentation). Obviously, an engineer needs to go read the source to see what is actually happening under the hood.
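To make the distinction concrete: a signature gives the bootloader integrity and authenticity, but confidentiality only comes from encrypting the image itself. A rough sketch of the two properties, using Ed25519 and AES-GCM purely as illustrative stand-ins (this is not Nordic's actual DFU packet format):

    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    firmware = b"\x00" * 1024  # stand-in for an application image

    # Signed-only DFU: anyone who sniffs the BLE transfer can read the image;
    # the signature only lets the bootloader verify it came from the vendor.
    signing_key = Ed25519PrivateKey.generate()
    signature = signing_key.sign(firmware)
    signing_key.public_key().verify(signature, firmware)  # raises if tampered with

    # Signed *and* encrypted DFU: without the per-product key, a captured
    # transfer (or a downloaded update file) is just opaque ciphertext.
    device_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(device_key).encrypt(nonce, firmware, signature)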
Other than Signal, I also recommend Threema. It doesn't rely on mobile numbers, it's possible to configure it to run on your own private server, etc. It's just not free (as in beer). Also, it's from Switzerland, a country that respects your privacy more than the USA does[0].
Since Antifa will be designated as a terrorist organization[0], I don't suggest you trust GitHub Pages, Google Photos, Drive, etc. Tomorrow there may be a subpoena for the IP addresses that used this tool. It may not be enough proof, but it'll cost you a lot of money and time. I'd use local tools like exiftool or GIMP.
I hate bloated OSes, and unfortunately macOS is one of them. I know everyone wants everything to work out of the box, and I know it's very natural to want that, but I cringe when I find my OS doing something behind my back. That's why I'd never use Windows, macOS, Ubuntu, etc. They all violate my privacy and slow my system down to do so.
I use Debian; I like Debian. When I run Wireshark, I don't see unknown requests destined for debian.com. That is the definition of simplicity for me. And yes, it doesn't always work out of the box; you have to install some drivers and change some configurations, but it's getting better and easier. Then again, I'm a software developer, so I understand and like that stuff.
> Linux was always a disaster in terms of user experience and isn't improving.
No, you can't define it as a disaster; it's not. If you're an end user who understands nothing about computers, maybe you can, but otherwise it's not a disaster. It's just harder, and it's getting easier by the day.
Actually, I'm pleased eBay is doing this. It wasn't a new issue, but now that eBay is doing it, it's getting a lot of attention. It's like disclosing a security issue in the WebSocket protocol: now I'm sure the next releases of most browsers will fix it.
Very nice! I only have one question: I see there are pull-up resistors connected to the push buttons. Aren't there built-in pull-ups on the RPi GPIOs? Maybe you could enable them and use fewer components on the breadboard.
Can confirm that the internal pull-ups/pull-downs are there, and most libraries can enable them (alternatively, this can be done manually via the terminal after each restart).
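For reference, enabling the internal pull-up from Python with RPi.GPIO looks roughly like this (the pin number is just an example; adapt it to wherever the button is actually wired):

    import RPi.GPIO as GPIO

    BUTTON_PIN = 17  # example BCM pin number

    GPIO.setmode(GPIO.BCM)
    # With the internal pull-up enabled, the pin idles HIGH and the button
    # pulls it to ground, so no external resistor is needed on the breadboard.
    GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

    print("pressed" if GPIO.input(BUTTON_PIN) == GPIO.LOW else "not pressed")
    GPIO.cleanup()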
I wish I could see those files in action...