I believe you’re limited to 3 apps this way, as well as being limited to 10 App IDs (most popular apps will include multiple App IDs for things like extensions, widgets, etc).
If you have the $99/yr Apple Developer account that the user mentioned, those restrictions don’t apply.
It’s worth noting that this (proposed) law already only applies to large companies, as I noted upthread. This is something that a lot of folks in this thread are missing, but which I think is pretty crucial.
> Included in the rules' scope will be platforms with a market capitalization of €75 billion or turnover in the European Economic Area equal to or above €7.5 billion. [0]
> imagine WhatsApp was written and maintained by an individual? Would we be so keen to use terms like "force"? This is all negative in the freedom dimension.
These rules only apply to platforms with a market cap of over €75 billion or European Economic Area turnover of over €7.5 billion.[0] No one is proposing that single developers be required to work with Apple and Facebook to make their apps interoperable.
This (admittedly terrible, but now rectified) location flaw aside, what safety disadvantages does Telegram have over other communication platforms without end-to-end encryption like Discord, Teams, Slack, or Messenger?
> Telegram stores all your contacts, groups, media, and every message you've ever sent or received *in plaintext* on their servers. [emphasis mine]
This implies they don’t use encryption “at rest” - unless I’ve missed something in their FAQ[0] (entirely possible, I’m far from an expert on cryptography), they seem to imply they do.
If it is indeed the case that they don’t encrypt data at rest, I can definitely see how that would be a problem.
If data is encrypted at rest though, I don’t see how any of that is fundamentally different from the other messengers I listed in the parent - the server still holds the keys, and thus must be a trusted party - but it’s nothing new.
Even if they store the data encrypted on their servers and hold the keys - it is not different from plaintext.
There's another thing. Some years ago, the Russian FSB demanded the encryption keys from Telegram, threatening to ban it in Russia, and Telegram publicly refused. But then the FSB somehow quietly dropped the case. The question is: why?
Ultimately, what Moxie is doing here is disingenuous and an abuse of language to prop up his argument. He could have just stated the facts, but instead he's using propaganda to create fear in his audience. You can use correct language (messages are encrypted at rest) and still make the argument that Telegram does not use E2EE unless Secret Chats are turned on, but he doesn't do that.
Really poor behavior from a leader in this space.
The Russian FSB dropped the case because there was no way to block Telegram without collateral damage, and most of the Russian population uses it, including politicians. There's no need to invoke a "shadowy council" here, especially in light of Durov's quite public support for the Euromaidan protests that got him in such trouble with VK.
If this is the first place your head goes, I don't know what to tell you, except perhaps that this kind of paranoia from the security community is often not rational and frequently escalates to take-no-prisoners stakes.
Here's an article [1] that goes over the attempts at blocking Telegram after the FSB demanded the encryption keys and was refused, and the collateral damage that resulted from Roskomnadzor attempting to enforce that ban.
I ran this by my Russian friend and he confirms TJournal is reputable. However, he also cautioned against believing the claims of deputy Matveychev, a known propagandist. And toward the bottom of the article you sent:
"A source close to the creators of the messenger, however, doubted the deputy’s statement: when asked what Telegram thinks about Matveychev’s statement, he replied: 'Clowns.' This was reported in the online publication 'Durov's Code'."
It's difficult to believe that Durov, who was driven from Russia and from his first company for refusing to hand over information on Euromaidan protestors, would so jeopardize the trust he's built over the last decade by allowing hardware backdoors.
All of this about him refusing to cooperate is just Durov's word. So you're choosing to trust him for some reason. But that's not how security works - the zero-trust security model exists for a reason. Moxie is right: Telegram is not secure.
From how I understood it, they weren't able to properly block it. Or at least that is the official story. I'm skeptical about this whole ordeal, though.
edit: Some article about it says[0]
> Russia on Thursday lifted a ban on the Telegram messaging app that had failed to stop the widely-used programme operating despite being in force for more than two years.
> Some Russian media cast the move as a capitulation, but communications watchdog Roskomnadzor said it had acted because the app’s Russian founder, Pavel Durov, was prepared to cooperate in combating terrorism and extremism on the platform.
> Even if they store the data encrypted on their servers and hold the keys - it is not different from plaintext.
That's the important point. Encryption at rest is little more than a marketing gimmick if the same entity also has the key.
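To make that concrete, here's a minimal sketch in Python using the third-party `cryptography` package. The `MessageServer` class is purely hypothetical and not Telegram's actual design; it just illustrates that an operator who encrypts data at rest but also holds the key can read everything whenever it wants (or is compelled) to.

```python
# Minimal sketch (not Telegram's actual scheme): a server that encrypts
# messages "at rest" but also holds the key can decrypt them at will.
# Requires the third-party `cryptography` package.
from cryptography.fernet import Fernet


class MessageServer:
    def __init__(self):
        # The server generates and keeps the key itself.
        self.key = Fernet.generate_key()
        self.fernet = Fernet(self.key)
        self.storage = []  # ciphertexts "at rest"

    def store(self, plaintext: bytes) -> None:
        self.storage.append(self.fernet.encrypt(plaintext))

    def read_everything(self) -> list[bytes]:
        # Nothing stops the operator (or anyone who compels them)
        # from doing exactly this.
        return [self.fernet.decrypt(ct) for ct in self.storage]


server = MessageServer()
server.store(b"hi, this was supposed to be private")
print(server.read_everything())  # [b'hi, this was supposed to be private']
```

With end-to-end encryption the keys never leave the clients, so the server only ever stores ciphertext it cannot open - that's the property that "encrypted at rest" with server-held keys doesn't give you.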
Edit: also, Telegram is hoarding this data and nothing prevents them from using it for financial gain in the future. Or selling it/themselves to someone who does.
Better to be safe and not give this data to intermediaries in the first place. Signal does the right thing here.
Carriers are not currently using the spectrum that they gained in Auction 107, because of the issues that this article mentions. Their agreement with the FAA has given Verizon and AT&T the go-ahead to begin operations on this spectrum tomorrow (outside of existing FCC-approved test sites, which have been operating for months).
Which spectrum are you referring to that carriers have purchased and not deployed, if not 3.7–3.9 GHz?
If you don't have a stable internet connection, that could be the reason - macOS sends a hash of every executable to Apple's servers before it is opened[0]. This caused a major issue at the end of 2020, when these servers went down and Macs effectively stopped working unless they were disconnected from the internet[1].
macOS has also been updated so that syspolicyd bypasses VPNs and system firewalls like Little Snitch[2], so you can't easily block these connections now.
Your own links show that this likely isn't the parent's issue. It only sends a hash on the first run of an executable. I'm not saying the problem you're talking about isn't a problem or isn't concerning, but it's very likely not the problem they're talking about.
I hate when people say "mine works," but here's an `ls` of my homedir showing it's not universally slow. I currently have an absolute garbage network connection.
> ls -G 0.00s user 0.00s system 64% cpu 0.010 total
There are also many, many other reasons it could be slow, some macOS-specific and others not; most importantly, what they're seeing isn't universal. macOS often ships with very old GPL2 tools that can cause various problems (many people brew install updated GPL3 versions), and people often have configurations that can slow down `ls` by multiple factors: colors, sorting, etc. can each cause extra queries to disk or require the listing to complete before displaying output. Other culprits include customizations causing a slow prompt, a slow or corrupt disk, or listing a slow network drive.
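A quick way to separate "the stock `ls` binary is slow" from "my shell setup is slow" is to time the binary directly, bypassing aliases and the prompt. A rough sketch - the flags and paths here are just examples of what you might compare:

```python
# Rough timing harness: run the stock /bin/ls directly so shell aliases,
# prompt hooks, and terminal rendering are taken out of the picture.
import subprocess
import time

COMMANDS = [
    ["/bin/ls"],          # plain listing
    ["/bin/ls", "-G"],    # BSD/macOS colorized output
    ["/bin/ls", "-lT"],   # long format with full timestamps
]

for cmd in COMMANDS:
    start = time.perf_counter()
    subprocess.run(cmd, stdout=subprocess.DEVNULL, check=True)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{' '.join(cmd):<15} {elapsed_ms:.1f} ms")
```

If these all come back in a few milliseconds but interactive `ls` feels slow, the culprit is more likely an alias, a replacement binary, or the prompt, not the filesystem.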
The VPN bypass was very quickly removed from macOS over a year ago [1]. So it would only be relevant if they were using a very old version of Big Sur.
I'll jump in here and say that it probably _is_ notarization. The issues arise when macOS thinks it can get a connection to OCSP but, due to real-world conditions, it actually can't. This can cause a delay of up to 5 seconds while it times out.
Some specific examples (a rough sketch of why the blocked and unreachable cases differ so much follows the list):
No internet connection: instant failover
OCSP blocked by a firewall or similar: instant failover
Slow internet but OCSP still reachable: slow start, 1+ seconds
Bad internet, OCSP not reachable: 3-5 second delay while it waits
Normal internet, OCSP reachable: <1 second delay
trustd disabled: nothing will start; single-user mode and a trustd restore are required
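The asymmetry above isn't Apple-specific, it's basic TCP behavior: a connection that gets actively refused fails immediately, while one whose packets are silently dropped hangs until a timeout fires. Here's a minimal Python sketch of that difference; the hosts, ports, and 5-second timeout are placeholders for illustration, not what trustd actually uses.

```python
# Sketch of why "blocked" vs. "unreachable" feel so different: an active
# refusal (TCP RST) returns immediately, while dropped packets hang until
# a timeout fires. Host/port/timeout here are placeholders, not what
# trustd actually uses.
import socket
import time

def try_connect(host: str, port: int, timeout: float = 5.0) -> None:
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            outcome = "connected"
    except ConnectionRefusedError:
        outcome = "refused (instant failover)"
    except socket.timeout:
        outcome = f"timed out after {timeout:.0f}s (the painful case)"
    except OSError as exc:
        outcome = f"error: {exc}"
    print(f"{host}:{port} -> {outcome} in {time.perf_counter() - start:.2f}s")

try_connect("127.0.0.1", 9)      # nothing listening: refused almost instantly
try_connect("10.255.255.1", 80)  # typically unroutable: hangs until timeout
```

That's why a hard-blocked OCSP endpoint is relatively painless while a flaky connection is the worst case.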
I've experienced all of these, and it's one of the reasons I have a shiny new Framework laptop sitting here waiting for me to migrate over to it.
Also the "only on first run" also isn't true. It periodically checks for certificate revocation (as it should) and therefore will cause issues at sporadic intervals.
And the kicker, of course, is that all of this goes over plain ol' HTTP, so anyone watching can tell which developers' programs you're starting from the hash.
For what it's worth, on most computers I've daily driven with Linux, tweaking the BIOS has been the norm. At minimum, I've had to disable Secure Boot on most distros.
I agree with the idea that it _should_ work decently without any of that - but as of now, that is not the status quo.
[0] https://www.reddit.com/r/discordapp/comments/u4kn2k/alpha_12...