Imagine archeologists in 1000 years unearthing one of those and trying to figure out its purpose or cultural meaning… (and no, your data won’t survive)
I’ve seen several videos by reputable fixers that demonstrate empty NANDs working fine with DFU factory restore.
Same situation on M-series MacBooks.
It would be odd if the flash had to be pre-programmed somewhere in the actual mass-production process; IMO a single DFU pass must be able to do everything needed.
From a couple of videos I've watched, that works with some older versions of the DFU restore software but not newer ones. It might be an arbitrary restriction in the DFU update software rather than the hardware. I'm sure the fixers know this and work around it when doing these flash swaps.
Also, if there are two flash chips, they need to be installed in a certain order. I'm not sure of the precise rationale behind that; I doubt it's a hardware difference.
While the post is great, terrifying, and seems to contain only true and verifiable information, I’m not sure what we expect.
„Normal“ people will not read this, nor be able to understand, nor gauge or grasp the impact. It’s become way too complex. We can’t simply stop using the mentioned services as a society anymore.
Wouldn’t it be more reasonable to teach:
1. You have no privacy, it is impossible to ensure or guarantee privacy, and there’s no incentive at all for anyone to ensure privacy. (Scott McNealy of Sun said that already in the late 1990s)
2. There is no security and every kind of security has been, was designed to, or will be compromised.
3. All your digital information is already public or will become public at some point. (btw: Every top-tier consultancy operates under that assumption)
> „Normal“ people will not read this, nor be able to understand, nor gauge or grasp the impact.
Disagree. You don't need 10 years in IT to understand the meaning of: "M$ allowed customers to use their house-keys to open everyone's office safe, lied about it for 2 years, and still doesn't have a plan for fixing it".
McNealy was simply wrong, but despair is easier than fixing things, so a lot of people went with despair. The popularity of cloud and SaaS is the result. But this isn't a foretold destiny; just don't "trust" people you don't actually trust.
Those 3 points are only teaching despair. The more useful thing to teach is who we can blame, and how to reclaim actual privacy and security… even if it means using the dreaded regulation hammer.
None of which will change those three points practically.
For any given bit of information they may not apply, but if you assume they’re true you’ll:
1) not record truly damaging information in a damaging way (which is really good practice in general if you’ve got something to lose!),
2) adopt practical operational practices which do not rely on these points being false - which is a really good idea if that actually matters (you have actual enemies somewhere), and
3) focus on safety and building value in areas which are not mere information at rest, which is good modern practice.
Osama Bin Laden already knew all this, which is why it took so long to find him. A decade or so. I guarantee you the CIA has been learning this with all their leaks. The FBI learned this after COINTELPRO.
What is not written down can’t show up as a grainy photocopy in the New York Times, or a viral video from Wikileaks, or whatever.
What you’re talking about is a hammer to use to punish someone after a leak. But by then it’s far too late for anything actually valuable.
Necessary and important for ‘day to day’ stuff like bank account balances, I guess, as long as you assume they’ll be violated with little practical recourse if there’s anything actually valuable in them.
Government regulation is what created and propped up SolarWinds.
I have to believe it's possible, but I have never seen any reasonable proposal for government regulation of infosec. Even disclosure requirements become bullshit and only harm everyone faster than they can get published.
While it's true that the best way to keep a secret is to keep it off the internet, regulation could absolutely improve the prospects of keeping secrets by requiring encryption in every context, imposing heavy penalties on companies that fail to properly secure sensitive data (much heavier than what we currently see, up to the corporate death penalty), and enshrining in law the people's right to strong encryption.
The best way to keep a secret is to never write it down, period. Or tell anyone.
If you do have to write it down (for practical reasons), it’s best to assume it will be leaked eventually and write it down with that in mind.
Even better: in your operational assumptions, assume it will be leaked shortly after it’s written down, and build in ways to work around that.
So, for instance, key material should be easy to revoke, rotate, etc.
Operational rules should be easy to update/push new versions, etc.
Authentication shouldn’t rely on parroting a well-known value (SSN, a plaintext shared secret, a biometric, etc.), and should be easily changeable/rotatable.
Most of these we’ve been steadily baking into our day to day lives anyway.
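As a rough sketch of the authentication point (a minimal Python example using only the standard hmac and secrets modules; the key store and function names are illustrative, not any particular system's API), challenge-response login never transmits the secret itself, and rotating it is just replacing a stored key:

    import hmac, hashlib, secrets

    # The verifier holds a per-user key that can be rotated at any time;
    # the key itself never crosses the wire.
    user_keys = {"alice": secrets.token_bytes(32)}

    def issue_challenge() -> bytes:
        # Fresh random nonce for every login attempt.
        return secrets.token_bytes(16)

    def client_response(key: bytes, challenge: bytes) -> bytes:
        # The client proves possession of the key by MACing the challenge.
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def verify(user: str, challenge: bytes, response: bytes) -> bool:
        expected = hmac.new(user_keys[user], challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = issue_challenge()
    assert verify("alice", challenge, client_response(user_keys["alice"], challenge))

    # Rotation: replace the stored key. Nothing the user "parrots"
    # (like an SSN) ever has to change hands.
    user_keys["alice"] = secrets.token_bytes(32)

Contrast that with an SSN or plaintext shared secret: once it leaks, there is no equivalent of "just rotate the key".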
What you’re talking about is necessary, but insufficient for anyone who has a secret they actually need to keep, at least in the modern world. None of those penalties are ever likely to materialize either, because no one wants to pay them, and they know they would end up paying them at some point, since anything else is just not how the world works - breaches happen eventually.
For classified top secret information all those rules apply in some form, yet we’ve had numerous high profile leaks of TS information for years. The intelligence apparatus has done everything they can to destroy said leakers, but with limited success - and those secrets are still out there.
And that is without financial incentive!
That’s all. Most folks won’t have those kinds of secrets thankfully! And when they do, they usually just don’t tell anyone.
WTF?
I would only expect this view from an organization pushing for total transparency (like the advertising industry or national security agencies) or from somebody brainwashed by them. There is no need for such despair yet.
I don't think any of the points are true:
1. People can still have guaranteed privacy (e.g. going into the woods with no devices). As with many laws, an incentive to ensure the privacy of others could be punishment in case of failure.
2. There is no absolute security, but there is security against certain threat models.
3. Why would data I keep on a device that is not connected to any network ever get public?
>While the post is great, terrifying, and seems to contain only true and verifiable information, I’m not sure what we expect.
Well we expect people and corporations to fix a problem when confronted with it. That is what we expect.
> „Normal“ people will not read this, nor be able to understand, nor gauge or grasp the impact. It’s become way too complex. We can’t simply stop using the mentioned services as a society anymore.
I have to give you a pass on "normal" people; I don't know any. But I see no reason why we cannot go without the (by the way, unmentioned) services, or why we cannot change them to be more privacy-conscious.
>Wouldn’t it be more reasonable to teach:
No, it would be more reasonable to teach that privacy is vitally important for a functioning society and economy. Anyone claiming otherwise thinks they can exploit the information disparity between you and them to make money in the short term.
>1. You have no privacy, it is impossible to ensure or guarantee privacy, and there’s no incentive at all for anyone to ensure privacy. (Scott McNealy of Sun said that already in the late 1990s).
Well, I respect Scott, but this is not his greatest moment. Let's swap one word; by the same logic it should still be completely true: you have no property, it is impossible to ensure or guarantee property, and there's no incentive at all for anyone to ensure property. Yet we did find a way to actually ensure property. It is called the law (and a government to enforce it). Just an idea: use this tried and tested concept on privacy as well.
>2. There is no security and every kind of security has been, was designed to, or will be compromised.
First this has always been true. Every lock can be picked. Fortunately not everyone can pick a lock. That is the reason why most of us still lock the door.
>3. All your digital information is already public or will become public at some point. (btw: Every top-tier consultancy operates under that assumption)
You mean those top-tier consultancy firms mentioned in "The Big Con" by Mazzucato and Collington (Penguin, 2023)? I can see that they sell the assumption, but they are not operating by it. If it were true, McKinsey, for example, would have known their advice to Purdue Pharma would become public, and that they would lose big on it.
In short, people who claim privacy is not important mean: _your privacy_ is not important, and they are overly confident they can stay ahead of the information disparity to keep themselves private. See how hard, ironically, Google is working to keep all their information private in a public antitrust trial.
This is abysmal advice (and potentially self-serving advice, if you work in the industry) to give. As ever, there are nuances; "only a Sith deals in absolutes" and all that.
#1. You have no privacy ONLINE. Providers have perverse incentives to sell you down the river. Therefore, you DEFEND yourself by keeping a shallow online presence. If you are a casual user, you keep as little information online, especially on social media, as possible. If you need an online presence, you ASSESS the risks and pay time and money to MITIGATE those risks. If you don't see a Return On Investment on those mitigation efforts, chances are you have been CONNED into thinking you need an online presence, but you probably DON'T.
#2. There is no ABSOLUTE security. All possible defense measures CAN be circumvented, but will not necessarily BE circumvented. You ASSESS as many risks as you can imagine, and MITIGATE only those where you expect a positive ROI. The ones you don't mitigate, you ASSUME. The ones you cannot afford to assume, you DO NOT TAKE, by refusing to use the system.
#2.a Corollary to #2. Even if you do ZERO risk management, you still have a BASELINE level of security based on the risk-reward analysis of the criminogenic/sociopathic portion of the population; they will not attempt an intrusion if they do not expect to get away with it, or to gain something from it. The more people in the know cynically claim there's no security, the more this baseline approaches zero and the more vulnerable the general population becomes.
#2.b Even if you are not part of the general population, the lower the BASELINE, the more time and money you PERSONALLY have to invest in risk management to achieve a bearable level of safety. Cynicism is costing US time and money, pal; don't pee in the village well just because it looks edgy!!!
#3. All your CURRENT digital information is already public or will become public AT SOME POINT. You can do better and pick technologies that push that point FURTHER into the FUTURE. And for information that is not yet digitized, you can make a conscious decision about whether the convenience is worth the risk.
I think a new approach to privacy is likely around the corner. Why have one conversation with somebody when you can have as many as you want all at once?
There were already add-ons like that, which created garbage traffic, a while ago. They just weren't practical without language networks.
I keep my secrets in a safe with an old school lock.
My elderly aunt keeps her secrets on a notepad in her desk. I suppose a spy or a housecleaner (if she had one) could know her secrets but it won't be "hacked".
The whole "you have no privacy or no security" is false and only impacts the terminally online.
Do what the intelligence agencies do. Stop letting other people store your secrets. Put them in a nice heavy locking box. Guard them with a firearm.
I think that would be a bit simplistic - a burglar who specifically wants your personal digital secrets could put a hidden camera on your ceiling, a bug between your PC and USB keyboard, or just hold you hostage for it! Having a safe is pretty useful, but is neither a guarantee of security nor strictly necessary.
Having a firearm only works as protection if (A) you are present and armed 24/7 to protect your safe, (B) you are actually willing to shoot, and (C) you are capable of doing so better than your assailant.
In a business context, if the company is large enough, it might well be worth hiring round-the-clock security guards and installing heavy steel safes. But for the average PC user, security can be improved much more effectively with simple measures like creating passwords with 'diceware' or using separate accounts for financial tasks.
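For the diceware bit, a minimal sketch (assuming Python and a local copy of a word list such as the EFF large word list; the filename below is just a placeholder):

    import secrets

    # Diceware-style passphrase generator. Assumes a word list file with one
    # word per line; the path below is a placeholder, not a standard location.
    WORDLIST_PATH = "eff_large_wordlist.txt"

    def load_words(path: str) -> list[str]:
        with open(path) as f:
            # Keep only the word itself, in case lines look like "11111<TAB>abacus".
            return [line.split()[-1] for line in f if line.strip()]

    def diceware_passphrase(words: list[str], n_words: int = 6) -> str:
        # secrets.choice uses a cryptographically secure RNG, unlike random.choice.
        return " ".join(secrets.choice(words) for _ in range(n_words))

    words = load_words(WORDLIST_PATH)
    print(diceware_passphrase(words))
    # With the 7776-word EFF list, six words give about 6 * log2(7776) ≈ 77 bits
    # of entropy, far beyond what an online guessing attack will cover.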
Almost no data breaches are targeted at a single user.
The value of your personal info individually is $1? Maybe $4?
If you can hit someone who has 100k records, hey that's a solid payday.
But no thief is gonna break into a safe, risk being shot by an angry homeowner, or kick off a targeted attack over... $4. Even your flatscreen TV is worth more and is MUCH easier to steal.
Almost all adversaries don't care about a specific target. They want an easy target. A safe + upset well armed owner is not an easy target.
I think the headline is stressing the wrong thing.
Whilst it's true that most available cars[1] have an upper charging limit well below the common charging station max of 250kW, not all charging stations are created equal.
There are a lot of smaller charging points that have to be small because the grid can't supply 250kW, so you only get 50kW or 9kW.
Even if you do have the grid capacity to supply a few high-current points, you might be able to double the number with localised energy storage, flattening that peak demand when all points start charging at the same time.
Right now there's a lot of contention and queuing for those high capacity stations at certain times of day.
As long as the cost of an energy storage system, regardless of how it works, is lower than the cost of expanding the grid locally, it's a viable option.
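A rough back-of-the-envelope sketch of that trade-off (all numbers here are illustrative assumptions, not real site data):

    # Can a limited grid connection plus a local battery cover a busy hour
    # of fast charging without a grid upgrade?
    grid_kw = 200            # continuous grid connection limit
    chargers = 4             # number of charging points
    per_charger_kw = 150     # peak draw per point
    peak_hours = 1.0         # length of the busy period
    battery_kwh = 400        # local storage capacity

    peak_demand_kw = chargers * per_charger_kw        # 600 kW
    shortfall_kw = max(0, peak_demand_kw - grid_kw)   # 400 kW the grid can't supply
    energy_needed_kwh = shortfall_kw * peak_hours     # 400 kWh drawn from the battery

    print(f"Peak demand {peak_demand_kw} kW, grid covers {grid_kw} kW")
    print(f"Battery must supply {energy_needed_kwh} kWh during the peak; "
          f"{'OK' if battery_kwh >= energy_needed_kwh else 'too small'} at {battery_kwh} kWh")
    # Off-peak, the battery recharges from the same 200 kW connection, so the
    # site avoids upgrading the grid feed to the full 600 kW peak.

If a battery of that size (plus the smaller grid connection) costs less than pulling a 600kW feed to the site, the storage wins.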
1. e.g. the Audi e-tron can only charge at a max of 150kW, and the rate drops off as the battery fills.