If you're wondering about that one from a few weeks ago: one person from Intel noticed the article and confirmed the kernel bug. They ran some of the kernel benchmarks on their 24-core CPU and found up to 15% improvement with the fix (the biggest difference was on the scheduler FIFO benchmark, though most benchmarks showed 0% difference).
I've run the same on my computer at home and also got a double-digit percent difference on a 32-core AMD.
I thought the basic premise, that the kernel only uses a maximum of 8 cores, was wrong. It was rather some scheduling logic that was implemented with a maximum of 8 cores in mind. So the author may have been right that something was wrong, but also wrong about the details (like in this article).
Sorry about that. It's a WordPress website on the official wordpress.com hosting; it's actually WordPress that is doing all this tracking out of the box.
I've been trying to find a legal page with a legal contact for GitHub since the topic was opened, with no success.
The only contact information I can find is this email for privacy requests, which should be good enough since they have to process the legal requests they receive: privacy@github.com
It's not clear from your messages whether you are a subscriber, your organization is a subscriber, both, or neither. This affects how you can access support, how to escalate, and what claims you may have (your company should have a contract with access to enterprise support if they are a customer).
It's not reasonable for GitHub to ban you with no justification and no recourse and make you lose your job.
Get a lawyer yourself. Or get your company to escalate through their support channel or legal.
Warning: we only have one side of the story. If you were posting abusive messages to GitHub in your name and/or in the company's name, on company time, the company may review the messages, find them abusive too, and fire you.
Some messages could be viewed as flaming or somewhat abusive, in the sense of criticizing a msft open source project, but they were absolutely not made in the company's name. Obviously I deeply regret any trouble they could have caused.
Msft said it was a GitHub decision; they can't interfere.
The case concerns a person who was arrested for drug possession and trafficking. They were requested to give their passcode to unlock 2 phones allegedly used for trafficking; they refused and were then further charged for not giving their passcode.
1) 15th May 2018 - The first court ruled on the drug trafficking but rejected the charge for not giving the passcode to unlock the phone, considering that a screen passcode is not a cryptographic means of making the data on the phone unreadable or inaccessible.
2) 11th July 2019 - Escalated to the Court of Appeal; same result.
3) 13th October 2020 - Escalated to the cour de cassation, which ruled that the law was incorrectly applied and sent the case back to the court. The cour de cassation doesn't decide cases itself; it only rules on whether a specific law was correctly applied by the court. (A decision of the cour de cassation, like this one, explains how a law is meant to be interpreted and applied by the courts.)
4) 20th April 2021 - The Court of Appeal repeated the initial result (a home screen passcode is not a cryptographic means of protecting data) and dismissed the charge AGAIN.
5) Yesterday - Escalated to the cour de cassation AGAIN, which ruled that the law was incorrectly applied AGAIN, and sent the case back to the court AGAIN.
6) Future - This is pending another trial before the Court of Appeal.
My understanding of the cour de cassation's explanation is that the home screen passcode may or may not constitute a cryptographic means of making the data unreadable or inaccessible; that depends on the phone. The court needs to rule on whether it does for that specific phone in that specific case.
For the HN audience, who are technical and some of whom actually make the phones: most modern phones, including all Apple and most Android devices, have cryptographic means to protect all the data on the phone; it's effectively not possible to access contacts, messages, photos, storage, etc. without having the home screen passcode. (Please consider that historically it was often possible to take out the SIM card or the SD storage card, or use other tools, to read the content of the phone, but not anymore.)
My understanding is that the next ruling will have to consider whether these technical protections render the data inaccessible to the police. If yes, and the data is deemed required for a criminal investigation, the suspect is required by law to disclose their passcode, or risk up to 3 years of prison and a 270 000 euro fine.
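To make the "cryptographic means" point concrete: on modern phones the key that encrypts the data is derived from the user's passcode, typically entangled with a key fused into the hardware, so without the passcode there is simply no key with which to read the data. A minimal sketch of the idea, assuming a PBKDF2-style derivation and a stand-in hardware key; this is illustrative only, not Apple's or Android's actual scheme:

```python
# Illustrative sketch only: the key protecting the data is derived from the passcode,
# so without the passcode there is no practical way to recover the data.
import hashlib
import os

HARDWARE_KEY = os.urandom(32)   # stand-in for a device-unique key fused into the SoC

def derive_data_key(passcode: str, salt: bytes) -> bytes:
    # Entangle the passcode with the hardware-bound key; a slow KDF makes
    # off-device guessing expensive, and the hardware key cannot be extracted.
    return hashlib.pbkdf2_hmac("sha256",
                               passcode.encode() + HARDWARE_KEY,
                               salt,
                               100_000)

salt = os.urandom(16)
key_at_setup = derive_data_key("483915", salt)           # hypothetical passcode
assert derive_data_key("483915", salt) == key_at_setup   # right code -> same key -> data readable
assert derive_data_key("000000", salt) != key_at_setup   # wrong code -> wrong key -> data unreadable
```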
Wait, is refusing to give up your encryption keys actually a crime in France (not only the UK)? I thought (though it’s been several years since I’ve looked that up) it was only an aggravating circumstance if the encrypted material in question has been used to commit a different crime and you have been convicted of that.
It can be, under article 434-15-2, which is about exactly that. Today's decision is an explanation to French courts of how to interpret and apply this law. The context is a person formally arrested for drug possession and trafficking, who was formally requested under this law to unlock their phones (allegedly used for drug trafficking) and refused.
Rough quick translation: "Punishable by 3 years of prison and a 270 000 euro fine is the act, for anyone with knowledge of the secret decryption convention of a cryptographic means likely to have been used to prepare, facilitate or commit a crime or offence, of refusing to hand over said convention to the judicial authorities or to apply it, upon official requests from these authorities issued under Titles II and III of Book I of the code of criminal procedure.
If the refusal occurs when handing over or applying the convention would have made it possible to prevent a crime or offence or to limit its effects, the punishment is increased to 5 years of prison and a 450 000 euro fine".
French: "Est puni de trois ans d'emprisonnement et de 270 000 € d'amende le fait, pour quiconque ayant connaissance de la convention secrète de déchiffrement d’un moyen de cryptologie susceptible d'avoir été utilisé pour préparer, faciliter ou commettre un crime ou un délit, de refuser de remettre ladite convention aux autorités judiciaires ou de la mettre en oeuvre, sur les réquisitions de ces autorités délivrées en application des titres II et III du livre Ier du code de procédure pénale.
Si le refus est opposé alors que la remise ou la mise en oeuvre de la convention aurait permis d'éviter la commission d'un crime ou d'un délit ou d'en limiter les effets, la peine est portée à cinq ans d'emprisonnement et à 450 000 € d'amende."
> the lower court (Cour d'Appel) ruled that the passcode is not a "cryptographic convention" (which both the Algorithm and Private Key would classify as), and consequently that the person is not guilty.
> The general prosecutor, not happy with this verdict, appealed to the higher court (Cour de Cassation), arguing that the lower court violated the law by insufficiently researching IF, on the concerned iPhone 4, the passcode is a "cryptographic convention"
Because when a Cour d'Appel applies a law, as in this case, without even researching whether this specific law is applicable to this specific element, its ruling can be broken by the higher court.
The Cour d'Appel did not even have to be "right" or sufficiently technically competent.
The Cour d'Appel only had to declare that it researched IF, on this phone, the passcode was a "cryptographic convention".
If the Cour d'Appel had declared such a thing, EVEN IF IT WERE BLATANTLY FALSE (I'm not arguing for the correctness of that statement myself), then the Cour d'Appel would be deemed to have stated its sovereign judgment on this matter.
On such a matter, the Cour d'Appel could not be overridden by the higher Cour de Cassation.
(The Cour de Cassation cannot re-evaluate the sovereign judgment of the Cour d'Appel.)
BUT, the Cour d'Appel intended to apply the "refusing to yield the cryptographic convention == bad" law without even researching beforehand IF this was REALLY a "cryptographic convention".
The general prosecutor leveraged this oversight by asking the Cour de Cassation to break the lower court judgment.
He won. The Cour de Cassation broke the lower court ruling and sent the case back to the lower court again.
The break ruling is:
> By affirming that the passcode is not a "cryptographic convention", WITHOUT analysing the technical characteristics of the concerned iPhone 4, which were essential to reach a decision, the lower court insufficiently justified its decision
==== What I have to say on this matter
It's an old iPhone. I'm a bit too lazy to Google what the passcode does on the range of iOS versions supported on such an old phone.
A 4-8 digit passcode is not enough to be secure. That's weak as hell.
That's only 10^8 possibilities, and the Private Key can be brute-forced in 1 second.
Still, IF on this old iPhone the weak-as-hell passcode was the Private Key of the encrypted data, then it could be deemed a "cryptographic convention", and the person could be deemed guilty.
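For what it's worth, the 10^8 figure is just the number of possible codes; whether that is "breakable" depends entirely on how fast guesses can be made. A back-of-the-envelope sketch, with guess rates that are illustrative assumptions rather than measurements of any real device:

```python
# How long to try every numeric passcode? Guess rates below are illustrative assumptions.
def worst_case_seconds(digits: int, guesses_per_second: float) -> float:
    """Seconds needed to exhaust every passcode of the given length."""
    return 10 ** digits / guesses_per_second

# Offline attack on a key derived only from the passcode (no hardware key, no attempt
# limit), at ~100 million guesses per second: about one second for 8 digits.
print(worst_case_seconds(8, 1e8))                        # 1.0

# Typing guesses on the device itself, one attempt every 5 seconds, no lockout:
print(worst_case_seconds(8, 1 / 5) / (3600 * 24 * 365))  # ~15.9 years

# A 4-digit code is trivial either way:
print(worst_case_seconds(4, 1 / 5) / 3600)               # ~13.9 hours
```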
On a RECENT iPhone, I think this person could escape being found guilty for not giving their home screen password or code.
On a RECENT iPhone, those weak codes (4-8 digits) are NOT part of a "convention de déchiffrement".
The passcode is neither the crypto algorithm nor the Private Key to the data.
On a recent iPhone, the password is ONLY a key to a safe: the Secure Enclave (the dedicated security coprocessor; the T2 is the equivalent chip on Macs).
The Secure Enclave, even in recovery mode, is only reachable through an API, and only accepts ~10 passcode attempts.
When you succeed, you are given a means to decipher the data. I don't even know if:
- the Secure Enclave yields back the Private Key
- or just provides a hardware API to further decrypt the data.
What I mean is that on a recent iPhone, the passcode is NOT part of the "cryptographic convention".
It only unlocks a safe: the Secure Enclave.
That would be the same thing as storing the Private Key in a safe.
On the iPhone 4, the passcode probably IS used as a seed to regenerate the Private Key, and as such refusing to give it to the police is breaching the law.
On an iPhone with a Secure Enclave, the passcode is probably not used as a seed, because that would be weak as hell. Refusing to give it to the police is possibly not a breach of the law.
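To illustrate that "safe" model: below is a toy sketch of an attempt-limited key store that destroys the key after too many failures, which is why even a weak passcode can be useless to a brute-forcer. The class name and the 10-attempt limit are illustrative assumptions, not Apple's actual Secure Enclave behaviour or API:

```python
# Toy model of an attempt-limited "safe" holding the real decryption key.
import os

class ToySecureEnclave:
    MAX_ATTEMPTS = 10   # illustrative; mirrors the "~10 attempts" mentioned above

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._data_key = os.urandom(32)   # the real key; it never leaves the "safe"
        self._failures = 0

    def unlock(self, guess: str) -> bytes:
        if self._data_key is None:
            raise RuntimeError("key material destroyed; data is unrecoverable")
        if guess != self._passcode:
            self._failures += 1
            if self._failures >= self.MAX_ATTEMPTS:
                self._data_key = None     # wipe: even the correct passcode is now useless
            raise ValueError("wrong passcode")
        return self._data_key             # correct passcode: hand back the data key

enclave = ToySecureEnclave("1234")
for guess in range(10):                   # brute forcing burns through the attempt budget
    try:
        enclave.unlock(str(guess).zfill(4))
    except ValueError:
        pass
# enclave.unlock("1234") would now raise RuntimeError: the weak code no longer helps anyone.
```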
> That would be the same thing as storing the Private Key in a safe.
Same thing with LUKS. The password just unlocks the encrypted master key stored on the disk, which is then used to decrypt the actual data.
Not sure why this one layer of indirection would matter for the purposes of the law.
If you erase the LUKS header (via some tamper detection mechanism), then you will not be able to provide any means of decrypting the actual data, even if you give up the password. That may matter to the law, since nothing you hand over can ever yield the decrypted data.
But this same effect can be achieved with direct password->key transformation. Tamper detection can erase the data itself instead of the master key.
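A toy sketch of that indirection, assuming a LUKS-like layout where the password only unwraps a random master key kept in the header; this is illustrative only (XOR wrapping, not the real LUKS format or cryptography), but it shows why erasing the header makes the password worthless:

```python
# Toy illustration of password -> KEK -> master key -> data (not real LUKS crypto).
import hashlib
import os

def kdf(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
master_key = os.urandom(32)                 # the key that actually encrypts the disk
kek = kdf("hunter2", salt)                  # key-encryption key derived from the password
header = bytes(a ^ b for a, b in zip(master_key, kek))   # wrapped master key, stored on disk

# Normal unlock: the password regenerates the KEK, which unwraps the master key.
unwrapped = bytes(a ^ b for a, b in zip(header, kdf("hunter2", salt)))
assert unwrapped == master_key

# If tamper detection erases the header, knowing the password recovers nothing;
# as the parent comment notes, erasing the data itself achieves the same end result.
header = None
```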
Your example of 10^8 combinations, an 8-digit passcode on an iPhone 4, means a policeman would have to sit at a desk and try combinations for hundreds of years. This is likely to be deemed unbreakable protection by a court.
The document, on pages 11-12, goes into what may constitute cryptographic conventions.
It considers all recent iPhone and Android phones to be such. It considers all systems for unlocking a mobile phone to be such, as there is no other way to access the data on the phone, given normal technical knowledge and no specific software or hardware.
I was the first employee of HRT in London with my niche skills. The company never granted holidays to UK employees. The issue has escalated to impossible levels, with the company terminating me and openly threatening me and my family to cover it up.
There is no recourse. I am in the UK; the company is a billion-dollar company in the US. It's not playing by the same rules. The only option I have left is to make a public disclosure.
I found my first job in the UK thanks to Hacker News. This is my blog; you can see my history and comments going back half a decade. It's real.
P.S. Don't expect further comments from me. I am relying on British law to make an official public disclosure, and I don't think it allows me to engage in open discussion.
It's quite likely that one of the products takes over. Maybe one product had more traction with customers, or maybe one product is losing customers. Maybe the entire sales or development team will leave within a year, leaving one product in shambles. It's quite possible that one organization will be torn apart; who knows the terms of the acquisition.
Sometimes two products are there to stay - just ask BlueYonder, SAP, Korber or plenty of other large enterprise software vendors that have ended up with two products via acquisition (be it merchandising systems, ERPs or WMS products).
That's true, though it's more often the case with products that have a lot of overlap rather than products really competing in the same market. Perhaps Trello is a good example: it overlaps with Jira, but it doesn't really compete for the exact same customers in an either/or decision.
It's hard to find a good example of a merger of two products with very similar markets where both products thrived after the merger.
Rent in London is high, but low compared to other big international cities. My experience working for international companies is that a flat that goes for £2000 for me in London can go for $5000 for a coworker in NYC (maybe less now after the pandemic).
A factor of 7 is entirely believable for England (non-London) and most of the EU.
Personally I view it as a successful project for the client.
The client wanted something done but they only had limited time/budget to work on it. The developer looked into it and was able to get the project done. It's great.