Yes, this is definitely needed. There are two angles to investigate: checking the cryptography itself for mistakes, assuming good faith; and checking the packages for backdoors, if that good faith is in question. Given that the TrueCrypt developers are anonymous, both are required. I won't get much into the former, but I have thought a bit about the TrueCrypt Foundation; when I looked into it (not in great depth) it struck me as odd, so I gave some thought to where back doors would be located, if there were any.
If TrueCrypt is back-doored, the backdoors are likely only present in the binaries offered for download on truecrypt.org, not in the source code, where they would be more easily found. A cross-check of important routines might be informative. A back door would take one of two forms: either it'd smuggle a copy of the key somewhere, or it'd lower the key's entropy enough to be crackable. The former would be discovered by simple disk-space accounting, so it is probably not the strategy used. Reducing the key entropy would make any volume decryptable if it was first initialized by a backdoored copy of TrueCrypt, while retaining compatibility with non-backdoored copies; so the likely places to look are the key generation and the random number generator that feeds it.
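To make the entropy-reduction idea concrete, here is a hypothetical sketch (my own illustration, not anything taken from TrueCrypt) of a key generator whose 256-bit volume keys are expanded from a 32-bit seed. The keys look random and the volume format stays compatible, but anyone who knows the scheme has only about four billion candidates to brute-force:

    /* Hypothetical illustration only - not TrueCrypt code.
     * The 256-bit volume key is expanded deterministically from a 32-bit
     * seed, so an attacker who knows the scheme can brute-force all ~2^32
     * candidates, while the key still "looks" random to everyone else. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* xorshift32: a tiny PRNG standing in for whatever expansion function
     * a backdoor author might pick; the point is the 32-bit state. */
    static uint32_t xorshift32(uint32_t *state) {
        uint32_t x = *state;
        x ^= x << 13;
        x ^= x >> 17;
        x ^= x << 5;
        return *state = x;
    }

    static void backdoored_keygen(uint32_t seed, uint8_t key[32]) {
        uint32_t state = seed ? seed : 1;      /* xorshift state must be non-zero */
        for (int i = 0; i < 32; i += 4) {
            uint32_t r = xorshift32(&state);
            memcpy(key + i, &r, 4);            /* fill the key four bytes at a time */
        }
    }

    int main(void) {
        uint8_t key[32];
        backdoored_keygen(0xdeadbeef, key);    /* the seed might come from e.g. time() */
        for (int i = 0; i < 32; i++) printf("%02x", key[i]);
        printf("\n");                          /* looks random; only 2^32 possibilities */
        return 0;
    }

A volume keyed this way mounts normally in an unmodified client, which is exactly what would make this class of backdoor hard to spot from the outside.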
Also noteworthy: the download links all work by POSTing to /dl, then being redirected. The Windows download link for me went to http://www.truecrypt.org/download/transient/e2ec88b9b7dfb3a8..., and it's not clear what that big hash is doing there - other operating systems use a different URL scheme (without the /transient/10bytes component). Their web server might occasionally give different binaries to people it doesn't like. All the downloads are over http, not https (except for signatures); and their site responds to https in a very odd way, responding with a valid certificate but always redirecting to non-https.
> not in the source code, where they would be more easily found.
Why do you say that? Compiling the source with the same compiler and flags, plus diffing the binaries would quickly show where the differences lie, and if they're hostile. Any half-decent reverse engineer could do this.
That would stand out more than if the source itself was backdoored in a non-glaring way. Open source has taught us that nobody ever reads the source.
Yes, that is correct. I founded SourceDNA.com as a way of automating this kind of analysis. We match components found in binaries in order to identify unlicensed use of third-party code, as well as security patches.
Tools like bindiff have been around for years and take advantage of the fact that compilers don't randomize code generation. Instead, the callgraph and control-flow graphs largely reflect the structure of the original source code. Once you have leverage by exact-matching the parts of the binary that are nearly identical, you can build up and down the tree of nodes to find those that have more changes.
Crypto backdoors can be unbelievably subtle though. A single branch condition, a bit that is flipped, etc. can all lead to catastrophic failures. For example, a compiler optimization for dead code elimination led to some zeroization of key material being skipped. This kind of thing is extremely difficult to find and requires a careful understanding of the underlying code.
I agree with you that most differences can be found, but understanding the ramifications of those differences requires extremely careful analysis. A crypto flaw does not stand out from a mis-optimization.
The compiler knows what memset does. It also knows that stack variables have no use after the function returns. Therefore, the compiler knows there is no reason to write zeroes to this memory, because the program will never read those zeroes. Hence, the compiler will delete the call to memset.
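A minimal sketch of that pattern (an assumed example, not code from TrueCrypt): compiled at -O2, the plain memset below is a dead store and may be dropped, while routing the call through a volatile function pointer is one common way to keep the zeroization in place.

    /* Illustrative sketch of dead-store elimination - not TrueCrypt code. */
    #include <stdio.h>
    #include <string.h>

    /* memset reached through a volatile function pointer: the optimizer can
     * no longer assume the call has no observable effect, so it keeps it. */
    static void *(*const volatile secure_memset)(void *, int, size_t) = memset;

    static void do_crypto(void) {
        unsigned char key[32];
        for (int i = 0; i < 32; i++) key[i] = (unsigned char)(i * 37); /* stand-in for key derivation */
        printf("first key byte: %02x\n", key[0]);                      /* stand-in for using the key */

        memset(key, 0, sizeof key);        /* dead store: a compiler may delete this at -O2 */
        secure_memset(key, 0, sizeof key); /* survives; C11's optional Annex K adds memset_s for this */
    }

    int main(void) {
        do_crypto();
        return 0;
    }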
Oh I get it. You're saying the key is stored on the stack and then you can find it by inspecting memory if it hasn't been zeroed out. That's really interesting, what a great example of leaning on language implementation. I guess the correct way to write this is to malloc and free the key. Except, couldn't an attacker see the key anyway while it was live in memory, either on the stack or on the heap (or if it's not "heap", whatever you call the thing that malloc takes memory from)?
I'm not a specialist, but wouldn't a call to free simply release the memory without zeroing it out? You could then probably still find the key by inspecting memory? Again, I'm not a good C programmer, I would like to know too :)
Yeah, you'd still need to call memset before free. Compilers have a much harder time convincing themselves of things about pointers, so it should survive dead code elimination.
Oh I see. That's quite a fiasco. I guess I'm mostly curious about a minimal example of useful code that a dead code eliminator will incorrectly optimize.
In the Debian fiasco it was mostly a case of "manual code elimination" [1].
However, there are plenty of examples of compilers aggressively removing code that causes undefined behaviour. Basically, when the compiler encounters UB, it can do whatever it wants with the code that triggers it, which includes removing it. Compilers exploit this fact a lot because UB happens a lot in "normal" code. See here [2] for examples and explanations; I also can't help but mention John Regehr's blog [3] if you're interested in compilers, security, testing, and safety.
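A classic minimal example of the kind of UB-driven deletion being described (an illustration, not TrueCrypt code): signed overflow is undefined, so the compiler may assume the overflow check can never fire and delete it outright.

    /* Signed overflow is undefined behaviour, so a compiler is allowed to
     * assume x + 100 never wraps and to remove the "overflow" branch. */
    #include <limits.h>
    #include <stdio.h>

    static int add_checked(int x) {
        if (x + 100 < x) {        /* intended overflow check... */
            puts("overflow!");    /* ...which gcc/clang at -O2 may delete entirely */
            return 0;
        }
        return x + 100;
    }

    int main(void) {
        printf("%d\n", add_checked(INT_MAX));  /* UB: the result depends on optimization level */
        return 0;
    }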
That doesn't seem like a good fit. It wasn't a compiler problem, it wasn't dead code, and the problem wasn't skipping zeroing, it was too much zeroing!
That would require a deterministic/reproducible build process. Such things tend to need the same host OS, library versions, dependency versions, etc., which AFAIK TrueCrypt hasn't documented anywhere. See the efforts of the Tor Project, Debian, et al. to start doing reproducible builds - it's not easy. And that's before you get to things that inherently produce non-deterministic builds - embedding timestamps is quite common, for example.
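A trivial illustration of the timestamp problem: compile the snippet below twice and the binaries differ even though the source is identical, because the build time gets baked into the output.

    /* Minimal non-reproducible build: __DATE__ and __TIME__ are expanded at
     * compile time, so two builds of identical source produce different
     * binaries. Reproducible-build efforts have to strip or pin exactly
     * this kind of embedded metadata. */
    #include <stdio.h>

    int main(void) {
        printf("built on %s at %s\n", __DATE__, __TIME__);
        return 0;
    }

Compile it twice a minute apart and cmp the two executables; they differ even with the same compiler and flags.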
Umm, I think the person above is actually correct, though I'm by no means an expert. Otherwise, why would it take such a huge effort to achieve a deterministic build process, as Tor seems to have recently done?
> Given that the TrueCrypt developers are anonymous...
I hadn't realized this was the case. That's pretty interesting.
I'm actually surprised that more pro-crypto, pro-privacy types haven't disappeared behind internet pseudonyms. Certainly if I were running a project that I hoped would be useful for dissidents in an oppressive state, I'd do my best to run it as anonymously as I could manage.
There might be personal risks and project risks in revealing yourself, but very little to gain. Knowing the names of the developers might give people warm fuzzies but it probably wouldn't mean much as far as the true security of the software goes. If the name is "John Doe," that tells you nothing, and if it's "Phil Zimmermann" that still doesn't prove that the NSA hasn't forced him to compromise the stuff.
In the particular case of TrueCrypt, which accepts donations through PayPal and credit cards, the "anonymity" of the project members is likely to be pretty thin. I imagine an American prosecutor could have them found easily enough.
If you compiled the source and obtained exactly the same binary as is offered for download, would that prove the binary was unmolested? I'm guessing there are many reasons why the binary might turn out to be different for non-suspicious reasons (different compiler versions/flags/environment...?), in which case would it be hard to enumerate all those reasons and rule them all out as the cause of a difference?
Yes - they acknowledge this in the Readme.txt for the source:
"At the end of each official .exe and .sys file, there are embedded digital signatures and all related certificates (i.e. all certificates in the relevant certification chain, such as the certification authority certificates, CA-MS cross-certificate, and the TrueCrypt Foundation certificate). Keep this in mind if you compile TrueCrypt and compare your binaries with the official binaries. If your binaries are unsigned, the sizes of the official binaries will usually be approximately 10 KB greater than sizes of your binaries (there may be further differences if you use a different version of the compiler, or if you install a different or no service pack for Visual Studio, or different hotfixes for it, or if you use different versions of the required SDKs)."
If you have verified the source code, the build process is sound, and you produce the exact same binary as the distribution, then yes. But you're right that it's unlikely you'll produce _exactly_ the same binary. If you don't trust the distributed binary, the best way to be sure is to compile it yourself.
There have been several important legal cases that would have significantly benefited from cracking the defendant's TrueCrypt volumes. To my knowledge, these attempts have all been unsuccessful.
If the NSA has a backdoor, then they're not sharing it with other law enforcement agencies, and keeping it a better secret than everything else they were up to.
I would assume the NSA has a variety of backdoors they do not share with other law enforcement agencies.
It was actually surprising to me that they shared ANY of them.
This is how intelligence agencies work -- they want to keep it a secret what they can crack, so the adversary will keep on using it, and they can keep on cracking it. For intelligence purposes.
Intelligence is different from law enforcement. Or at least, it used to be. In the US, theoretically the constitution means you can't use secret evidence against someone in court. But of course you can use secretly gathered intelligence for political/military/geopolitical purposes. The merging of intelligence and law enforcement is not unrelated to the push to keep more and more legal proceedings secret -- a significant threat to the constitution.
So what would you trust your life to? If you are a dissident, what products and procedures would security professionals suggest?
They'd be hard pressed to recommend any off-the-shelf products or practices right now. Because PKI is in shreds, nobody trusts TrueCrypt, and Tor users have been rooted left, right, and centre recently.
With my life? If a security vulnerability would be life threatening, I would not trust anything on the internet, I'd keep it off the internet -- and maybe even keep it off computers entirely. (Although if they can get physical access to my non-internet computer, they can get physical access to hard copy notes too, of course).
Yeah, that's hella inconvenient. But not quite as inconvenient as being murdered or disappeared to a torture prison.
So how exactly would you communicate, share files and information if you were organizing an uprising for example?
You can't always physically transport those files, and sometimes it may be more dangerous to do so (if you are a wanted person, for example). You also have issues if those files are transported across a border (Glenn Greenwald's partner being held in the UK). For speed and ease, it would have to be via the internet somehow. For higher security, it would have to be non-network-connected computers and USB drives. For even higher security it would be no computer at all, but then I would have to memorise everything and I don't have a photographic memory :(
I'm not sure what I would do in this case either tbh.
They can always use the backdoor themselves and use the decrypted information to give local law enforcement hints about where to look for evidence, via parallel construction.
Like most mortals, I depend on Bruce Schneier: "Since I started working with Snowden's documents, I have been using GPG, Silent Circle, Tails, OTR, TrueCrypt, BleachBit, and a few other things I'm not going to write about. There's an undocumented encryption feature in my Password Safe program from the command line; I've been using that as well." http://www.theguardian.com/world/2013/sep/05/nsa-how-to-rema...
Matthew Green is the real deal. Truecrypt is extremely popular and not already well assessed. If you're wondering whether this could be helpful for ensuring end-user privacy, the answer is yes.
I couldn't make any pledge input work (just kept getting errors) but I'm matching Matthew Green's own pledge.
They are asking for money, not grandstanding. I believe in what they're trying to do and have a lot of confidence in them as professionals, so I have no trouble doing what they asked instead of what I might rather do.
If you want to offer talent, I won't accuse you of grandstanding. For my part, saying "we could help do the assessment" feels like grandstanding. They asked for money, and this seems like a worthwhile cause to donate to.
I'm a little confused. This link takes me to a fund to "help us find stephen martin's killer". Which is itself a duplicate of another fund on the same site. Searching google for: truecrypt audit fundfill ... provides a link that is titled "Fund: A public TrueCrypt Audit - Fundfill", but takes me to the same Stephen Martin page as well.
Is fundfill broken? (and if so, should I trust it with money?) Or is there a secret decoder ring that I'm missing?
[edit]
On further digging, I'm going to say "no, I should not trust it with my money". The 'funds' are a mix of 2-3 year old and current requests, mostly people asking for money in a kickstarter-like fashion, as opposed to the bounty system that appears to be the intent.
The site has various issues, and while their twitter account is active the whole thing just has an air of not-something-I'd-trust about it.
/opinion
[edit 2]
Come to think of it, wouldn't kickstarter or something similar be better? Get an estimate for the work and start a fund to get it done?
I'm the owner of fundfill, and yes, there is an issue with the site. We're working feverishly to get the issue resolved. It is preventing money from actually appearing in the fund. I'd ask anyone interested to register with the site, and I'll email everyone once this is working. We're in a pre-startup phase, so I can only beg your patience with this bug. Would a Bitcoin account satisfy any lingering doubts? If so, I'll set it up as soon as possible. Again, my apologies for the problems with the site - this is the most traffic we've had. You can contact me at jbalfantz [insert symbol here] fundfill. Or twitter @joebalfantz / @fundfill
There is nothing special about a TrueCrypt-formatted encrypted volume. The only interesting thing is the format of the header, which stores the properties used to create the volume and is necessary to open it.
cryptsetup is a front end to dm-crypt, an infrastructure in the Linux kernel that deals with block device encryption. cryptsetup just parses the TrueCrypt header for volume properties; the hard crypto work is done by the kernel.
tcplay does the same thing: it just parses the TrueCrypt volume header, and the heavy lifting is done by the Linux kernel on Linux and the BSD kernel on BSD systems.
In both projects, the crypto is done either by crypto routines in the kernel or by libgcrypt or OpenSSL.
zuluCrypt is just a front end to the two projects above.
None of these projects do the crypto themselves.
It should be possible, and to some "trivial", for Windows or OS X tools that deal with block device encryption to support the TrueCrypt format. I think that would be a better use of the resources.
It's not clear what this link is, but the page requests using the link above for public sharing. It might possibly be the source of the errors people are having when donating...
This problem was fixed yesterday with the site. Please try the link again. We're in a pre-startup phase and that was the most traffic we had. However, we were able to fix these issues within an hour of identifying each root cause (the different metadata on the fund and the pledging issue). Details of the problems can be found here: jbalfantz.wordpress.com/2013/10/10/what-happens-when-you-break-your-sites-daily-usage-record-by-10x/ I hope that anyone interested in opening up the curtain in front of TrueCrypt will visit the site again - I offer my apologies to everyone who had a lousy experience.
TrueCrypt is an awesome product and the value-add from a thorough, independent audit would be immense. I use it in conjunction with Dropbox to add my own security layer. Dropbox only uploads the delta changes in a truecrypt container even though it's encrypted.
I don't have to worry about bugs like these : http://techcrunch.com/2011/06/20/dropbox-security-bug-made-p... which can exist with 2FA too.
I'd love to have a heightened sense of trust in TC if it's independently reviewed.
No, it's not. Disk encryption products like TrueCrypt do not have the properties you expect from ordinary file encryption. Only a subset of blocks are modified. It's a tradeoff, but it's the only way to make disk encryption practical.
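A toy model of why that is (illustration only: XOR with a per-sector keystream stands in for a real sector cipher such as AES-XTS). Each sector is encrypted independently, so editing one sector changes only that sector's ciphertext, which is exactly the small delta a sync tool like Dropbox sees.

    /* Toy model only - a per-sector XOR keystream standing in for AES-XTS.
     * The structural point is the same: each 512-byte sector is encrypted
     * independently, so changing one sector changes only its ciphertext. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define SECTOR 512
    #define NSECT  4

    static uint32_t prng(uint32_t *s) {          /* tiny keystream generator */
        *s ^= *s << 13; *s ^= *s >> 17; *s ^= *s << 5;
        return *s;
    }

    static void encrypt_sector(const uint8_t *in, uint8_t *out,
                               uint32_t key, uint64_t sector_no) {
        uint32_t state = key ^ (uint32_t)(sector_no * 0x9e3779b9u) ^ 1u;
        for (int i = 0; i < SECTOR; i++)
            out[i] = in[i] ^ (uint8_t)prng(&state);
    }

    int main(void) {
        uint8_t plain[NSECT][SECTOR] = {{0}}, c1[NSECT][SECTOR], c2[NSECT][SECTOR];

        for (int s = 0; s < NSECT; s++) encrypt_sector(plain[s], c1[s], 0x1234, s);
        plain[2][0] ^= 0xff;                     /* modify one byte in sector 2 */
        for (int s = 0; s < NSECT; s++) encrypt_sector(plain[s], c2[s], 0x1234, s);

        for (int s = 0; s < NSECT; s++)          /* only sector 2's ciphertext differs */
            printf("sector %d: %s\n", s, memcmp(c1[s], c2[s], SECTOR) ? "changed" : "same");
        return 0;
    }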
Just a simple question, leaving aside everything else: why, in 2013, is there no public repository for TrueCrypt? How hard is it to have a public repository, or at least a detailed change log of the changes they make between releases? How hard, really?
To anyone from Fundfill.com: I've tried pledging and authenticating via FB connect in both FF and Chrome with errors rendered with every attempt. The login screen hangs (stays visible) and yet the upper right says I'm logged in.
In either case, I'm unable to pledge using FB connect. Please advise.
Would you please contact me via twitter (@joebalfantz or @fundfill) or email? I'd like to discuss your issues and your potential pledge. Kenn and Matt have been working very hard to get the draft for the TrueCrypt proposal ready, and I've been tracking down issues for the site and handling the website. We pushed another version 10 minutes ago that fixed some of the pledging issues, so if you're able to try again, please do so.
d0ne, we are experiencing issues, and I'm trying to get them resolved as soon as possible. Please note my message above. If you've registered with the site, we have your email, and I'll reach out to everyone once the issues are corrected. I'll post an update as soon as it makes sense to do so - for now you can see the fund and add money into your account, but pledging to the TrueCrypt fund is broken. I'll update twitter as well - @joebalfantz, @fundfill
UPDATE: All the major bugs found today have been fixed. Pledging to this fund should not be an issue anymore. We encourage anyone interested in helping move this project along to pledge $50 or $100. We have one pledge of $500 and another one on the way.
Only half? I never found another logical explanation for how it's sustained. But, as we know from the news, the FBI does not have the keys to its back door, even for high-profile cases. So it's probably really for national-security-level stuff... or held by some other nation.
Good point. What if we find out that it's someone like the Syrian Electronic Army that developed this and has now backdoored everyone! Could we have reasonably expected an announcement from the government telling us as such?
OK, what the hell? The OP link is now redirecting to something about finding someone's killer. I'm not opposed to that cause, but how exactly did that link get jacked? Was it on FundFill's side? That entire site looks like a spam site with very few entries: http://www.fundfill.com/funds
Not sure why this site would be used instead of IndieGogo
My apologies. The killer fund issue has been dealt with, and we're tackling another issue that has been preventing some pledges from going through. I understand the rationale for looking at other sites that do crowdfunding. Fundfill is specifically focused on rewards and bounties, and therefore doesn't have a particular person who will automatically win the money, as Kickstarter and Indiegogo would. This is designed to encourage the auditing of TrueCrypt. All problems with this have been on Fundfill's side, and the lack of previous funds is because we are currently in a pre-startup phase. I understand the concern about using a site that doesn't have thousands of users on a regular basis, but I ask your patience, as today's experience is already improving the site for future use.
Sorry for the technical glitches. I'm too cynical, I guess: my first thought was that someone took advantage of the traffic spike and unilaterally directed it to a worthy cause.
Trust me, I understand the cynicism, and we deserve every critical comment we received here. The good news is we can only get better. :) After the experience of Arturas and the iphone touchid fake $10k pledge, we all have to be on alert when money and publicity are involved.
We recently switched over from Paypal to Stripe, due to Paypal's increasingly destructive behavior wrt integrating into a website. Thanks for pointing out the Paypal text, I was able to fix that this morning. Also, we added SSL this morning. There's an insecure image on the page right now giving Chrome browsers a problem but we're addressing that atm. egsec, I'd like you to come give the site another try. If not, please contact me on twitter - @joebalfantz - and we can discuss other, and potentially more transparent, means of pledging.
I'm not sure if that's what he meant. A 'focus' on UX is always a good idea. People find encryption scary, and a project that makes crypto easy and accessible is important for its adoption. Of course security is also important; I don't think anyone is suggesting weaker crypto is a good idea.
TrueCrypt's UI/UX sucks at the moment. I'd love an alternative. I just discovered the CLI interface mentioned in another thread, which I intend to use over their GUI going forward.
The initiative is honorable, but I think this was a really bad choice of website to host the pledge. The copyright message on the footer is from last year, which may explain why many things seem broken or unfinished, as well as the errors other people here are reporting to be getting.
I like the idea of auditing TrueCrypt, but I don't see the rationale. Spend the time having everyone transition to GPG (or similar.) TrueCrypt will never be secure.
Interesting, from the document: "As remarked in this table the Windows version of TrueCrypt 7.0a deviates from the Linux version in that it fills the last 65024 bytes of the header with random values whereas the Linux version fills this with encrypted zero bytes. From the point of view of a security analysis the behavior of the Windows version is problematic. By an analysis of the decrypted header data it can't be distinguished whether these are indeed random values or a second encryption of the master and XTS key with a back door password. From the analysis of the source code we could preclude that this is a back door. For the readability of the source code this duplication of code which does the same thing in slightly different ways was however a great impediment. It certainly must also hamper the maintainability of the code."
It is well-known that the CRC-32 algorithm doesn't meet cryptographic standards of a hash algorithm. Therefore cryptographic security can't be expected of the TrueCrypt keyfile algorithm. In fact, we discovered an attack by which any file can be manipulated so that it has no effect at all when added as a keyfile. For this purpose we only need to append between 1524 and 2804 bytes to the file. There are many applications which don't mind the additional bytes at the end of such a manipulated file and present its contents just as it was without those bytes. This is especially true for image viewers and image files (for example .jpg and .png files) and for PDF readers (such as the Adobe Acrobat Reader or Evince) and PDF documents.
...
There were three major weaknesses in the TrueCrypt keyfile algorithm which facilitated the attack. The first weakness was the application of the cryptographically insecure CRC-32 algorithm. Its input can easily be crafted to yield any desired output. The second weakness was the linearity of the operations by which the pool bytes were changed, that is the additions. This made the attack a simple application of linear algebra. The third weakness was the local nature of changes to the pool. Every byte of the keyfile only changes four bytes of the pool. This enabled us to do the attack successively, making one byte of the pool after the other zero.
Holy crap. Any one of those weaknesses would be ridiculous from a security standpoint. The fact that all three exist is evidence of incompetence, at the very least. (Whether the incompetence is intentional or not is an open question.)
Not using CRC-32 is pretty much cryptography 101. Avoiding linearity is "preventing cryptanalysis 101". Avoiding local changes to the output (i.e. every byte of the keyfile should change the output completely) is called the avalanche effect and is, similarly, an incredibly basic requirement of cryptography. http://en.wikipedia.org/wiki/Avalanche_effect
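For the curious, here is a simplified model of the keyfile mixing as the paper describes it (written from that description, not from TrueCrypt's actual source): a running CRC-32 over the keyfile whose four output bytes are added to four consecutive positions of a 64-byte pool. Appending one byte to a keyfile therefore perturbs only four pool bytes, and the update is purely additive, which is what reduced the attack to linear algebra.

    /* Simplified model of the keyfile pool mixing described in the paper,
     * not TrueCrypt's actual code: a running CRC-32 whose four bytes are
     * added to four consecutive pool positions per input byte. */
    #include <stdint.h>
    #include <stdio.h>

    #define POOL 64

    static uint32_t crc32_update(uint32_t crc, uint8_t byte) {  /* standard bit-wise CRC-32 */
        crc ^= byte;
        for (int i = 0; i < 8; i++)
            crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1)));
        return crc;
    }

    static void mix_keyfile(const uint8_t *data, size_t len, uint8_t pool[POOL]) {
        uint32_t crc = 0xFFFFFFFFu;
        size_t cursor = 0;
        for (size_t i = 0; i < len; i++) {
            crc = crc32_update(crc, data[i]);
            for (int b = 3; b >= 0; b--)                             /* linear, local update: */
                pool[cursor++ % POOL] += (uint8_t)(crc >> (8 * b));  /* 4 pool bytes per input byte */
        }
    }

    int main(void) {
        uint8_t keyfile[100], pool_a[POOL] = {0}, pool_b[POOL] = {0};
        for (int i = 0; i < 100; i++) keyfile[i] = (uint8_t)(i * 11 + 3);  /* arbitrary test data */

        mix_keyfile(keyfile, 99,  pool_a);   /* first 99 bytes */
        mix_keyfile(keyfile, 100, pool_b);   /* same prefix plus one extra byte */

        int changed = 0;
        for (int i = 0; i < POOL; i++) changed += (pool_a[i] != pool_b[i]);
        printf("appending one keyfile byte changed %d of %d pool bytes\n", changed, POOL);
        return 0;
    }

Running it shows that the extra byte touched at most four of the 64 pool bytes, which is exactly the locality the attack exploits, one pool byte at a time.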
Here is TrueCrypt's response to the attack:
“Hello,

First, thank you for reviewing our software–we really welcome and appreciate all independent reviews. It should be noted that when we designed the keyfile processing algorithm, we had been very well aware of the properties of the CRC-32 algorithm that you mentioned in the document.

Below is our response to the attack you reported to us:

No matter what keyfile processing algorithm is used, if the attacker can modify the content of your keyfile before you create your volume, he can reduce the entropy/strength of the keyfile. Just as you cannot allow an attacker to modify or prepare your password when you create a volume, you cannot allow an attacker to modify or prepare your keyfile before you create your volume, either. Otherwise, if you do that, you cannot reasonably expect to have any security.

It is a basic security requirement that cryptographic keys (whether passwords, keyfiles, or master keys) must be secret and unknown to attackers. Your attack violates this requirement and is therefore invalid/bogus.

This is valid whether you use CRC-32 or HMAC-SHA-512, whether you use the TrueCrypt keyfile processing algorithm or something different.

Thank you again for taking the time to review our software. We hope that the mistakes will be corrected before the review is published (publishing an invalid attack would be misleading).

Sincerely,
David

PS - Even if the attack was valid (it is not) and a cryptographically secure hash (such as HMAC-SHA-512) was used instead of CRC-32 (your suggested fix), the attacker would still be able to 'crack' your keyfile by a comparatively short brute force attack (it would only be slightly slower, as the attacker would merely have to find the correct keyfile among the thousands or millions of keyfiles he crafted).”
... and here is the paper's response to their response:
This response is mistaken. It fails to differentiate between the secrecy of a password or key and the inability of an attacker to modify the password or key. The secrecy of the password and keyfile is indeed a basic prerequisite for the security of a TrueCrypt volume as it is in any other cryptographic application. However, it is quite possible that an attacker is able to modify a keyfile to a certain extent while that keyfile at the same time remains perfectly secret and unknown to him. The point to be considered here is that cryptography has means to provide security even in such a weird situation.

As an example for such a situation consider an attacker who distributes an image editing software. He might add our attack algorithm to his software so that it secretly manipulates any image it processes to have no effect as a TrueCrypt keyfile. Our algorithm is even small enough to run on embedded systems. Therefore, even a digital camera might process the pictures it takes and output manipulated picture files. The pictures would nevertheless show what the user wanted and would remain unknown to the vendor of the camera or the image editing software. So they may be kept secret although they have been modified by an attacker.

In that situation a keyfile algorithm based on HMAC-SHA-512 would provide perfect security while the TrueCrypt keyfile algorithm fails completely. A brute force attack would be impossible, as the attacker does not have to find the correct keyfile among millions of keyfiles he crafted but among a practically infinite number of possible pictures which might have been taken with his camera or processed with his image editing software.
My general rule of thumb is: don't trust a cryptography tool that has a catchy name. TrueCrypt has always creeped me out for that reason. Or maybe it's the idea of a "downloadable" filesystem (ok it's not technically a filesystem) driver that thinks it can run on more than one operating system and still do a good job of it.
I've never seen what's wrong with dm-crypt. It just works, and is mostly transparent.
Cryptanalysis IMO is an ongoing process and it needs to be done for every version. The funding will not last. True, we need the first analysis, but who is there to fund the 2nd, 3rd, 4th, etc.?
[edit]
I am not shooting it down. I just think we have to be explicit that this is not a one-time thing. I do think that once it is verified and studied, and people like it, the contributions will continue to come. We might be able to build new products out of TrueCrypt.
That's not the goal of the project. The bar isn't "spend this money and we can assure Truecrypt is strong in perpetuity". It is instead "Truecrypt is very popular and nobody knows the provenance or trustworthiness of any of its low-level crypto design, and that needs to be fixed".
The project makes more sense when you realize how untrustworthy that code might be today.
Yeah. I am not shooting it down. I agree that if this is good software and people like it, people will probably contribute. I just think the process can't stop after the initial audit.
In theory, it doesn't need to be. You could formalize the definition of the TrueCrypt cryptographic protocol in Cryptol [1] and then have a checker run as part of the unit tests that verifies the source code is still a faithful implementation of the protocol...
I am not familiar with this, but an implementation flaw or bug could be an intentional backdoor. Can we automate the checking process? That probably still requires humans to audit the source code.
How big is TrueCrypt, and how many contributions does it get? Certainly the Linux kernel is so big and gets so many patches a day that even changeset analysis can be hard.
After reading through the comments in this thread, it seems as if TrueCrypt cannot be trusted at this time. So what options do we have across different platforms that are open source, regularly and publicly audited, and whose download binaries can be verified as backdoor-free?
I should note that I'm on Windows as my home machine, so I'm personally most interested in that.