I really wish Charles generated a self-signed cert on installation, rather than using a root that you download from them.
I suspect that a lot of people in the iOS dev community (where Charles seems ubiquitous) walk around with the Charles root on their phone, ripe for an easy malicious MITM against them.
I'm also surprised more iOS and Android apps don't bundle and pin their certificates - it's still an obscurity measure since, worst comes to worst, the user can root / jailbreak the device and attach a debugger or watch the network stack, but it keeps any random user with Charles (or random malicious attacker with a stolen root cert) from reversing private APIs.
Relevant Quote:
"Every Fiddler root certificate is uniquely generated, per user, per machine. No two Fiddler installations have the same root certificate. The only way for a Fiddler user to be “spoofed” by a bad guy is if that bad guy already is running code inside the user’s account (which means you’d already be pwned anyway)."
Every other product I can think of (Proxy.app, MITMproxy, and so on) also uses self-signed certificates generated at startup / configuration time rather than a shared root - I think that design is unique to Charles.
Charles does allow you to use your own certificate, but it's not the default user flow.
I see the ease-of-use case for the way Charles does it, but the shared-certificate approach is so insecure (you're basically handing the keys to all of your unpinned SSL traffic to anyone on the Internet) that I wish it would go away.
I also really like Fiddler, and the warnings it provides are excellent. Sadly, it doesn't really support OS X yet, so many iOS developers can't use it.
I also found it hard to set up Charles to use a custom cert - I definitely had to read the instructions on their site instead of figuring it out on my own.
Sure it's a little bit extra work but not much. If someone is reversing your app chances are they are using a jailbroken device anyways to extract the unencrypted IPA or to attach gdb to your app.
I've used mitmproxy for this. It's great fun to MITM every app on your phone, you'll find some really interesting stuff. I happened across a WSDL in the British Airways app that appeared to enable booking flights as if done by a member of staff back of house. It might have been possible to book flights and not pay for them - obviously I didn't go so far as to test this. But like I say, interesting stuff.
Tip: Some apps do SSL pinning so the handshake will fail with the cert that your proxy provides. You can disable any kind of SSL cert checks on a jailbroken iOS device with SSL kill switch (https://github.com/iSECPartners/ios-ssl-kill-switch)
Never seen MITMproxy - how does it compare to Charles? I've found super interesting stuff - pretty much invariably find that (a) there's cool stuff you can do and (b) the app's developers haven't gone to significant lengths to stop you doing it.
mitmproxy is much more flexible, but has a steeper learning curve - very command-line-y versus Charles. I'd use Charles for casually looking at apps, and mitmproxy if you need more advanced stuff.
There is zero incentive for me to do that and whole lot of incentive to not do it. I don't fancy the chances of trumped up charges under the Computer Misuse Act if they take it the wrong way.
Just taken a look. I agree there's some stuff in there they probably don't want you or I to know, but no sign of ability to book from what I've discovered.
Yep, Charles is a great proxy. Fiddler for Windows is a great free option as well.
Warning: the author of this blog very nonchalantly instructs readers to install the Charles certificate. If readers don't know what this does, it can be quite dangerous. Next time your device connects to a Wi-Fi network that you don't control, you could very well be going through somebody else's proxy and have all of your HTTPS traffic sniffed!
Better to sign your own certificate and use that instead! Or, at least uninstall the Charles cert from your device after you've had your fun sniffing traffic on your own network.
no problemo! I enjoyed the post - I do this kind of thing daily for my job, which just so happens to require that I "understand" how many private APIs work ;)
I would add, sometimes the "private API" that you would like to debug is the one that you are creating. I have found Charles to be essential whenever I'm asked to enable CORS so that browsers can make cross-domain Ajax calls to the server where I'm creating software. For complex requests (and any PUT or DELETE) the browsers will do a "preflight" OPTIONS request to see if the request is allowed. Strangely enough, the CORS spec actually encourages browser-makers to hide (from the JavaScript client) the OPTIONS request. Firefox hides the OPTIONS request completely - and so this is one of the few times that Firebug failed me. Firebug never sees the request, so it is unable to tell me about it.

Getting CORS right usually means seeing what is in the OPTIONS request and what your own response is (I mean, the response of the server software that you are writing), and I found Charles extremely useful for that bit of debugging. (cURL will also fail you in this case, as cURL is not limited the way Ajax requests are limited - the cURL request will always work, so it won't tell you why your Ajax call is failing.)
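To make the preflight concrete, here's a rough sketch (in Python, with a made-up origin whitelist and policy - adapt to your own server framework) of what a server needs to send back to that OPTIONS request:

```python
def preflight_response(request_headers: dict) -> tuple[int, dict]:
    """Build a response to a CORS preflight OPTIONS request.

    The origin whitelist and allowed methods here are illustrative,
    not a recommendation for any particular API.
    """
    allowed_origins = {"https://app.example.com"}  # hypothetical
    origin = request_headers.get("Origin", "")
    if origin not in allowed_origins:
        # Browser will refuse to make the real cross-origin request.
        return 403, {}
    return 204, {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS",
        # Echo back whatever headers the browser asked permission for.
        "Access-Control-Allow-Headers":
            request_headers.get("Access-Control-Request-Headers", ""),
        "Access-Control-Max-Age": "86400",  # cache the preflight for a day
    }
```

If the 403 branch fires, the browser silently refuses the real request - which is exactly the failure mode that's so hard to see without a proxy like Charles.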
If you are building an app and want to protect against this kind of thing, you can use SSL pinning. This means that you hard-code your API server's public SSL key into your app and explicitly reject any other public keys, even if they are installed or trusted by the OS. Incidentally, the Twitter iOS app does this — you can't observe its traffic via Charles or other snoopers.
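A minimal sketch of what pinning looks like on the client side, in Python for illustration (the pin would be a fingerprint you computed from your own cert ahead of time):

```python
import hashlib
import socket
import ssl

def cert_fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_bytes).hexdigest()

def connect_pinned(host: str, port: int, pinned_sha256: str) -> ssl.SSLSocket:
    """Open a TLS connection, then refuse it unless the server's leaf
    certificate matches the fingerprint shipped with the app. A proxy
    like Charles presents its own certificate, so the pin won't match
    even though the OS considers the cert "trusted"."""
    ctx = ssl.create_default_context()
    sock = ctx.wrap_socket(socket.create_connection((host, port)),
                           server_hostname=host)
    der = sock.getpeercert(binary_form=True)
    if cert_fingerprint(der) != pinned_sha256.lower():
        sock.close()
        raise ssl.SSLError("certificate pin mismatch - possible MITM")
    return sock
```

Pinning the whole leaf cert like this is the bluntest form; pinning the public key, or an in-house CA as other commenters suggest, survives reissuing the server cert.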
Also, you're totally screwed if you need to reissue your SSL cert because of a security problem (think Heartbleed). You'd have to reissue the cert, wait for Apple to approve the update, then hope that a significant proportion of your users actually update your app.
What I do is I generate an inhouse CA for uses like these and hardcode its certificate into the clients: the key used by the API server can then be changed whenever necessary, just as if it were signed by a commercial CA.
Would you be able to expand on how you generate an in-house CA? From Googling, I don't think you're becoming an intermediate CA, as that seems extremely costly! I'm in the situation where I have a pinned cert in my app and will have to address its expiration in the future!
Since you are hard-coding the CA, it doesn't have to be an actual intermediate CA that is trusted by anyone except for your app (that has it hard-coded.)
But as people say, they could easily edit your binary to change the CA, or disable the CA check entirely, so, like any DRM system, you can't keep the protocol secret.
It can be done using openssl (the documentation is a bit opaque but it's not hard to do). As Robin_Message says, I'm not becoming an intermediate CA — I'm generating my own root CA, which no one except me and my apps trust. In turn, my apps contain only my own CA as a trust root. There really isn't an advantage to using a commercial CA here, beyond the fact that their web portal is probably easier to use than the openssl command line.
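For reference, the openssl incantations look roughly like this (all file names and subject strings here are just examples):

```shell
# 1. Generate the in-house root CA: a key plus a self-signed cert (~10 years).
openssl genrsa -out ca.key 2048
openssl req -x509 -new -key ca.key -sha256 -days 3650 \
    -subj "/CN=Example In-House CA" -out ca.pem

# 2. Generate the API server's key and a certificate signing request.
openssl genrsa -out server.key 2048
openssl req -new -key server.key -subj "/CN=api.example.com" -out server.csr

# 3. Sign the server cert with the in-house CA.
openssl x509 -req -in server.csr -CA ca.pem -CAkey ca.key \
    -CAcreateserial -sha256 -days 365 -out server.pem
```

The app ships ca.pem as its only trust root; server.key / server.pem can then be rotated at will by re-running steps 2-3, with no app update needed.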
I typically use Fiddler (Windows) for this. Dig the article. Very clear steps on how to get setup and start playing around.
I remember a similar article about how the author intercepted the API requests for CandyCrush and was able to give himself lives and whatnot. Pretty neat.
I sent a friend of mine about 2000 Yo's (Charles will show you that the Yo app uses Parse...very simple API requests) with Charles's request-replay feature - quite easy to do and oh so funny...or at least I thought so.
Wow, didn't realize. That said, its functionality is very limited, requires an API key, requires people to subscribe to you, and likely has rate limits - none of which are problems if you just pose as the Yo app and use private APIs.
Thanks! I think this is a super interesting thing to do - there's tonnes of great stuff to discover - and something that's not very well documented online. I wanted to make it easy for anyone to do this kind of thing.
I like Charles and have been using it for this exact purpose for a while now. I'm still wondering what's the best approach from a developer perspective to get around that and keep a private API private. Even if you build a different key for each request on Android you can work your way through obfuscated code and rebuild the logic. If the app requires the user to login running your own OAuth server can be a solution but are there any easier solutions?
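One concrete version of the per-request-key idea is HMAC request signing - here's a hedged Python sketch (the secret, paths, and header names are all made up). Note the secret still ships inside the binary and can be extracted, so this only raises the bar, exactly as described above:

```python
import hashlib
import hmac
import time

# Hypothetical shared secret embedded in the app - extractable in principle.
SECRET = b"hypothetical-shared-secret"

def sign_request(method: str, path: str, body: bytes,
                 secret: bytes = SECRET) -> dict:
    """Produce signature headers for one request; the server recomputes
    the HMAC over the same fields and compares."""
    ts = str(int(time.time()))
    msg = b"\n".join([method.encode(), path.encode(), ts.encode(), body])
    sig = hmac.new(secret, msg, hashlib.sha256).hexdigest()
    return {"X-Timestamp": ts, "X-Signature": sig}

def verify_request(method: str, path: str, body: bytes, headers: dict,
                   secret: bytes = SECRET, max_skew: int = 300) -> bool:
    """Server side: reject stale timestamps (replay window) and any
    signature that doesn't match; compare_digest avoids timing leaks."""
    ts = headers.get("X-Timestamp", "0")
    if abs(time.time() - int(ts)) > max_skew:
        return False
    msg = b"\n".join([method.encode(), path.encode(), ts.encode(), body])
    expected = hmac.new(secret, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers.get("X-Signature", ""))
```

Anyone who decompiles the app and recovers the secret can sign their own requests, which is why the replies below conclude this is ultimately a cat-and-mouse game.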
The short answer is "no" - you're always playing a cat and mouse game, so you're wiser not to put things behind an API that you're really not happy for people to play with.
I suspect this is why we've traditionally seen banks (in the UK, at least) use web-pages-embedded-in-apps rather than true native apps.
You should expect APIs to be public. For a sufficiently popular service, you should expect people to utterly replace (emulate or rewrite) your fine client and do things that you don't expect, which means implementing things like rate limiting and blacklisting on your server, where you can control things.
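Server-side rate limiting is the part you actually control - a minimal token-bucket sketch in Python (one bucket per client; the rate and capacity numbers are purely illustrative):

```python
import time

class TokenBucket:
    """Simple token bucket: refills at `rate` tokens/sec, allows
    bursts up to `capacity`. Keep one instance per client/API key."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; otherwise reject the request."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A rewritten or emulated client still hits this limit, which is the point: enforcement lives where the attacker can't edit it.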
I get that some developers want to protect what they consider a private API, but consider it from the other side.
As a user, I think I have the right to know that you're not secretly uploading my contact list to your servers. Anything you do to block that prohibits me from trusting your app.
There must be some kind of happy medium where I can protect myself from malicious apps, and developers can protect themselves from malicious users too.
I love how the article's closing remark is about releasing source-code, yet the entire rest of the article talks about a closed-source and proprietary program. ;)
Pretty cool idea, though. I reckon this is doable with something like mitmproxy[0], which is open-source, and it would certainly be interesting to poke around in some of these hidden APIs.
Lawsuit on basis of what? IANAL, but I've heard many times that ToSes/EULAs which forbid you from querying their servers and reverse engineering their apps aren't legally binding in most jurisdictions.
A lawsuit would likely be based on copyright/database rights infringement if you are accessing a data source which the company only makes available under specific conditions which you are bypassing.
Craigslist v Padmapper/3-Taps in the US is a slightly analogous case albeit with 3-Taps scraping rather than bypassing restrictions on an API.
But - unless the data's included with the app - is that app author's responsibility? That sounds like suing Bram Cohen for illegal downloads over BitTorrent.
BitTorrent is completely neutral and just a mechanism for distributing content which can be contrasted with the others. Anywhere where there is a deliberate attempt to bypass restrictions on the availability of data then you may get into trouble. Even more so when you bypass then attempt to monetise like with 3Taps...
There's often more than one piece of software to do something and that's OK. I've hacked up simple MITM proxies myself more than once in the last 15 years.
Charles is not new, it's been around more than 10 years.
I think that generally saying 'what does X do that Y doesn't' comes across as fairly negative, and unless there's some obvious reason why Y should be the default adds little.
I'd never heard of burp, but I had heard of Charles. Is there some obvious reason why burp should be the default?
Never heard of either, but just used both, and the free version of Burp seems complete enough. Charles is restricted when free (a license is $50) but looks better.
edit: additionally, Burp uses a custom certificate instead of a default one for all Charles users
Nothing really - I suspect that Charles is more approachable for the average person though. And besides, I hadn't spotted any alternatives when I wrote the article ;)