Grindr Shares Personal Information With Third-Parties (github.com/sintef-9012)
840 points by tjwds on April 2, 2018 | 308 comments



Even within its confines, Grindr's data are rich for blackmail. (Consider: images and messages sent and received within 100 feet of Capitol Hill.) It was recently acquired by an offshore billionaire [1].

[1] https://www.bloomberg.com/news/articles/2016-01-12/china-tec...


Yikes. Combine this with the Chinese OPM hack, where extremely personal data for most government employees with security clearance has been stolen, and they have a blackmail goldmine on their hands.

https://en.wikipedia.org/wiki/Office_of_Personnel_Management...

They can verify what has been reported on the SF-86 vs. what might be found from this service. Any affairs or issues not reported are ripe for blackmail. The possibilities are endless, as they say.


You know, maybe we'll actually see some legislation on personal information and privacy and how it must be handled over time and in acquisitions sooner than later. I thought Facebook's problems were promising in this respect, but with the additional disclosures happening so often, and especially in situations like this, we might see some real change for the better soon.

If so, I consider everyone whose Grindr info was stolen, or even just acquired by individuals with fewer moral qualms about selling it to those who would use it for ill, to be patriots who suffered for us all.


> You know, maybe we'll actually see some legislation on personal information and privacy and how it must be handled over time and in acquisitions sooner than later.

Depending on how many influential people they can blackmail already, it could be too late. A would-be blackmailer isn't going to give up their method of control.


A would-be blackmailer isn't going to want competition. They've already got their material, and a law changing future exposure would have little to no effect on them, so get it while the getting's good, then close the door in the face of the next guy.


I feel like it's only a matter of time before we see some devastating compromise(s) of security using the OPM hack and/or hacks of private data sources.

When you look at some of the common methods intelligence adversaries use to ensnare assets (extortion, money, exploiting sympathies, stroking egos, revenge, etc.), the information in these hacks is a treasure trove the likes of which may never have been seen before. I can't imagine the various powers that be in the US Gov't are ready for the espionage threat posed by this.


> I feel like it's only a matter of time before we see some devastating compromise(s) of security using the OPM hack and/or hacks of private data sources.

I'm willing to bet that the devastating compromises thanks to the OPM hack have already happened, and that they're either classified, or, even scarier, the USG doesn't even know about them yet (e.g. people blackmailed into being spies who haven't yet been caught).


Yeah, it is really frightening to think about. Cautious moles were hard to catch even before some of these breaches. There could be an exponential increase in compromised people in cleared environments and, like you said, we might not even be aware of it yet.

And even then, our counter-espionage resources in all likelihood are not enough to address a massive increase in the threat.


We should look out for suicides. Someone who's given the choice between destroying his family and betraying his country might be pretty likely to off himself.


If the Chinese have compromising information, the NSA and/or CIA probably has the same and more. I think it's unlikely that we don't "know" about it, in fact could be using it as a way to turn compromised individuals into double agents.


Think of it as a business opportunity. Blackmail as a Service™!


The free market in full swing!


BMaaS


Uh, none of this would be on an SF86. You don't report affairs or dating on an SF86. They could still correlate membership and expose anyone who is private about their sexuality or hobby, but they won't be able to say "omg you didn't tell the government you banged Paul!"


You're right, that specific stuff is not on the SF-86. But my pessimistic take is that more than just the SF-86 forms were exfiltrated. From the wiki I saw "that OPM systems containing information related to the background investigations of current, former, and prospective federal government employees, to include U.S. military personnel, and those for whom a federal background investigation was conducted, may have been exfiltrated". I read that as notes from the people doing the background checks, which would include some of that kind of information.


Why is this a blackmail goldmine? I am ignorant of Grindr, but from what I have heard it is basically Tinder for gay men. Would Tinder be blackmail data? It's hookups and relationships, dating, etc., no?


> but from what I have heard it is basically Tinder for gay men. Would Tinder be blackmail data? It's hookups and relationships, dating, etc., no?

Absolutely, when cross-referenced with information from the OPM database. An employee with a TS/SCI clearance working on <military project X> hasn't reported his gay affairs (that even his wife might not know about). The Chinese find out, approach him and make him an offer: "Look buddy, how 'bout we become friends. You just tell us what you know and we'll give you some money and, most importantly, keep quiet about your meetings with your secret partner on such and such dates or places?"


And that's why lack of blackmailability is one of the biggies on clearance applications.

Or, "Lets not become friends. You can tell my wife. She thought it'd be fun to do that!" aaaaand so much for that blackmail. And just don't forget to tell the security officer that someone tried funny shit.


Normally, yeah, nothing wrong with the lifestyle as long as it is declared and known, with no fear of it being exposed. Except that the Chinese got their hands on the OPM database, so they can tell what is declared there and what isn't. If someone didn't disclose these things there, they'd still hopefully run to the security officer and report the approach, but they might think again, since they'd be in hot water as well for lying before.


Being gay is still a sensitive subject in the U.S., more so in many other parts of the world.

And I'd suspect that there are more than a few married (nominally heterosexual) men seeking out extramarital affairs on Grindr.


It is more than a sensitive subject in certain other countries, and I can only imagine the horror of someone who is in the US as a visitor of some sort being revealed to be a Grindr user.


You really can't imagine any scenario in which this information could be used for blackmail?


Makes me think that data should be treated differently in the case of a potential new owner, allowing users to opt out of their data being transferred; you may trust the current leadership and governance, and may not trust whoever takes over or whoever newly gains access to the data.


Change of control is a huge problem when it comes to privacy-sensitive data. That's also a big factor in why I have a problem with governments collecting data. You may trust the present day administration (or not...), but who knows what a future administration (or occupying force) is going to be up to (besides the obvious risk of a leak or a breach).

The same goes for situations like bankruptcies and 'bad leavers'; there is absolutely no way, short of airtight procedures and implementations, to guard your privacy for the length of time the data resides with some company. And for most companies that means 'forever', so even the smallest chances tend to sooner or later materialize. In almost every company that I've looked at over the last couple of years (including medical, fintech and so on) there was always at least one person, and usually more, with unfettered access to all the data, either in bulk or through some convenient interface.

I'm not even sure this can be fixed anymore, but I am also not going to give up.


> That's also a big factor in why I have a problem with governments collecting data. You may trust the present day administration (or not...), but who knows what a future administration (or occupying force) is going to be up to

The standard example is how the Netherlands were rather religiously tolerant, but used to keep track of Jews for tax collection purposes before WW2. Then when the Nazis arrived, they had all the records they needed to achieve their goals.

Who knows what will happen in the future, and how currently harmless data could be used by new actors then.


> The standard example is how the Netherlands were rather religiously tolerant, but used to keep track of Jews for tax collection purposes before WW2.

I wrote about that exact example:

https://jacquesmattheij.com/if-you-have-nothing-to-hide


Why did they keep those records? Different tax rates for different religions? Seems like none of their business.


I thought it was for this reason, though the sibling comment's link seems to indicate that we don't really know (and I trust him more than my memory).

It used to be common to tax different religions in a different way in some countries of Europe. Even today in Germany you are legally required to pay taxes to your church, and I have read multiple accounts of French expatriates who have discovered this only after a year of working in Germany, and getting significant amounts of back taxes to pay, as well as being required to be debaptised, etc, if they wanted to stop paying for that.

In France that would be unthinkable, but every country has its customs.


The way in which the German government has been co-opted by the church to act as their debt collectors is fairly disgusting and no longer of this day and age.


I don't agree that such a tax is a good idea, but consider the following:

1. If it bothers the Germans, why don't they change it? Is there some concordat that makes it difficult?

2. Do I really have a right to comment given that I'm not German? Does it really make sense for you to apply what are presumably post-revolution, secular French standards to Germany?

3. "co-opted by the church to act as their debt collectors " -- sounds conspiratorial. You think the Church is all-powerful and able to do that? Tithing isn't new.

4. "disgusting and no longer of this day and age" -- appeals to time period are rather silly. We're not talking about fashion, and it's important not to fall prey to the Idol of Progress.

5. Interesting that more religious countries, like Germany's neighbor Poland, have no such tax.


> If it bothers the Germans, why don't they change it? Is there some concordat that makes it difficult?

The church is still very powerful in Germany.

> Do I really have a right to comment given that I'm not German?

You have a right to your opinion about what is happening in a country even if you are not living there.

> Does it really make sense for you to apply what are presumably post-revolution, secular French standards to Germany?

It is fairly exceptional to see the state harnessed to do a private institution's bidding like this.

> "co-opted by the church to act as their debt collectors " -- sounds conspiratorial. You think the Church is all-powerful and able to do that? Tithing isn't new.

That it isn't new doesn't mean there is a place for it today. The state and the church should be firmly separated. In Germany this is - not yet - the case.

> "disgusting and no longer of this day and age" -- appeals to time period are rather silly. We're not talking about fashion, and it's important not to fall prey to the Idol of Progress.

It's not the 15th century any more.

> Interesting that more religious countries, like Germany's neighbor Poland, have no such tax.

My point exactly.


Well, if you don’t think it’s a big deal, you may collect it just to have the data (data was as appealing to bureaucrats 200 years ago as it is to big-data engineers today). For example, the DMV asks for your height and weight today, but if a fat-hating government came to power and decided to kill all fat people, they would have a target list built from data that was of very questionable value for the DMV to collect in the first place.


To be able to give the proper last rites absent any other family. Just in case...


EU's data protection law covers this.


If so, that's great.


You can delete the data from your profile.


Do you have any idea how hard/illegal it is to hack into those servers and manually delete every file relating to JUST your profile without alerting an admin?


You forgot the cold storage backups. It's just plain impossible with anything more serious than a hobby project.


Made me laugh - thank you.


lol, that means nothing. It most likely means that data is marked as deleted but the original values are retained.


Well, you could also just not fill it out in the first place.

To be clear, this is public profile information displayed to any user of the Grindr app.


Except that if a user fills in a Grindr profile, they believe they are only sharing information with Grindr/its users. The truth is that they share all of the information (in less than secure ways) with a variety of third parties and fail to disclose that fact. That's illegal in many countries (including mine) and it demonstrates an unbelievable lack of regard for user privacy or safety.


> That's illegal in many countries (including mine) and it demonstrates an unbelievable lack of regard for user privacy or safety.

There's a very strong chance that this is illegal in many states in the US as well (not sure about federal law).


Not only does it get shared, it gets shared in bulk. That's a huge difference between the intended use and what actually happens.


Correction: you can mark it as deleted.


Reminds me of the relationship between the Mafia and the homosexual community of the early-mid 20th century. The Mafia didn’t particularly like gays, but they ran all of the gay bars because it gave them the opportunity to blackmail their patrons.


> The Mafia didn’t particularly like gays, but they ran all of the gay bars because it gave them the opportunity to blackmail their patrons

Source? Curious to read more about this.


The Mafia controlled most gay bars due to their illegal status, and extracted a monetary premium from the gay community. This recognized both the legal risk the Mob was taking and the near-monopoly status it enjoyed. After all, where else were gay folks going to meet? There were often high cover charges and minimum drink requirements. Moreover, gay men were at risk of blackmail from their Mob overlords.

https://www.vice.com/en_us/article/gqmym3/how-the-mafia-once...

I had never heard this before, but it is easy enough to find it. There are at least a couple of other articles that readily came up.


The "Stuff you should know" podcast discussed that on the stonewall uprising episode.

Blackmail probably existed but it was just a great opportunity to sell drugs and booze in illegal bars.


More than that... Bars are great targets for mafia types. My grandfather ran a bar in NYC in the 40s-70s... Any business concession other than the phone company was mob controlled. The cigarette machine, jukebox, tap cleaner, pinball, and ice were all mob controlled.

When you hit the nadir of NYC in the 70s, it was difficult to make sales tax payments with all of the leeches attached to the business.


"At the time of the 1969 Stonewall Uprising, the Greenwich Village bar Stonewall Inn was owned and operated by the New York Mafia."

http://www.pbs.org/wgbh/americanexperience/features/stonewal...


The fact they could be blackmailed means the rest of society didn't like them either. If a minority is doing something today that the rest of society thinks is wrong but actually isn't, then don't we already have a more serious problem by persecuting whoever is the modern equivalent of early 1900s gays?


Totally different things and it’s almost an offensive comparison.

Someone’s HIV status, for example, could definitely expose them to social stigma. Is that a problem? Yeah; but it doesn’t change human behavior — there are some things we would prefer to keep private.


Wasn’t that the case with Jack Ruby?


I've hypothesized that the recent blast of negative press about Facebook and other social media companies like this is driven in part by the realization at the heights of US intelligence that these massive surveillance honeypots are a threat to US national security.

If that's the case I agree with them. How many people with secret and top secret clearance have exploitable (or compromised by design) 'smart' devices in their home or could be blackmailed using data in possession of these services? Then there's the whole mass election manipulation angle which just adds to the problem.

The social and 'smart device' panopticon opens the potential for a completely remote cyber-invasion and takeover of the country by a foreign actor. It would be the first full-scale invasion with not only no shots fired but no actual physical army on the ground. I don't think this is really all that sci-fi.


Makes me wonder, what would it take to legislate the use of personal information?

Maybe we are getting close to this point?

Seems like situations that result in direct pressure on the legislators would be more productive than ones involving us, their “employers”...


It’s already legislated in Europe. The new law, GDPR, is coming into force in May across the EU.


Another data point on why you should not take money from or sell your business to Chinese investors. Their price points for companies are strange (e.g. a lot higher than Western pricing), and it would make sense that they're doing it for other, perhaps more nefarious, reasons.


If I’m selling my company and a Chinese buyer offers me more, I’m not going to turn him down just because he’s Chinese.


The question doesn't mean much without knowing more details.

In general, most people would say that the short-term financial outcome is not the only important factor.


They may say that, but they're probably still going to take the short-term financial outcome because there's no discernible long-term downside for them.


Your patriotism and integrity are noted. The things some people will do for a buck tells you all you need to know about unregulated trade.


Acquiring a company for its data is not something that's limited to China.


He seems to be suggesting that they place an abnormally high valuation on that data, which suggests (maybe I’m misinterpreting?) nefarious aims.


Reminds me of the old KGB tactic where they would have agents seduce politicians and businessmen into compromising (recorded) sexual situations, then use it for blackmail. I find it inevitable that many acting politicians on both sides of the aisle are being manipulated in this way.


-Now, I know you're familiar with the honey pot.

-Seducing and blackmailing a hot female enemy agent. I love the honey pot.

https://www.springfieldspringfield.co.uk/view_episode_script...


In this article, an investigative Romanian journalist covers the story of the "Shanghai" network in Bucharest. For 13 years, this network has been supplying dozens of high-ranking Romanian officials with underage girls, who in turn were instructed by the Chinese to film the sex scenes with hidden cameras, obviously for blackmail. The leader of the Shanghai network now operates the only working copper mine in Romania, extracting 100% of the copper and exporting it to China, while his citizenship was revoked 4 years ago at the request of Romania's Intelligence Service because of actions undermining the state's democratic institutions: http://www.tolo.ro/2018/03/24/de-13-ani-filmari-cu-zeci-de-i...

Seems like blackmail works just fine!


That’s the business model of Cambridge Analytica nowadays!

https://www.channel4.com/news/cambridge-analytica-revealed-t...


Who needs to hack government systems anymore!?!?!?


It might be worth making another post to highlight an additional concern: this repository itself appears to leak profile images of many Grindr users. The raw-data folder includes nearly 14,000 files, including many ads and scripts, but also thumbnails of many user profiles. This file[1] for example, once you strip out the HTTP headers, is a JPEG that shows the legs and gym socks of one user. This one [2] shows a user's bare torso.

I would link to others, but most of the ones that I've found include clear views of users' faces, sometimes clothed and sometimes shirtless. In some cases it looks like the photos were taken in their homes. It's ironic that in exposing Grindr's mishandling of users' personal data, this party appears to have mishandled personal data themselves.

[1]: https://github.com/SINTEF-9012/grindr-privacy-leaks/blob/mas...

[2]: https://github.com/SINTEF-9012/grindr-privacy-leaks/blob/mas...
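
(For anyone curious about the mechanics: a minimal sketch of the header-stripping step described above, assuming each raw-data file is a verbatim HTTP response capture in which the image body follows the blank line that ends the headers; the filenames here are hypothetical.)

    # Hypothetical sketch: recover an image body from a raw HTTP response dump
    # by dropping everything up to the blank line that terminates the headers.
    def strip_http_headers(raw: bytes) -> bytes:
        header_end = raw.find(b"\r\n\r\n")
        if header_end == -1:
            return raw  # no header block found; return the data unchanged
        return raw[header_end + 4:]

    with open("captured_response.bin", "rb") as f:
        body = strip_http_headers(f.read())

    # JPEG data starts with the bytes FF D8.
    if body.startswith(b"\xff\xd8"):
        with open("recovered.jpg", "wb") as f:
            f.write(body)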


Somebody needs to write a Peter Thiel / Not Peter Thiel script, and cross reference the photos with his Friendster profile pic where he was shirtless on a boat advertising for twinks (pro tip for people who think he was outed by Gawker, he was one of the most obvious closet cases in San Francisco).


Since SINTEF is based in Norway, it's certainly possible that my profile picture is somewhere in there, and that really pisses me off. I'm out and have no problem with people knowing that I'm gay, but I shouldn't have to deal with "researchers" pulling this type of crap.


Are you happier with teens in Slovakia pulling this type of crap? If it isn't locked down someone has already scraped it and saved it forever. That you don't see most of the data being stored about you means nothing.

Just another data point explaining why in 10 years you got pulled off that Emirates plane during a layover in Dubai and were never heard from again.


To me that's the slippery slope argument.

Sure, someone else could have done it already, or could be doing it right now. But SINTEF is supposed to be a reputable research organization, and I think it's more than acceptable to expect them to properly handle any data they gather.

Grindr faces significant hurdles in improving security and complying with GDPR, and we do need to hold them to account, but that does not absolve SINTEF of their responsibilities either.


Good job finding sensitive information in the repo, but I hope you ensured that the data was deleted before making this issue public.


If you are using Grindr on Android, install and use NetGuard.

https://play.google.com/store/apps/details?id=eu.faircode.ne...

https://github.com/M66B/NetGuard

NetGuard is an open-source local VPN that allows you to block DNS lookups to prevent calls to third parties, and it does not require root access.

Calls to all of the third parties mentioned are blockable. Grindr does not need many domains to work, just its own domains (.grindr.com on 443, grindr.mobi on 443) and a couple of Google static domains like csi.gstatic.com on 443.

Of course this does not prevent Grindr from rolling up the data and sharing that with 3rd parties, but the linked analysis suggests that this is all via the app making calls rather than the company selling it in bulk.
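
The idea is an allowlist: permit only the hosts the app actually needs and treat everything else as blockable. A minimal sketch of that logic (a hypothetical helper, not NetGuard's actual rule format), using the domains mentioned above:

    # Sketch of a domain allowlist: permit Grindr's own hosts plus the Google
    # static host it needs; any other hostname is a candidate for blocking.
    ALLOWED_EXACT = {"csi.gstatic.com"}
    ALLOWED_SUFFIXES = (".grindr.com", ".grindr.mobi")

    def is_allowed(hostname: str) -> bool:
        hostname = hostname.lower().rstrip(".")
        if hostname in ALLOWED_EXACT:
            return True
        return any(hostname == s.lstrip(".") or hostname.endswith(s)
                   for s in ALLOWED_SUFFIXES)

    # Grindr's own hosts pass; third-party analytics hosts do not.
    assert is_allowed("grindr.com")
    assert not is_allowed("profile.localytics.com")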


AFWall is a pretty user-friendly and configurable firewall. Lets you easily block apps from accessing the web.

https://github.com/ukanth/afwall


AFWall requires root, while NetGuard doesn't. Important distinction now that rooting affects ability to use features like Google Pay, Netflix (offline download), etc.


Rooted phone, no google services, and netflix downloads are just fine.


SafetyNet is a cat and mouse game. A lot of people don't have the patience for it.


I've been running Magisk since maybe August or September last year and I have not had a single issue with safetynet.

I have netflix installed (after rooting) without issue, and I use Android/Google pay practically every day along with another app that requires safetyNet without any issue at all.


I've never had a problem over months of using Magisk.


Systemless root (su lives in the boot image rather than in /system), like SuperSU or Magisk.


No android pay either.


I am a huge AFWall fan, but it is not helping in this case. Grindr users want it to access the internet for matches. AFWall cannot block hosts selectively.


I use DNS66; how does this compare to NetGuard?

https://f-droid.org/en/packages/org.jak_linux.dns66/


I've found Blokada to be much more configurable. Just switched from DNS66 to blokada today.

https://blokada.org/index.html

https://f-droid.org/packages/org.blokada.alarm/


I'll try it thanks for the links


A slightly off-topic, related question: does anyone know of an Android app that does not require root and that can find and kill (preferably configurable so it happens on boot) apps that run in the background?

For example, AirDroid: it is very useful occasionally, but I don't use it 99% of the time, and it is still there running and killing my battery life. Perhaps I can better configure it, but that's not the point - I'd love to have a solution that puts me as the device owner back in control, I'd like to see what's running, I'd like to auto-kill things I don't want.


I think that's the purpose of Greenify.


Realistically, if they don't care about users' data, what hinders them from collecting data on .grindr.com and grindr.mobi and sharing it with third parties?


Man, those features should be part of a system app in the stock Android ROMs. Privacy tools need more public exposure.


So vending HIV status is a straight up HIPAA violation, I'm fairly sure that's been found to be the case over and over again -- it doesn't matter what your business is, health information is covered by HIPAA.

That's a $250k fine per violation, and leaking status, positive or negative, is a violation. And every person, and every time they pass that information to every "partner", is a distinct violation.


> So vending HIV status is a straight up HIPAA violation

If the vendor is a HIPAA covered entity, which Grindr isn't.

> it doesn't matter what your business is,

Yes, it does.

> health information is covered by HIPAA.

PHI held by HIPAA covered entity or by a business associate on behalf of such an entity, sure. Health information shared by the subject outside of a healthcare context, OTOH...


via https://www.hhs.gov/hipaa/for-professionals/security/laws-re...

"The Security Rule applies to health plans, health care clearinghouses, and to any health care provider who transmits health information in electronic form in connection with a transaction for which the Secretary of HHS has adopted standards under HIPAA (the “covered entities”) and to their business associates."

via https://www.hhs.gov/hipaa/for-professionals/privacy/index.ht...

"The HIPAA Privacy Rule establishes national standards to protect individuals’ medical records and other personal health information and applies to health plans, health care clearinghouses, and those health care providers that conduct certain health care transactions electronically."

in what circumstances has HIPAA been found to apply to businesses other than those?


It applies to researchers.


You aren't bound by HIPAA just because a user hands over health-related information. You will, however, fall under it if you are sharing that data with a covered entity.


Grindr isn’t claiming HIPAA compliance, so it doesn’t apply to them. They’re not a healthcare provider (or covered entity), so there’s no requirement for them to obtain HIPAA compliance.


If an acquaintance tells me he has AIDS and I tell someone else, am I in violation of HIPAA?


Not in your personal capacity, no. As mentioned in the other comments to this parent, HIPAA only applies to "covered entities" like doctors that take insurance and insurance companies, and their "business associates" that process PHI on their behalf.


This is like saying Twitter is liable if you tweet your own status.


That's not even close to the same thing. You're comparing involuntary and unknown sharing of personal data with explicit and self-actioned sharing of that data.


The user volunteers their health information into the public domain when they tell Grindr their HIV status. This is information that is already visible to other users to some degree.

Declaring your HIV status on Grindr is voluntarily and knowingly sharing your own health information into the public domain. It is much closer to tweeting it out than telling a medical professional imo.


Valid point; I hadn't considered it that way. There's still a large difference in audience between the two but ultimately you're surrendering the information to unknown parties.


Only if you are a "covered entity"

HIPAA has nothing to do with Grindr unless it starts an acute care clinic.


And that's just US law. Any data stored or transmitted through the EU is subject to even stricter laws.


Does anyone have any information on how Scruff handles that information? Also, does HIPAA say anything about technology companies outside of the medical field that may voluntarily collect HIV status?


CEO of SCRUFF here, and daily reader of HN. We have not, do not, and would not share this information with third parties.

The data that we share with our third-party ad providers is:

- Your location (so you can get those local car dealer ads)

- Your gender

- Your age

- The targeting keyword "gay"

We currently use AdMob and MoPub to provide our network advertising.

More broadly, this kind of information is never something we would share. We know the sensitivity of HIV status, and know that it has been used to discriminate against our community in the past. When we do take money from direct advertisers (full-screen ads shown at launch), we make sure that they are promoting relevant and beneficial products for our community, and the ads they place are serviced 100% in-app and come with no extra data or API calls.

Ultimately, our business model is based on subscriptions, which means we are successful when we make software that people love to use. We don't spend our days trying to squeeze a marginal penny out of some remainder-bin ad unit by trading personal data of our users. Instead, we spend our time focused on how to make an excellent product, that works reliably, is free of spambots and harassment, and connects gay guys with each other and the global gay community.


Thanks for the reply! Long time subscriber, first time caller here. What are you doing to protect the information of people who aren't out, or even gay, on your platform? Do you share whether or not someone identifies as trans, bisexual, dl, msm, straight, etc.? You have information that's just as sensitive and private as HIV status, and from your reply it seems like you haven't considered it. I love your platform and would hate to see anyone outed for any part of their identity by data you supply to an ad network.


How granular is the location data you share, on the scale from postcode to lat/lon?


For these third-party ad networks, they accept a lat/lng, so we (and I suspect most apps) just pass in a value obtained from the device.

That said, upon further reflection, a local advertiser could surely get enough targeting information with a much, much less precise value...we'll look into making this change in a future release.
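
For what it's worth, a minimal sketch of what that coarsening could look like (hypothetical helper names; rounding to two decimal places of latitude is roughly 1 km of precision, which is usually plenty for local ad targeting):

    # Hypothetical sketch: round device coordinates before handing them to an
    # ad SDK. Two decimal places of latitude is roughly 1.1 km.
    def coarsen_location(lat: float, lng: float, decimals: int = 2):
        return round(lat, decimals), round(lng, decimals)

    precise = (37.78793, -122.40742)   # example fix in San Francisco
    print(coarsen_location(*precise))  # (37.79, -122.41)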


Weigh that privacy consideration against the value of CPGs looking to spend shopper marketing advertising dollars in-store, just sayin'...


Is any of this data shared with third parties for paid users of the app?


Our paid users do not see third-party ad network ads, so no.

There are other third party services we use that may see various pieces of information from free and paid members. Stripe, for example, manages some payment processing and collects some user data; ZenDesk manages support tickets for us and also collects various pieces of user data.

In the coming months we will be sharing more information about the third parties with whom we work and on whom our service is built, and how data is shared between them. When it comes to ad networks, however, our integration is limited to what I mentioned above.


> Also, does HIPAA say anything about technology companies outside of the medical field that may voluntarily collect HIV status?

No, HIPAA binds only covered entities, which are (basically) care providers, insurers, and certain other parties in certain business relationships with care providers and insurers.

If you give out your health information to a dating service, it's not protected by HIPAA.


> No, HIPAA binds only covered entities, which are (basically) care providers, insurers, and certain other parties in certain business relationships with care providers and insurers. If you give out your health information to a dating service, it's not protected by HIPAA.

This is correct, though it's worth noting that HIV status is actually protected under more strict terms than just HIPAA, and that may in fact apply to Grindr.

There are a lot of laws at the state-level which restrict the ability to collect, record, or pass on information related to an individual's HIV status even when none of the parties involved are covered entities (or business associates of covered entities).


> HIPAA binds only covered entities

Covered entities are specifically defined as: "(1) A health plan"; "(2) A health care clearinghouse"; or "(3) A health care provider who transmits any health information in electronic form in connection with a transaction covered by this subchapter" [1].

[1] https://www.gpo.gov/fdsys/pkg/CFR-2017-title45-vol1/xml/CFR-...


Then why are research institutions included?


They aren't, as such, however they usually get data through business relationships with providers or payers (or, in many cases, are providers.)


That's not to say there may not be other laws that govern how you protect a user's data though.


Such as? I'm not under the impression that there are any legal prohibitions on this kind of thing in the US.


The first that comes to mind is the Federal Trade Commission Act, which allows the FTC to charge companies which fail to protect consumer personal data.


That's the problem.


For now at least...


From Scruff's privacy policy: "For example, we share your device identifier and demographic information with our advertising display and analytics partners."

The whole thing seems pretty bad (privacy-wise) https://www.scruff.com/privacy/


Hi -- See my reply above for a list of what we share. HIV status is not part of this data, and indeed is not something we directly collect.


I don't get why they don't explicitly state HIV status. I'm assuming it's covered by "Any other information you voluntarily provide us with in connection with your registering and using the Service.", but again, I'm not 100% sure.


I believe HIV status is a special protected piece of information and consent needs to be given before you can share it.

http://www.aidslawpa.org/get-help/legal-information/confiden...


Those laws apply to healthcare providers, not to social networks.


Well, in the case of PA (as an example), I believe they have a `Subsequent disclosure prohibited` portion of their statute which covers anyone who has the data, preventing them from further sharing it without consent.


For HIV status disclosed as part of medical services. In the case of Grindr, HIV status is voluntarily collected as demographic data for its social network.

It's also available to anyone who sees the profile in their search results. As far as I can tell, you can't have a profile that is active, use it, and not show up in search results.


The "I" in HIPAA seems to go ignored when talking about health data. If that letter doesn't apply, then HIPAA almost certainly doesn't matter.


I don't think HIPAA would have any jurisdiction in this area.


I'd be interested to see how Scruff/Jack'd/etc. stack up. My guess is Scruff does better (it has always been a better designed/developed app), but I understand why they focused only on Grindr, as it does have the largest market share (admittedly a guess).

Grindr has never been exactly a bastion of good programming... Their app has always been subpar at best, with infrequent updates, months- or year-long bugs, terrible UI/navigation, and a lack of features that could be coded up in a week's time that would GREATLY improve the experience (message archival/hiding), and I could go on. It would be one thing if these features were relegated to the paid version (Grindr Xtra), but the only really big feature for Xtra is push notifications for when you get a new message.

All of this is to say the fact they are using HTTP to talk to these analytics/ad companies doesn't shock me at all. My bet is they haven't updated the libraries for these services in forever (which wouldn't be too hard to investigate).

As for HIV status getting sent, it really depends on the service. They are not subject to HIPAA (even if you wish they were), so they can do this, and I'm sure for targeting ads it makes sense. No need to waste ad dollars on "Get tested for HIV" for people who already know they are positive. As someone in this community who knows the orgs that pay for some of these ads are severely underfunded, I have a hard time saying it isn't important to make sure your ad dollars go as far as they can.

Lastly, for people saying "just don't enter your status": you clearly don't understand this community, I'm sorry. People who are positive face a HUGE stigma. Chatting on Grindr/Scruff is already an emotionally draining experience in a lot of cases; I don't think you all want the details, but let's just say failed conversations (for most people at least) don't exactly fill you with confidence/self-worth (yes, there is a whole other discussion to be had there, I'm sure). So waiting until you start a conversation to tell someone you are positive (instead of it being in your profile) is going to lead to even more failed conversations. If I were positive, I think I'd trade my status away to analytics/ad companies in exchange for not having to talk to people who aren't interested in the first place. I'm saying that as a white male living in the US, so depending on your situation you may disagree.


the CEO of Scruff replied elsewhere in this thread:

https://news.ycombinator.com/item?id=16738283


Grindr has the largest share of users by far. I suspect SCRUFF is #2, both because it is almost as old and because they clearly skew towards a different segment of the community. I assume that Jack'd and Hornet are more popular in certain geographical areas.

But I agree with pretty much everything you say, except that I want to emphasize that getting people to enter their status is crucial in helping to normalize regular testing, safe sex practices, and allowing HIV+ individuals to be open members of the community.


It's kind of funny, and this wasn't the case until I educated myself on it, but in some ways I trust/feel safer with people who are positive more than those who aren't. Having sex with someone who is positive but undetectable (even unprotected) is better than having sex with someone who doesn't know their status. Even putting HIV aside, people who are positive are being tested much more regularly than the general public. Similarly for people on PrEP, as they are required to get tested (only for HIV, but I know a number of people who also take the opportunity to get tested for other things) every 3 months.


A bit unrelated, but imagine how much data Tinder has collected, if Cambridge Analytica could do that much with just a comparatively unpopular quiz app.


The quiz app wasn't the real data source though was it? People who want to take the quiz have to press "yes" on a menu that vaguely mentions "We need your friends list" and things like that, and those permissions were used to harvest the data as far as I understood, not the quiz itself.


Yes, the quiz was the bait to get to the real prize, your friends list and their info, likes, etc.

However, your answers to a different Facebook quiz in 2014 were scrutinized by Cambridge Analytica, who were looking for the “dark triad” of personality traits, scouting for sociopaths:

https://mobile.twitter.com/carolecadwalla/status/97590547206...


Yet Tinder is just one of the 150 online brands owned by IAC, a company created by legendary TV executive Barry Diller. He probably knows the value of the data they have.


Wow, today I learned: Tinder, PoF, OkCupid, Match.com and more are owned by the same company.


Yeah, and OkCupid's blog had a post that was very critical of Match but mysteriously disappeared when they were bought out.


And for awhile they were owned by one of the most evil companies of them all... Ticketmaster!


Correct me if I'm wrong, but wasn't the quiz app just a way to gain access to a user's data? So it wasn't so much the content of the quiz that was revealing, just the access to data it provided.


I'm pretty sure their "psychometrics" play required the data users gave them as part of the quiz. The whole point of the exercise was to supposedly figure out people's personality traits from their social media profiles and let marketers target them based on that, and I think the personality quiz was probably the only way they could get ground truth information to train their model on.


We need a new business model for social media, one which actually serves the customer instead of trying to lure them into productizing themselves.


So you're going to pay to use it, right? If this imaginary product managed to get 1% of Facebook's users to hand over their credit card, it would be the most wildly successful pay-to-use application in existence. It will never happen. Social media sites rely on mass adoption among those who refuse to pay for just about anything.


If "no pay" is an option most people will take it, if it isn't an option because the mechanisms that make it possible are banned then people will pay - or there will be no service.

This isn't physics; the laws of user behaviour are mutable, and legislation is completely possible.

The outcome of the erosion of privacy is the end of our ability to adapt as a society; there will be no new ways of doing things or dealing with change. This will be bad; we have some very nasty societal problems coming, and we need lots of new ways of dealing with them.


I understand the privacy issues are horrible, and places like HN are super vocal about it. But the overall majority either don't know, or don't care, or mostly both. People dump their entire lives on to the internet, hoping for some type of validation from their peers. Banks destroyed many lives ten years ago, and yet we bailed them out, and are still using them. People in the end just don't care. Especially after time has passed.


Banks are necessary for modern economies. Social networks, not so much.


Which is why protections must be put in place. If a business model is detrimental to the health of a nation, then maybe the laws need to change to prohibit such a business model from preying on the citizenry.


Grindr has a paid-for upgrade, called Grindr XTRA. I wonder if paying users, who don't see ads, also don't have their data shared?

https://help.grindr.com/hc/en-us/articles/115008879108-What-...


People will hate this suggestion on ideological grounds but a mandatory tax to fund an NGO is the way to go. Something like a $1/person/year tax is already plenty of money if you are just trying to maintain bare bones social networking platforms, without all the added surveillance and advertising machinery.


NGO?


"Non-governmental organization", kind of an odd term to apply to an organization defined by the government and funded through taxes.


> So you're going to pay to use it right?

Yep -- I'm willing to pay $5/mo to host stuff. That's roughly 10% of an average cell phone bill, or less than 1% of rent (even outside the cool places).


Just as some people pay for Fastmail, I am pretty sure there'd be some people paying an entrance fee to a social network.

Wasn't the social network of last week, Vero, supposed to be like that?


Network effects are brutally hard to overcome.

https://www.digitaltrends.com/social-media/app-net-shut-down...


It's not network effects. It's hostile copyright and network access law. There's no reason that you shouldn't be able to interact with the content streams from both Twitter and App.Net at the same time, for example, except that our legal system allows the dominant company to sue the scrappy upstart into oblivion if they begin to do so. This is not theoretical; small companies are destroyed frequently by BigCo breaking out the lawyers for supposed violations of the CFAA and the Copyright Act.

This sorry legal situation has completely obliterated meaningful competition in the online space, and we are letting them get away with it when we just hand-wave that it's because of "network effects". There's no reason that YoungSiteA shouldn't be able to act as a user agent on my behalf to access and reskin OldSiteB. If it could, the negative impact of switching providers would be small.


Email is a poor comparison, because you're connecting to millions of other people using Gmail/Yahoo/whatever, for free. The correct comparison would be if email were a universally paid-for service. It would NOT be what it is today.


I am not comparing the aims or the essence of the service. I am expressing the opinion that social networks can be a paid-for service just like email, music or video can.


Right, but the problem here is that social networks need to grow exponentially. Without massive growth, they are useless, because you're not connected to anyone else. Email works, because everyone has email, mostly because it's free. Which leads to fast adoption.


I'm actually not that optimistic, but it seems like after nearly two decades of declines due to the internet, serious news outlets are getting more people to pay for online newspapers

https://techcrunch.com/2017/03/04/why-newspaper-subscription...

Maybe it will take another 10 years, but this gives me some hope that as consumers become more sophisticated (and jaded towards new trends), there's a chance society at large will be willing to pay real money to sidestep the worst side effects of ad-driven products.


Someone has to pay for your service. This can either be your users or a 3rd party. We know the overall acceptance of paying for online services is fairly low. Even high quality news outlets have problems monetising their content. In this case you're also competing with "free" services like FB. Unless you can offer something that is enough of an incentive for people to pay, this is not a battle you can win. What's more, social networks live and die by the network effect. People will be even less likely to pay for your service if there is no one already on it they know. In the end, you are choking off your growth through the higher barrier of entry before you can even get to the phase of a positive feedback loop in user numbers.

This only leaves the 3rd party option. But someone like that will always want something in return, and the only thing that you have to offer is access to your users and their data.


I don't think that's true. If you were constrained in how you use user data, you could make money in other, more difficult ways.

Monetize photo services, music, sell access to the audience for applications or monetize the marketplace.

Pillaging users is easy. But if regulators prevented it, Facebook would still be a viable business -- maybe even a better one.


I would argue that the selling of data is the second most destructive byproduct of social media, the first being dehumanization becoming part and parcel of online communication, and its bleeding into offline life.

I believe the reason that extreme ideologies on both sides of the spectrum have gained a foothold in recent years is because their ideas lend themselves to the hyperbolic, binary patterns of online discussion.

We should put "social media" to pasture and try to come up with something better.


I personally believe that social media may be one of the best use cases for blockchain technology. I think that each node "paying" for their account with some sort of resource usage could be sustainable, but I am not a blockchain expert. It would be interesting to see if something like that could work without turning into an ICO money-grab.


Is there any evidence to suggest that it's even possible to combine "social media" with "business model" and not end up with exactly these sorts of issues? Users do not appear to have any appetite for pay-to-play social media offerings, which leaves advertising as the primary revenue stream.


I've been thinking about this quite a bit, but do you think anyone would pay for social media where they can actually own the data? I think the use case might be difficult given the convenience and 'free' aspect of Facebook, Twitter, etc.


If you're not paying for anything, you're not the customer.


The problem obviously is competing with free...although, maybe we are beginning to realize that there are hidden costs to "free"...


I really want to see diaspora* taking over and the beginning of the decentralization era.


>the decentralization era

do you have any reason to believe that this will be "a thing" beyond highly obscure internet subcultures like HN and reddit?


I wouldn't hold my breath. The project has been out for ~8 years and has failed to gain any traction.


my circle of friends has started feeling it out in response to the recent facebook fracas. I recommend other people start feeling it out, too.


How does that compare to say Mastodon?


Mastodon is to diaspora* as Twitter is to Facebook.


For what it's worth, the most private data here is shared with analytics companies for Grindr's own analytical use. My guess is that Grindr's agreement with Apptimize and Localytics asks for the strictest possible protection of that data. If anyone at Apptimize or Localytics has access to that data, I'd be incredibly surprised.

This sort of deal isn't the same as sharing the HIV status to Google or Facebook so that advertisers can target or exclude that user information for the purposes of advertising.

For people who think this is still wrong, I'm curious what their pragmatic alternative is. How else are app developers supposed to analyze their app performance? The open source, self-hosted pickings are slim. (I can only think of Piwik, which in my experience has a dated feature set and severe performance issues.) Not everyone can afford to perform their own product analysis. Using a third-party analytics SaaS is kind of the only way to go and seems like a reasonable tradeoff of security for product visibility.


As someone who has been working in security for a long time, and has seen how the sausage is made at even the biggest, most reputable companies who “take security very seriously”, the “strictest possible protection of that data” means approximately nothing. The only serious way to protect sensitive data is not to take it in the first place. Hell, not even the NSA can keep a lid on their sensitive data.

”For people who think this is still wrong, I'm curious what their pragmatic alternative is. How else are app developers supposed to analyze their app performance?”

Remember, customers first, your “needs” come second. That goes double when they are placing their trust in you by allowing you to be a custodian of their data.

Not long ago, desktop software phoning home would have been a scandal. Not long before that, it was offline and couldn’t phone home. Yet, we still had software. Unfortunately, developers have taken the slippery slope all the way to outright abuse of their privileges in order to collect information that customers don’t know about or understand. This has led us to things like GDPR. It doesn’t matter if your intentions are good or your usage is benign. It isn’t yours to begin with, those aren’t your decisions to make, and developers need to learn to seriously respect that.


1) At least don't send any personal data over HTTP. It's 2018, for fuck's sake. I can't believe there are companies out there with such a hand-wavy approach to this. Is it so hard to do HTTPS in this day and age? It's so basic with respect to a security audit, my head hurts. The fact that some of the data is sent over HTTPS shows that they made an active decision to partition this data into non-important/important.

2) Just don't fucking send it to a third party. Every single time you do that, you yield control over the data and introduce another party to the mechanics, thus doubling the risk of disclosure, and then you cry 'breach of trust'.

> Not everyone can afford to perform their own product analysis.

Then don't do it and don't store sensitive information. You're taking on a risk, and if you don't have the money to roll your own analytics then you probably don't belong on the market. This is no longer a playground, this is the real world, especially for this kind of information. People can get killed based on Grindr leaks. It's the big boys' game, and if you don't have the backing, you shouldn't play in the first place. And this app specifically should not have any problems with funding, give me a break.


Why do you need to share HIV status as part of performance analysis of a web portal or app?


So it looks like the requests go to profile.localytics.com which is the API used for https://www.localytics.com/profiles/.

So not used for performance, but instead "A people-centered and personalized approach to app marketing and analytics". I am not sure if this is better or worse.


Drug marketing, supplements? I'm sure there's stuff you can sell more easily to HIV-positive people.


> My guess is that Grindr's agreement with Apptimize and Localytics asks for the strictest possible protection of that data. If anyone at Apptimize or Localytics has access to that data, I'd be incredibly surprised.

Honest question: are you in the SaaS analytics industry, or is there anyone else here who can comment on this? I am not (though I do do data work), and I would actually be surprised if the SaaS company _didn't_ have access to the data.

That would require some kind of dedicated setup so that Grindr's data was not at rest with other companies' data, which is a) super expensive, b) no reason to expect that the SaaS company would not have access for maintenance/troubleshooting anyway, and c) kind of defeats the purpose of using SaaS.


For startups of their sizes, it's unlikely they have strict data controls. So, probably anyone working on the product side of things, support, engineering, services, has access to their analytics data. Basically, most of the company likely has access to that data. Grindr really shouldn't be sending that data to their analytics providers.


They have the option of not sending HIV status to any third party.


Why would you ever default to an opt-out for that information? That's like saying "people should read the contracts" while waving about a 10,000-word EULA in 6-point type, or burying an option checklist so deeply that most users don't even know it's there.

“But the plans were on display…”

“On display? I eventually had to go down to the cellar to find them.”

“That’s the display department.”

“With a flashlight.”

“Ah, well, the lights had probably gone.”

“So had the stairs.”

“But look, you found the notice, didn’t you?”

“Yes,” said Arthur, “yes I did. It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard.”


I don't think the distinction between "third party service" and "hosting company" is all that clear. You're sending data to a third party service when you host an app on AWS. The only data protection you have is contractual.


> You're sending data to a third party service when you host an app on AWS.

Amazon neither receives nor requires access to the raw underlying data (in this case: data in your database indicating HIV status, or decrypted bodies of requests sent over TLS indicating same) when you host your web services on AWS. While, yes, it's possible for a dedicated attacker to intercept and snoop on this data, it's (a) not easy, and (b) very much outside the bounds of the scope of the relationship you have with them.

Contrast to the setup described here, where the third parties in question both received and required access to the raw underlying data in order to perform the services they were explicitly contracted for.

You may not think this is an important distinction, but legally, it is, and it makes a world of difference.


I honestly don't get the distinction you're making here. I understand how people _can_ use AWS without ever letting sensitive data touch their disks, but most apps hand everything over wholesale (and frequently in a nicely structured format on RDS).

The legal distinction you're making doesn't sound right to me. Contracts with companies that access your data aren't usually about whether or not an attacker can get at it, but about what kind of access an employee of the service itself has.

Amazon _technically_ has complete access to your data when you run on AWS, but they're contractually limited in how they can use it. The same goes for third party SaaS services. The major difference is "who writes the logic".

But I'm not a lawyer and won't ever have to argue that somewhere it matters.


Amazon is selling an abstraction, and goes to great expense to not have access to customer data. If you are a HIPAA-covered entity, they sign a BAA that puts them on the hook.

It's like the difference between putting your papers in a storage locker versus your friend's garage. The storage company ultimately has access to the locker, but is less likely to snoop (either consciously or accidentally) than any of the folks with access to that garage.


But you've just described a contractual agreement. You're still sending data to a third party. I'm not sure we're disagreeing here.


Would this be a better way to distinguish?

AWS does not care about the data, does not want to see the data, and goes out of its way to make it damn hard for itself to see the data. The data is a black box to them, and this is by design. You are not sending them the raw data in a format they require for analysis. You are just sending them bits and bytes that they store for you.

The analytics third parties in this case are the exact opposite. They explicitly require access to the data in a certain format for analysis. In fact, their business fails if they don't have access to this data.

They are both technically third parties but the way they handle the data is completely different. One has every incentive to avoid reading the data, the other has every incentive to hoover everything it can.


I just don't think that's a meaningful distinction. There's no distinct line between "company that hosts all your data but doesn't analyze it" and "company that does data analytics on your data". It's a gradient, there are all kinds of companies that fit on that gradient, and it's weird to lambast people for using those companies as if it's a technical choice, when what we really want is people making good choices about the data protections their providers have in place.

AWS even has analytics products that require access to your data. I generally trust those more than sketchy analytics companies, but it's entirely because of the contractual protections AWS has in place, not because they're inherently different.


What is the privacy distinction between a third party with a contractual agreement and an employee with a contractual agreement?

Remember that Russian intelligence got a spy hired by Microsoft: https://www.theguardian.com/technology/2010/jul/14/russian-s... Will your interview questions find a foreign spy, or someone who isn't even a spy but is interested in looking at private data for personal amusement?


If Microsoft implemented proper security policies, I imagine that guy didn't have access to all of Microsoft's user data.

So that would be the main difference. Virtually all of a company's employees shouldn't have access to user data at all, and those that do would only have access to parts of it.


>If Microsoft implemented proper security policies, I imagine that guy didn't have access to all of Microsoft's user data.

This is precisely the sort of thing Microsoft takes incredibly seriously internally. Tim Cook may be a more vocal spokesman for treating user data with care but Microsoft is fanatical about it internally. They recognize the risk they face in the event of compromise and have made just enough mistakes in the past to appreciate how hard it is to actually protect their customers’ information.


This might be a naive view, but I think companies that are good at this sort of segmentation will also be good at picking trustworthy third parties and limiting (both technically and legally) what they can do, and conversely, that companies that just send a bunch of sensitive data to third parties out of laziness have no meaningful internal controls either.

"Avoid third parties" is an occasional effect of conscientious care of data, not a cause of it.


What?

How about "lets just not spend medically sensitive information to third party services"


That seems reasonable, but can we also say, let's not send medically sensitive information to every employee at the first party?


You could not fill out that part of the profile.


> Not everyone can afford to perform their own product analysis.

Just because ethical behavior is expensive doesn't mean you have a license to do whatever you want.


I don't think you'll find total agreement on what behavior is considered ethical, particularly in this product space.


The reasonable tradeoff, in this case, would be to continue using the third-party analytics SaaS, but exclude personally identifiable information, or at the very least, exclude this extremely sensitive information.
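
One way to implement that tradeoff, as a minimal sketch in Python (hypothetical field names and a generic analytics-client stand-in, not any particular vendor's SDK): allowlist the event fields that may leave the first party, so sensitive profile attributes are dropped by default rather than by exception.

  SAFE_FIELDS = {"app_version", "device_os", "screen_viewed", "session_length"}

  def scrub_event(event: dict) -> dict:
      # Keep only explicitly allowlisted keys; everything else is dropped by default.
      return {k: v for k, v in event.items() if k in SAFE_FIELDS}

  def track(analytics_client, event: dict) -> None:
      # Hypothetical client object; the point is that HIV status, tribe, etc.
      # never reach the wire because they were never allowlisted.
      analytics_client.send(scrub_event(event))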


I think it depends on your reasoning for not sharing data to third parties.

It seems like you're arguing that sharing data is wrong because, in the wrong hands, the data could be used to personally identify someone. In my mind, these are the ways that can happen:

1. The data is sent to an advertiser who can target based on that data. Seems possible, so it's relevant that this data isn't being shared with an ad firm.

2. The data is sent to a third party, whose employees can access and leak the data.

3. The data is sent to a third party, whose data gets compromised.

So the trade-off is: what is the value of having user information in a tool for analytics purposes, versus the chance that (2) or (3) (or any unknowns) happen? My argument is that analytics firms are not in the business of leaking or selling data; their business hinges on their clients' data privacy. So to me, this seems like a reasonable trade-off for certain types of data.

As for whether HIV status is the type of data that's unreasonable... I can buy that argument either way. I've never used Grindr but I can imagine it being extremely relevant to its users. And any data that has product impact is useful in an analytics setting. For example, if Grindr has some features that make it easier for HIV-positive or negative people to filter, then they'd be interested in understanding whether it's being used in the product. Then again, I can equally see them deciding it's not worth the risk, and removing it.

If you think sharing sensitive data is wrong under all circumstances, on principle, then you're entitled to your beliefs, but that would seem to me awfully close to religion.


> How else are app developers supposed to analyze their app performance? The open source, self-hosted pickings are slim

There is Countly Enterprise Edition for this purpose (both mobile and web). Privacy-focused, with on-prem installation.


This type of data is already protected under HIPAA and HITECH. Expand those protections to cover non-providers.


(Copy-pasting the message I already posted in this thread. Seems more relevant here)

I think that adoption of privacy preserving data aggregation/analysis will become the norm. The most immediate applications are 1) telemetry data that is used for monitoring (for example, Google Chrome uses differential privacy for collecting this data), and, 2) services like Google Maps and Tinder-like dating apps. In these applications, essential user information can be represented as integer/boolean values (is user present in location X? True or False. how old is the user? device CPU usage right now? ...)

Based on my limited understanding* of differential privacy, it falls short on exactness (of aggregate values) and robustness (against malicious clients). I've lately been studying the literature on function secret sharing and I think it is a better alternative to DP. Take this paper: https://www.henrycg.com/files/academic/pres/nsdi17prio-slide....

Prio: Private, Robust and Scalable Computation of Aggregate Statistics

Data collection and aggregation is performed by multiple servers. Every user splits up her response into multiple shares and sends one share to each server. I've understood how private sums can be computed. Let me explain it with a straw-man scheme.

Example (slide 26):

x_1 (user 1 is on Bay Bridge):- true == 1 == 15 + (-12) + (-2)

x_2 (user 2 is on Bay Bridge):- false == 0 == (-10) + 7 + 3 ...

If all users send shares of their data to the servers in this manner AND as long as at least one server doesn't reveal the identities of the people who sent it responses, the servers can exchange the sums of the shares they've received. Adding the three sums will allow the servers to infer that there are _ number of users on Bay Bridge without revealing their identities.
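
A minimal sketch of that straw-man scheme in Python (plain additive sharing over small integers, with none of the modular arithmetic or SNIP validity proofs a real deployment would add):

  import random

  def split_into_shares(value, n_servers=3):
      # Split a 0/1 response into additive shares that sum back to the value.
      shares = [random.randint(-20, 20) for _ in range(n_servers - 1)]
      shares.append(value - sum(shares))  # last share makes the total come out right
      return shares

  # Each user sends one share to each server.
  responses = {"user_1_on_bridge": 1, "user_2_on_bridge": 0}
  server_totals = [0, 0, 0]
  for value in responses.values():
      for i, share in enumerate(split_into_shares(value)):
          server_totals[i] += share

  # The servers only ever exchange their running totals; summing those totals
  # reveals the aggregate count (here: 1 user on the bridge), not who answered what.
  print(sum(server_totals))  # -> 1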

This system can be made robust by using Secret-shared non-interactive proofs (SNIPs). This allows servers to test if Valid(X) holds without leaking X.

The authors also bring up the literature on computing interesting aggregates using private sums: average, variance, most popular (approx.), min and max (approx.), quality of regression model R^2, least-squares regression, stochastic gradient descent.

Bottom line: I found the discussion on deployment scenarios very interesting. Data servers with jurisdictional/geographical diversity, app store-app developer collaborations for eliminating risk in telemetry data analysis, enterprises contracting with external auditors for analyzing customer data, etc.

* - I understand the randomized response and, to some extent, the RAPPOR technique (used for collecting Chrome telemetry data) but the other literature in that community goes over my head.

* * - This technique is a black box to me at the moment.


>>For people who think this is still wrong, I'm curious what their pragmatic alternative is.

Use the services you mentioned but DO NOT SEND HIV DATA TO THE ANALYTICS COMPANIES. Holy hell, how hard is that? Just omit that part.


Could you please not use allcaps for emphasis in HN comments?

This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.


Or just don't fill out that part of your profile.


If one is HIV positive it would probably be a draw of the app to find only others who are also afflicted. Turning it off might result in some illegal decisions.


Grindr wouldn’t be the only way to declare one’s STD status though. It could be omitted from one’s profile, but declared during chat, for example, or prior to hooking up.


Methinks you don't understand the problem space. Putting it in your profile is intended to save the afflicted from wasting tons of time and energy on a. talking with people who will immediately nope out when they learn your status and b. dealing with a lot of emotional BS from people who want to see themselves as nice but who aren't really ready to deal with you and your situation.

That can be a hard enough conversation to have even if they know. Being straight up rejected by some high percentage of people who started to chat you up and are done the minute you mention HIV would be a dreadful experience. It's possible they are on the app precisely for the ability to pre-screen people for their willingness to hook up with someone HIV positive.


Given the current reality, you can do either of the following here:

A) Mark yourself as HIV positive in your profile, which Grindr will share with third parties.

B) Directly declare yourself as HIV-positive at some point in a conversation.

I’m not suggesting there are options without drawbacks.


I imagine a lot of folks declared it in their profile without knowing it would be shared and they probably declared to try to have a more positive experience over having one dreaded discussion after another. I think those folks have reason to be upset.


I agree that they have every right to be upset. Grindr didn’t need to share that kind of information with third parties.


Illegal how?


Presumably illegal in the sense of https://en.wikipedia.org/wiki/Criminal_transmission_of_HIV (and note that in some jurisdictions "exposure", not just "transmission" is criminalized, per that wikipedia article).


Except you can decide if you want HIV status to be displayed and the last time I mucked with my profile, it was optional anyhow.

It's not even clear if what's being sent to the analytics companies is what's displayed in the profile, or all fields even if they are not displayed.


I believe in certain places that if you withhold the information of being HIV positive from someone you have sex with and infect them, it's a pretty serious crime. I think it might even be a crime if you don't infect them.


Depends on the state. The majority of states either have laws relating to disclosing known STDs or laws specifically about disclosing known HIV status, or both.


Or don't use Grindr.


I used to be a data engineer at an ad tech company, Blis. A huge proportion of the GPS data we relied upon for retargeting and enrichment of the bids came from Grindr, but even so we almost never bid on traffic from them; the brands we worked with were opposed to being associated with that app. So we benefited a lot from Grindr data without giving much back.


Grindr has health-related data and shares it... And I guess they have some European customers, right? Might be a really nice case for GDPR in two months! :-)


It's also 100% owned now by a Chinese software company, so might as well assume everything you share there is visible to the Chinese gov't while you're at it.


It's scary that it doesn't surprise me anymore.

Social networks especially are considered the most lucrative in terms of targeted marketing and data mining, and it's obvious why. Social networking remains a big deal; it's almost mandatory to have some social networking footprint online, or else you miss out on social life. Why is it still OK to trade data distilled from social media accounts? It's not! Some of the many reasons and implications are in that article.

Is independent social media possible? How to fund basic service infrastructure if not by running online ads, or trading user data? Is decentralized social media feasible, and who maintains a decentralized service if it is?

EDIT: If an app developer wants to analyze how the app performs, why share the most intimate user data with third parties, Facebook being one of them?


> How to fund basic service infrastructure if not by running online ads, or trading user data?

Subscriptions, combined with community awareness advertisements showing where users' money is going to (and where it is not going to) and why it benefits them.


Yes, that's usually how it works with Netflix, Spotify, and similar single-purpose services. But social media? If you treat a social network as a single-purpose service, then there is no room for several or dozens of similar services, because it doesn't make sense as a social network if you cannot interconnect them.

Suppose you pay a subscription for one social network; you'll then also need to pay for the other one to get connected to the people on it! It would need to be one big single social network where everyone pays for one subscription. Anything else doesn't make sense, but a single global social platform is unthinkable. So, basically, subscriptions don't make sense for social networking.


I don't understand your point here. Just like people pay separate subscription fees for Netflix and Hulu, which essentially provide the same service with different content, why wouldn't people pay separate subscription fees for Twitter and Facebook, if it were an option to avoid ads?


Because, in the end, it would cost a fortune? Consider that most people only pay for one content provider.

A social network is not a content provider where each service is just a different catalog of content. If you cannot afford a subscription to some social network, you miss out. And that's not acceptable in terms of a human being's social life. Rich folks would be able to roam all networks; poor people wouldn't. So, if you can only afford some "cheap" networks, the whole thing ceases to make sense as soon as you are barred from accessing other networks. Exclusiveness in social media is an oxymoron. See where this is going?


I think that adoption of privacy preserving data aggregation/analysis will become the norm. The most immediate applications are 1) telemetry data that is used for monitoring (for example, Google Chrome uses differential privacy for collecting this data), and, 2) services like Google Maps and Tinder-like dating apps. In these applications, essential user information can be represented as integer/boolean values (is user present in location X? True or False. how old is the user? ...)

Based on my limited understanding* of differential privacy, it falls short on exactness (of aggregate values) and robustness (against malicious clients). I've lately been studying the literature on function secret sharing and I think it is a better alternative to DP. Take this paper: https://www.henrycg.com/files/academic/pres/nsdi17prio-slide....

Prio: Private, Robust and Scalable Computation of Aggregate Statistics

Data collection and aggregation is performed by multiple servers. Every user splits up her response into multiple shares and sends one share to each server. I've understood how private sums can be computed. Let me explain it with a straw-man scheme.

Example (slide 26):

x_1 (user 1 is on Bay Bridge):- true == 1 == 15 + (-12) + (-2)

x_2 (user 2 is on Bay Bridge):- false == 0 == (-10) + 7 + 3 ...

If all users send shares of their data to the servers in this manner AND as long as at least one server doesn't reveal the identities of the people who sent it responses, the servers can exchange the sums of the shares they've received. Adding the three sums will allow the servers to infer that there are _ number of users on Bay Bridge without revealing their identities.

This system can be made robust by using Secret-shared non-interactive proofs (SNIPs). This allows servers to test if Valid(X) holds without leaking X.

The authors also bring up the literature on computing interesting aggregates using private sums: average, variance, most popular (approx.), min and max (approx.), quality of regression model R^2, least-squares regression, stochastic gradient descent.

Bottom line: I found the discussion on deployment scenarios very interesting. Data servers with jurisdictional/geographical diversity, app store-app developer collaborations for eliminating risk in telemetry data analysis, enterprises contracting with external auditors for analyzing customer data, etc.

* - I understand the randomized response and, to some extent, the RAPPOR technique (used for collecting Chrome telemetry data) but the other literature in that community goes over my head.

* * - This technique is a black box to me at the moment.
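
For contrast with the share-splitting scheme above, here is a minimal sketch of plain randomized response, the simplest local-DP mechanism mentioned in the first footnote (RAPPOR layers more machinery on top). It also illustrates the exactness/robustness point: the server can only estimate the true rate, and a lying client simply shifts that estimate.

  import random

  def randomized_response(truth, p_honest=0.75):
      # With probability p_honest report the truth, otherwise report a fair coin flip.
      if random.random() < p_honest:
          return truth
      return random.random() < 0.5

  def estimate_true_rate(reports, p_honest=0.75):
      # Invert the noise: observed_rate = p_honest * true_rate + (1 - p_honest) * 0.5
      observed = sum(reports) / len(reports)
      return (observed - (1 - p_honest) * 0.5) / p_honest

  # Simulate 100,000 users, 30% of whom are truly "on the bridge".
  reports = [randomized_response(random.random() < 0.3) for _ in range(100_000)]
  print(estimate_true_rate(reports))  # roughly 0.30, but any single report is deniable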


This requires trusting at least one of the servers to store your raw data, correct? Local differential privacy doesn't.

You might imagine having servers in different countries so that the whole system can't be subpoenaed by any one country, but I believe this is a legal gray area that becomes illegal if you say you're doing this to be intentionally subpoena-proof (in the US).


>>> becomes illegal if you say you're doing this to be intentionally subpoena-proof

I didn't know that. Hmm...


I think we all agree on how stupid it is to track all the little details (including positions, HIV status, etc.) for the sole purpose of making money, but I would like to underline that there are only two reasons not to use HTTPS today: you're stupid or you're lazy.

Can't tell which is worse, but I can tell you that users should completely delete their Grindr accounts, now.


Also, many mobile users name their device with their full name, effectively deanonymizing all their app usage for the massive ecosystem of marketing companies out there. Having worked in the mobile marketing industry, I was shocked at how many people were doing this and probably had no idea this was the case.


Is that an iPhone thing?

As far as I know, it's not possible to set the hostname of your phone on most Android phones, but it's unique.

Of course, with Android making build versions etc. available to devs (and, in the case of Chrome, to your user agent), you're pretty much identifiable.


Oh, but it is possible. It's buried in the networking settings but there's a phone name option which is used for just about everything.

Although on my LG devices it defaults to the series name, for example G3 for an LG G3.


This is deeply troubling. Anyone who uses Tinder or any other dating site should try requesting their data and realize that these services could likely label you a sexual deviant, racist or otherwise based on your swipes alone.



OkCupid may be even more dangerous in that regard. As far as I know they allow you to set ethnicity preferences.


Most dating services do. Why shouldn't they?


I think the reasoning is innocent; the problem is that if it gets leaked, you have people (probably incorrectly) interpreting things using the data, which is half the problem.

"Your ethnicity preference is white? You goddamn racist".

This gets worse when those interpreters are in a position of power.


They probably don't even know. The AI that learned on the data doesn't necessarily have labels; it just connects the dots.


1) it's all in the terms of service. Idk why anybody is surprised. They own everything you enter into the app anywhere full stop.

2) it's not going anywhere. It's the gay Facebook. It has monopolized the market of an already vulnerable demographic, so they can do whatever they want and still charge an extraordinary amount (almost $20 per month??) and provide no customer service.

The app doesn't even function as advertised (at least on Android). Push notifications and read receipts have been broken for years. Btw if you restrict the permissions of the app they permanently change your status to offline.


I am shocked, truly shocked at this development. An app that collects user data and passes it on to third parties without users’ consent? Unprecedented!


I remember this paper on ad intelligence I read a few weeks ago: "Exploring ADINT: Using Ad Targeting for Surveillance on a Budget — or — How Alice Can Buy Ads to Track Bob".

https://adint.cs.washington.edu/ADINT.pdf


ISTM that "poz" "tribe" is largely equivalent to a positive HIV status?

If they're this sloppy when the client device is on one end of the connection, how sloppy are they once the data is on their end and we can't see what they're doing?


These are 3rd party analytics firms, not random companies. Both these firms have strong data protection processes and are very secure. From Grindr's perspective, they are probably looking for analytics on different segments of their users and send all the data to Localytics, who help them with this (vs. trying to build this internally).

Here is a thought: do we think that the data is more secure with Grindr itself or with Localytics? I feel the answer might be the latter, given that data security means a lot to Localytics (as they provide analytics as a service to thousands of apps) vs. Grindr itself, which may not go to the lengths Localytics does to safeguard user info.


The data is already in Grindr’s systems; this means it is also at Localytics (and others). This is not as safe as if it were only in Grindr’s own systems.


And any of these companies could be purchased by some other entity and change their data privacy policy.


This is not really true. These companies' entire model is handling data, and they charge their partners for it. It is not a free product. As such, going off and selling the data is not really going to happen.


Huh? That has nothing to do with what I said.

I said if any of these companies get acquired, the acquiring party can do whatever the hell they want with that data, previous privacy policy be-damned. This has already happened. A lot.


It really depends on how you, as the entity supplying data to the analytics firm, have negotiated your contract with them. Just saying an acquisition will completely negate all prior contracts is incorrect.

Also, these agreements are between 2 companies (the company providing the data and the analytics firm which will show analytics on that data). There is no privacy policy per se in this case. It is all about contract negotiation on the rights you give to the analytics firm on how to share the data and what happens if they get acquired.


Fair point. My point is that Grindr's systems are likely more vulnerable than Localytics', and thus that is where the soft spot exists if you are a hacker trying to get the data.


It's become clear over the last year there is a strong need for a data privacy regulatory agency in US government. I understand that regulation hampers growth, but the tech industry is mature and developed to the point that it's time to reel in "moving fast and breaking things" a bit.


> It's become clear over the last year there is a strong need for a data privacy regulatory agency in US government.

I wouldn't trust a governmental regulatory agency to aggressively fulfill its mission. My impression is that in general they're too much at the mercy of politicians.

I suspect a more effective strategy is to enact legislation that makes companies liable under civil law, with private citizens empowered to sue.


> I wouldn't trust a governmental agency to aggressively fulfill its mission. My impression is that in general they're too much at the mercy of politicians.

I think this sort of attitude is a big problem - instead of throwing our hands up and deciding that regulatory agencies have always been and thus will always be toothless, we need to give said agencies more power to enable them to actually enforce the rules. Right now, companies regularly weigh the costs of compliance against the costs of non-compliance because to a multinational corporation, the usual fines and punishments just don't hurt that much.

Companies that break the law should regularly never, ever recover financially.


> Companies that break the law should regularly never, ever recover financially.

Personally I would have this reworded to "Companies that regularly break the law should never, ever recover financially."

As much as I believe things are tilted too far in favor of companies these days, it is true that over-harsh regulation on business can be very bad — particularly when it becomes harder to start new businesses.

But I agree with your sentiment. To a large degree the government is as good as we make it.


This is an important point, I think. If our ultimate goal is to incentivise good practice, we have to deal with a range of problems from simple ignorance of good practice or what the law requires through to gross negligence or "wilful ignorance" situations. If you have a business that is acting in good faith but makes an error in judgement or isn't aware of some specific regulation, there is no sense taking a punitive stance. Obviously if their actions have caused damage to another party then compensation may be appropriate, but otherwise constructive engagement is likely to work best. On the other hand, if you have a business that is knowingly and deliberately acting in bad faith, there may be little point in being constructive, and the penalties need to be significant enough to force them to change (or their business to fail).


> I wouldn't trust a governmental agency to aggressively fulfill its mission. My impression is that in general they're too much at the mercy of politicians.

It is good when a governmental agency is at the mercy of politicians. As the electorate, we are responsible for the politicians we elect.


> a more effective strategy is to enact legislation that makes companies liable under civil law

Agree. But there is still a role for a public prosecutor to (a) collect complaints and (b) launch suits on behalf of the public, for when violations occur and no private action ensues.


You're screwed either way. If a government agency enforces it, opposition politicians attack it. If courts enforce it, courts are attacked instead and opposition politicians push for changes in the law. The regulated industry will cheerfully spend shareholder money to drive public sentiment in a favorable direction. In US politics, money dominates everything else.


I'm not knowledgeable about how the US administration works, but I think it's a universal principle that when you write laws and regulations, it generally has to come with means to inspect and check that the rules are not violated.

It's especially difficult with anything digital. Do you really want to rely just on whistleblowers?


>I suspect a more effective strategy is to enact legislation that makes companies liable under civil law, with private citizens empowered to sue.

There's no chance that will happen. It should, but it won't. By taking away the ability of individual citizens to sue, and vesting that power in a government agency, it protects the companies while empowering the regulators. The end result is fairly toothless regulation and more donations/lobbying dollars flowing from the private sector to the public sector/politicians.

A number of anti-spam laws ended up this way.


> regulation hampers growth

Is there proof of this? Some countries, like Germany and Sweden, are both highly regulated and have some of the fastest economic growth in the West.


Bad regulation hampers growth, good regulation induces growth.

As I see it, the problem in the U.S. is that the current political system is incapable of producing good regulation, partly because of the deeply flawed voting system and partly because of a collective belief that government cannot be effective (producing that exact outcome).


Who's growing is the more important question. Regulation benefits the already big.


Some regulation is bad, but there are plenty of good examples. Regulation on standards like GSM and the use of SIM cards helped new entrants into the European markets. Net neutrality, standardized electrical systems, etc. have benefited everyone, large and small. And of course, anti-trust regulation in the EU has not benefited Microsoft or Google.


> standardized electrical systems

Except for the damn plugs!


That's a truism that I don't think is accurate. You're assuming all regulation is complex and that this directly drives administrative costs in a linear fashion.


> Regulation benefits the already big

That generalization is pretty obviously wrong. The vast body of antitrust regulation is designed precisely to hurt the "already big". Regulations around net neutrality would also fall into the category of helping the little guy far more than a big player. Regulation certainly can hamper smaller businesses, but it really depends on how the regulation is designed.


> Regulation benefits the already big

"Regulation" isn't guaranteed to do anything. Some regulations benefit already-big, some harm them. Some have no effect at all.

Lack of regulation is guaranteed to benefit already-big, though. Unchecked corporations become oligopolies or monopolies.

Free markets (meaning free movement, not free of regulation) and consumer protection are impossible without regulations, though.


>Lack of regulation is guaranteed to benefit already-big, though.

Not necessarily; it depends on how educated market participants are. Shouldn't a world that becomes more and more informed, connected, and educated need less regulation?

>Unchecked corporations become oligopolies or monopolies.

But not forever; less regulation also means more opportunity for competitors. And educated customers would be aware of the situation and just ignore the monopolies if they could buy somewhere else.


Yes, all regulation hurts economic growth. The mechanism is simple: regulatory compliance is a cost. If it's a fixed cost like most regulation, it becomes a barrier to entry. If it's a variable cost, it reduces labor productivity.

HOWEVER, some regulation may improve net economic growth by improving the functioning of markets such that it overcomes the costs that compliance imposes. Therefore the discussion of regulation cannot be divorced from the quantitative and qualitative merits of the specific regulation proposed, and therefore must be considered on a case-by-case basis. Trying to prove regulation good or bad in aggregate is a fool's errand.


  Yes, all regulation hurts economic growth. The mechanism
  is simple: regulatory compliance is a cost.
I've heard it said [1] that banning cigarette advertising is good for cigarette companies, as previously they had to compete to outspend one another to avoid losing market share. In other words, legislation moved competitors from an expensive Nash equilibrium to a cheap global optimum.

Of course, this is a rare example, and a questionable one; it assumes tobacco advertising does not increase the total market for tobacco products. Given that tobacco companies lobby against such advertising bans, they clearly don't see such regulations as a net benefit for them.

[1] https://en.wikipedia.org/wiki/Prisoner%27s_dilemma#In_econom...
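
To make that concrete, a toy payoff matrix with made-up numbers: if advertising is costly and mostly shifts share rather than growing the market, advertising is each firm's dominant strategy, even though both would prefer the world where neither can.

  # Hypothetical profits (firm_a, firm_b); advertising costs 30 and mostly
  # steals share rather than growing the overall market.
  payoffs = {
      ("advertise", "advertise"): (70, 70),
      ("advertise", "abstain"):   (110, 40),
      ("abstain",   "advertise"): (40, 110),
      ("abstain",   "abstain"):   (100, 100),
  }

  # Whatever firm B does, firm A earns more by advertising (70 > 40, 110 > 100),
  # so "both advertise" is the Nash equilibrium; an advertising ban pins both
  # firms at the (100, 100) outcome they could not sustain on their own.
  for b_choice in ("advertise", "abstain"):
      best = max(("advertise", "abstain"), key=lambda a: payoffs[(a, b_choice)][0])
      print(f"If B plays {b_choice}, A's best response is {best}")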


Most cigarette brands are owned by a couple of huge tobacco companies like Philip Morris (https://en.wikipedia.org/wiki/Philip_Morris_USA#Brands) and British American Tobacco (https://en.wikipedia.org/wiki/British_American_Tobacco#Curre...). They are barely competing, just trying to attract different crowds under different brands and to make a premium on luxury brands.


> all regulation hurts economic growth [...] some regulation may improve net economic growth

What definition of "economic growth" are you using for your first claim? It doesn't seem to match the common expectation. I'd assume that net growth is what we care about in the context of public policy.


>> Is there proof of this?

> Yes, all regulation hurts economic growth.

What about the proof of that? What basis are you making the assertion on?

It seems like some regulation can promote growth. For instance, regulation on product safety increases consumer confidence in the safety of those products and creates a broader market for them.


Upon careful reading of the comment to which you are responding, I found that it neatly addresses very precisely the keen, sensible, and reasonable point you have raised. Perhaps you have had an experience significantly different from mine?


That comment assumes we can all agree upon a textbook definition of 'economic regulation'. An economic regulation is generally defined as any legislation or administrative act that impacts either pricing a good or entering a market. If you follow that strict, conventional (in economics circles) definition, it's tough to argue with the comment.

Outside of textbooks, economic regulation has a much bigger meaning such that a statement like that would be highly confusing and inflammatory. Outside of textbooks, economic regulation can refer to anything from health/safety legislation to antitrust legislation to taxes/tariffs/freer trade. Economic regulation = increased costs only works if you use a conventional definition of regulation...


I understand where you're coming from and what you mean. I even understand why some people, reading the original comment, would stop after the confusing and inflammatory statement and not finish reading the rest of the fine comment.

It's possible that such people might be best served to read the rest of the comment and cogitate for a moment before objecting.

In this context, I found that the parent comment covers health, safety, anti-trust, and more under the umbrella of improving market function. I understand that others may have arrived at a different understanding of the comment.


That's a blanket statement without any evidence.

It's like saying static checking always hurts productivity because you need to comply with the compiler. What if complying with regulations reduces costs? Why does it have to necessarily increase them?


Your analogy is spot on, because both are true. Regulation increases costs, and it may also reduce costs. Static checking increases costs, and it may also reduce costs.

The whole point of what I wrote is that the net benefit depends on the tradeoff that is made between the two costs and that tradeoff will depend entirely on context and details, which excludes the possibility of coming to an aggregate conclusion (such as static typing is always better, or regulation is always good).

For example, I would never do exploratory data analysis using Scala, but I would also never write a production system in R. In each scenario it is useful to separate the various costs and benefits and weigh them against each other, as opposed to taking an ideological stand like "STATIC TYPING OR DIE".


This guy economics.

But seriously I'm tired of so many armchair Libertarians spouting their one semester of economics understanding. Most of them never read past your first statement.


I'd point out that there are three armchair points of view possible here which are, in my experience, equally tiresome:

(1) All regulation is bad.

(2) All regulation is good.

(3) All regulation which is net positive is good.

(3) may be more complex, but it can be an armchair position because it excludes the political and moral questions of who wins, who loses, and who decides.

Importantly, (1) and (3) can also both be non-armchair points of view. It really depends what arguments are made in their defense. A libertarian, fully cognizant of the points made by the GP, can still believe (1) if his basis is moral grounds. Likewise (3), if it is defended on utilitarian grounds. Sure, if a libertarian argues (1) on an economic basis, it suggests a certain ignorance we could call 'armchair'.

But I would argue that we largely ignore moral arguments. So, even a libertarian arguing (1) from an economics standpoint may actually be doing so not because he believes it, but because he believes his audience will reject his argument from his core belief; and I would be hesitant to call him 'armchair' (even though he is annoying) just as I would be hesitant to call a utilitarian 'armchair' if he avoids the political questions. Both can be wrong without being wrong.


I wonder how these things were handled before. Maybe there was a bit of data sharing, but at such a small scale and speed that it didn't really matter.


This shouldn't be in the hands of a regulatory agency. Serious privacy violations should be legislated and treated as outright crimes, not just regulation infringements.

Violating basic civil rights should not just result in the off chance of having to pay a fine. People like Zuckerberg belong in jail way more than the average drug dealer.


> This shouldn't be in the hands of a regulatory agency. Serious privacy violations should be legislated and treated as outright crimes, not just regulation infringements.

HIPAA is largely in the hands of HHS as a regulatory body, and yet violations of its privacy mandates (including those HHS is empowered to detail by regulation) are “outright crimes” as well as potential civil offenses, not some distinct and lesser category of “regulatory violations”.

Your argument seems deeply grounded in an incorrect assumption about what it means to involve a regulatory agency.


Why wouldn't HIV status be protected by HIPAA?


HIPAA only covers health care providers who bill for services, such as a doctor’s office or insurance. But there might be other laws that apply.


> Why wouldn't HIV status be protected by HIPAA?

Because Grindr is not a healthcare provider (doctor, nurse, hospital, etc.) or a health insurer. HIPAA doesn't apply to social networks, or otherwise (e.g.) Twitter would be liable if you wrote a post announcing your own HIV status on their service.


It depends on the entity that's sharing being a "covered" entity.


They also scrape your clipboard aggressively...


Source?


At least on iOS, proof by demonstration. Install the app, turn on clipboard sharing, copy something on your Mac, then open the iOS app. You'll see the "Pasting from your Mac" dialog pop up pretty frequently.


"Grindr's users may not be aware that they are sharing such data with them"

I believe that to be an understatement!


I see they have some instructions there for how they did it. Any chance anyone could make a small instructive tutorial, so we can start replicating this process for other apps as well?

Then we can put everything in a giant repo and make it publicly accessible information.


Looks like the repo got deleted. Can't find an archive.org version either.

According to a friend, an article he saw earlier also got pulled. Are Grindr attempting to do some damage control?


Image of the data structure.

https://i.imgur.com/hstbZio.png


Does anyone actually find this surprising? It's fairly normal to send user data to third party analytics providers. If you want to know which, check your terms of service.


"shares" sounds too friendly.


Please don't use allcaps for emphasis in HN comments. This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.


fixed.


Great, thanks. I've detached this subthread from https://news.ycombinator.com/item?id=16736341 and marked it off-topic.


None of that data seems to be "private" according to Grindr's privacy policy: https://www.grindr.com/privacy-policy


So there's lots of talk about how we're going to regulate/manage data protection going forward but what are we going to do about the stuff that is already out there? I mean HIV status is a pretty toxic thing to just be floating around. It doesn't seem that we can even be sure who has this data and who doesn't.



