There are sites (like wwe.com) where, after you have successfully located the preference to opt out of everything, it shows a "processing" screen that is stuck at 98% for about a minute. But "accept" is processed in an instant. Another dark pattern: showing that you have opted out, but noting in small print that some vendors cannot receive opt-out requests over HTTPS. By doing this they have successfully targeted security-conscious people. I know this is not something major, but still, how do these people sleep at night?
edit: I looked it up, and wwe.com uses TrustArc, which seems to be a shady org certifying privacy. Mired in controversy, they even settled a case with the FTC in 2014 for $200,000. I'm guessing that when push comes to shove and the EU actually decides to prosecute, they will pay a similar amount. I bet that amount is already on their books, set aside as "future risk management" or something like that. Just the cost of doing business.
> There are sites (like wwe.com) where, after you have successfully located the preference to opt out of everything, it shows a "processing" screen that is stuck at 98% for about a minute. But "accept" is processed in an instant.
How desperate does one have to be to work as a developer on projects like this?
You assume the people doing this see a problem with it. They could easily...
* just be nihilists and not give a damn
* think it's fine because they are encouraging people to do something they see as good
* think it's fine because "it's just ads"
* think it's someone else's responsibility to make decisions about ethics (as an engineering prof, I see this all the time: students who say engineers' jobs are technology, not ethics/morals)
> Think it's someone else's responsibility to make decisions about ethics (as an engineering prof, I see this all the time...students who say engineers' jobs are technology not ethics/morals)
The problem is that they're correct in a practical sense.
The company they work for might get fined, but all the engineers see is the performance review. This system encourages engineers to think of compliance and morality as someone else's problem because: it is.
Many will justify this as "if I don't do it, someone else will" and again: absolutely correct.
Pass a law where engineers themselves may face fines or jail time for implementing immoral code, and they will suddenly discover a keen interest in the ethics of what they do.
> The company they work for might get fined, but all the engineers see is the performance review. This system encourages engineers to think of compliance and morality as someone else's problem because: it is.
Or it lowers the cost, because at least some ethics are a luxury of the well-off. (Sure, worrying about feeding your family might not lead you to kill someone, but it might lead you to make it difficult for users to opt out of privacy invasion.)
Morality (definition left as an exercise) is something you have to be able to afford. Get hungry enough, get tired of sleeping on cardboard enough, get tired enough of restrictions imposed on you by society for stupid mistakes you made while out of your mind ... and morality is a luxury you can't afford. (In many countries, that luxury is permanently out-of-reach for the great majority.)
Now add to that the constant reminders that morality doesn't stop a lot of people at the top from immoral behavior. Banks, politicians, Wall Street, celebrities, billionaires, etc. are in the news all the time for pulling shady shit. When they get caught, they say "OOPS!" What they hear is "don't do that anymore where people can see you," and they're free for another round.
Many people tend to adopt behavior that's rewarded. It takes a strong moral compass; some people never got magnetized. Show people a society that rewards moral behavior (they exist) and they might go there. (Some countries have low recidivism rates because of how well they treat people in prison.) If people can afford to move, they might. Else they may just say screw it, XYZ throws toxins in the river, I'll do it too. Choice: track people or go shoot 'terrorists'. Hmmmm.
I see this argument a lot; I grew up in such an environment. But being moral is really not a luxury, because there are several very rich people who are immoral. Being moral is easier or harder depending on your means, but so is everything else. At the end of the day you have to think about what you are doing, and you, and solely you, are responsible for it.
Also, this whole argument distracts from the fact that this is TrustArc we are talking about. It's an American company with 340 employees and over $20 million in annual revenue. When a company out of India does this same thing, we can argue about the impact of means on morality.
You're assuming it's developed in the West. They could have just outsourced that work, too. Here in India, I don't see too many developers who give a damn about user privacy. From Privacy International's report on menstruation apps stealing data [1], two of the worst offenders (Maya and My Period Tracker) were from Indian companies.
This argument doesn't hold up if you look back at the engineering-ethics dilemmas of the past. If all those engineers could have easily gotten another job, why did they stay? Why did Boeing engineers not speak up? Why did Firestone engineers not speak up? The list continues. Not everyone has the same moral compass, and that is the bottom line.
The best part is that, once you've waited out the timer, it fails:
> This page transmits information using https protocol. Some vendors cannot receive opt-out requests via https protocols so the processing of your opt-out request is incomplete. To complete the opt-out process, please click here to resubmit your preferences.
Given that very nearly the entire software industry has either acquiesced to or actively abets practices like these, the real answer is generally "desperate enough to be seeking a job in your field".
I suppose someone could also structure the requirements for the task in such a way that they produce this result without ever explicitly demanding it.
Given that "accept" is our default, just send off the accepted token, show "OK", and don't wait for the response.
For opting out, send an array of opt-out info and wait until all the responses are OK, so you can tell the user that the opt-out worked. The user must see the "opting out worked!" message before going on to the next step, because for GDPR reasons we need to tell them if there is a problem so they can try again!
- But that means we'll be stuck at 98% for about a minute!
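The asymmetry those hypothetical requirements produce might look something like this (a minimal sketch; the vendor-call function, delays, and return values are all invented for illustration, not any real consent-manager API):

```javascript
// Simulated vendor call; assume opt-out acknowledgements are slow
// while "accept" needs no acknowledgement at all.
function sendToVendor(vendor, choice) {
  const delay = choice === "accept" ? 0 : 200;
  return new Promise((resolve) =>
    setTimeout(() => resolve({ vendor, choice }), delay)
  );
}

function accept(vendors) {
  // Fire and forget: don't await anything, report success instantly.
  vendors.forEach((v) => sendToVendor(v, "accept"));
  return "done";
}

async function optOut(vendors) {
  // Only report success after *every* vendor has acknowledged,
  // so the UI sits on "processing" until the slowest one replies.
  const results = await Promise.all(
    vendors.map((v) => sendToVendor(v, "optOut"))
  );
  return results.every((r) => r.choice === "optOut") ? "done" : "failed";
}
```

Under this design the stalled progress bar falls out "naturally": accept resolves synchronously, while opt-out is gated on the slowest of N third-party acknowledgements.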
>I know this is not something major, but still, how do these people sleep at night?
Easily and without issues. Humans are very good at making sure they do not feel themselves to be evil. A mass murderer will blame everyone except themselves or rationalize their actions as just.
Things which come to mind in 30 seconds:
"The regulation is draconian and it is just to fight it in any way possible."
"Our business helps people and working around this helps our business and thus helps people."
"If people really wanted and weren't simply mindlessly clicking buttons this won't stop them so we're actually helping user's enact their will."
"We put all this effort into the business, it's evil for the government to interfere for wishy washy reasons."
> There are sites (like wwe.com) where, after you have successfully located the preference to opt out of everything, it shows a "processing" screen that is stuck at 98% for about a minute. But "accept" is processed in an instant.
This one drives me nuts! It's just such a brazen and blatant piss-take: "you won't let us hoover up your data and sell it to everyone we can? Then we'll punish you."
> it shows a "processing" screen which is stuck at 98% for about a minute
Proximus [0], the partially state-owned and largest telecom provider in Belgium, uses this pattern too.
Additionally, on mobile, scrolling through the cookie-usage options automatically selects the most invasive option: the scroll touch is registered as a regular touch that selects the option.
This is also Truste/TrustArc [1], which the GP mentioned, and they do it on every website where they're installed.
Looking at their JavaScript code and the Timeline in the inspector, it is 100% fake. It is all implemented using setTimeout, and there's no communication with the server in the meantime.
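That kind of stalled bar is trivial to counterfeit. A minimal sketch of a purely client-side "processing" screen in the spirit of what's described (function name, step size, and timings are invented for illustration, not TrustArc's actual code):

```javascript
// Fake progress driven entirely by setTimeout: no network request
// is ever made. Ramps quickly to 98%, then stalls before "completing".
function fakeProgress(onUpdate, onDone, stepMs = 300, stallMs = 60000) {
  let pct = 0;
  function tick() {
    pct = Math.min(pct + 7, 98); // creep upward, capped at 98%
    onUpdate(pct);
    if (pct < 98) {
      setTimeout(tick, stepMs);      // fast ramp-up
    } else {
      setTimeout(onDone, stallMs);   // sit at 98% for about a minute
    }
  }
  tick();
}
```

Nothing here depends on any server responding; the "98% for a minute" is just the `stallMs` constant.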
It showed a popup saying "we collect your data yadda yadda yadda", with two buttons: one to agree, one to manage it. But clicking the manage button just took you to screens and screens of garbage information, mainly listing the companies that used the data, without any option to opt out (you could contact them to opt out, I assume individually). There was a button (if you drilled through the screens) that seemed to imply it would link to a page that allowed opting out, but all it did was take you back to the first screen of the popup. Unreal. Somebody thought about that; absolute cretins.
They've since changed that, so now there are opt-out toggles (which are, obviously, all split into groups and all on by default, and so on); I assume because someone in legal tapped them on the shoulder?
Thank you. There are other sites[1] (mostly non-English) where you still can't get out of it. I'm trying to block it manually, but it uses random ids. Is there a method for that?