
What motivates extremely talented tech workers to devote their limited working years to making the world a worse place to live in?



I can almost guarantee that these engineers have a very different value set than you. They may well see this as granting law enforcement legitimate, authorized access to collect the evidence needed to prosecute real criminals. In their minds, abuses of the technology they make are the illegal part, not the existence of the tool itself.

Most of the responses I'm seeing here are about money, and while that is probably a factor at some level, I like to believe that it takes a lot of additional pressure from society before people will give up their personal morals in exchange for money.

"No one is the villain in their own story".


I asked the question in earnest, so I appreciate your reply. I think you're probably right.


So far every job orientation I've done has been 10% information useful for doing the job and 90% making me feel good about the job. You probably didn't take the job if you didn't feel at least a little good about it. If you don't feel good about the job anymore, then you probably quit. This pattern makes sure that the organization is mostly full of people that support it.

Most of your feelings aren't rational conclusions based on sound analytical reasoning, and if your day-to-day is just making widgets for other widgets, you probably aren't often forced to update those feelings.


I will add to this.

Biologically speaking, we are herd animals.

That means the majority will go along with what the leaders want.

While the Milgram experiment had methodological flaws, the reality is that most people will shock you to death just because an authority figure told them to. And if you ask them afterward whether what they did was right, they will say yes, or they wouldn't have done it.

Being a descendant of German Jews, I often ask myself: "Which of the people I know wouldn't turn me in?"

The sad reality: very few. It's by design. Herds don't work when everyone is trying to go their own way. There will always be individuals who eschew the herd. Biologically, they generally die, but a few find new environments or start new herds of their own. Of course, the herd problem then starts over.


What leads well-off Americans, whose safety and comfort are guaranteed by the constant efforts of law enforcement, to conclude that legal efforts to make law enforcement more efficient would make the world a worse place?


Ethics. The amount and scope of information collected by things like Palantir Gotham is beyond invasive. The fact that a police officer can type in your name and know where you were driving is disconcerting. It is about as Big Brother as you can get, and not the kind of world I would like to live in. This is a slippery slope, where people become more and more accustomed to having less privacy. If you think this is justifiable, I would highly suggest reading up on the Stasi in East Germany, or reading 1984.


This exact issue was pointed out many times when automated license plate scanning came into existence. Is it really Palantir's fault for making software to aggregate the data? In my view, it is our fault as citizens for allowing the government to collect such data in the first place.

Any other company with enough developers and data scientists could do what Palantir does. It's the massive amount of information their software uses that is the concern. But that data is collected by our government, and so we have legal recourse, either under current law or by passing new privacy laws.


That’s like asking if it’s really a murderer’s fault for pulling the trigger of a gun manufactured by someone else. Yes, it absolutely is their fault and no, “if they didn’t, someone else would” is not a valid moral excuse, ever.


Who is the murderer in this analogy? To me it's the government misusing such a tool.


Palantir. The government is the gun manufacturer collecting all the needed data in various places.


Why though? Palantir isn't the one using the tools for ill. The government is collecting the information and using it, possibly, to violate civil liberties and constitutional rights.

What can Palantir do to you?


That's not what Palantir Gotham does. It's much closer to a sophisticated database and Tableau than some sort of big-brother invasive nightmare.


Paying attention to world history, for one.

For another, I've personally seen my work get turned from a human safety product in one market into a direct instrument of human oppression in another market. Lots of tech is just a tool, which can be used for good or evil. The thing is, once good people see what is possible, it provides them an easy route to doing evil which they might not have had before.


I'm sorry to hear about your personal experience; I've been in a similar situation myself, related to voice assistant technology. However, I don't think either of our experiences necessarily implies that Palantir is evil. For example:

Typewriters can be turned to evil. IBM showed this to the world with their WWII-era Nazi collaboration. That doesn't mean the world should abandon typewriters, or that no self-respecting engineer should work on typewriters, or that typewriters would necessarily be used for evil by the law-enforcement arm of a democratic society.

Should police agencies collect and store real-time location data for every private car? Probably not. I'd support legislation to restrict such practices. Should Palantir help police officers sort and present the data they collect in general, including not only license plate reader (LPR) data but also criminal records, known associates, and vehicle information? I'd say yes.


> Typewriters can be turned to evil. IBM showed this to the world with their WWII-era Nazi collaboration. That doesn't mean the world should abandon typewriters, or that no self-respecting engineer should work on typewriters, or that typewriters would necessarily be used for evil by the law-enforcement arm of a democratic society.

IBM's relationship with the Third Reich was not based on the mere supply of typewriters; such a suggestion reduces your credibility on this topic to zero.

https://en.wikipedia.org/wiki/IBM_and_the_Holocaust


>IBM's relationship with the Third Reich was not based on the mere supply of typewriters; such a suggestion reduces your credibility on this topic to zero.

"Punchcards can be turned to evil."

The specific type of dual-use technology is irrelevant to my point.


Misrepresentation of facts corrodes discourse.


Defending American liberties often results in uncomfortable situations. Take free speech: to defend it, we often find ourselves seemingly defending truly awful things (depending on your personal ethics), because the principle of free speech overrides our feelings about any particular bit of speech (hate speech aside).

It’s not like the people against mass surveillance all want their kids to live in unsafe neighborhoods, or be blind to terrorist plots. But everything must be balanced within our set of liberties — and the American way of due process, etc, is purposefully skewed to protect individual rights and liberties.

This is why gun rights are such a hot-button issue: some would like to see stricter regulations, because they feel unsafe sending their kids to school in a world of mass shootings. Others feel those restrictions place an undue burden on individual liberties afforded by the Constitution.

When it comes to law enforcement, there is a lot of abuse of power present in the system, which disproportionately affects people of color. Thus, there's a very valid argument to be made that some of this tech is too dangerous in the hands of law enforcement, despite how much people of all political stripes want their families to be safe.

My aim here is to illuminate the differences in the arguments. I’m not trying to force my (admittedly liberal) point of view on the conversation, because HN isn’t really for politics. My goal here is to show how, despite politics, people can all desire safety for their family, and personal liberties, but have honest disagreements about the best way to achieve that.


Law enforcement is not without bias, and just because they’re using discretion for you today doesn’t mean they won’t use it against you later.


Quite simply the belief that:

1) these efforts do not guarantee their safety and comfort (though others might, these don't) and

2) that even if they did, the ethical price of hunting down undocumented immigrants is inexcusably high. That, in the same way you wouldn't behead someone for speeding, you shouldn't separate families and cause people to live in the shadows simply for having the privilege of cleaning your hotel sheets.


Given the persistent patterns of unprofessional and unethical behavior among law enforcement, it's understandable that people are against giving them access to more powerful tools that have great potential for abuse.


An understanding of the inevitability of abuse of unchecked power.


$, like someone else has said, but I also would guess that many of the engineers do not have a clear idea of the overarching functionality of the application.

I'm sure internal communication about the product is extremely positive, with phrases like "improved law enforcement accuracy by XX%, decreased customer costs and time by YY%," so some may truly believe they are doing something that everyone would agree is good.


Somehow I have a feeling that nearly all the engineers working on it have a clearer idea of the overarching function of the application than you do.


Would you rather tech companies not be trying to help law enforcement or our military, and just leave them to their own devices?

I have my answer, but I don't think it's nearly this black and white. Palantir probably does a lot of good, just like all the big tech companies, even though it's the bad stuff that mostly makes the news.


> Would you rather tech companies not be trying to help law enforcement or our military, and just leave them to their own devices?

Yes, I'd rather not aid law enforcement in spreading disinformation and targeting journalists, for example [1].

Though I do agree that it's not as cut and dried as "Palantir is evil".

[1] https://www.thetechherald.com/articles/Data-intelligence-fir...


I mean, how many of us work at Microsoft or Amazon? Companies currently in a bidding war to build the "War Cloud".


Why a worse place? In this particular example, Palantir is used by the cops to run investigations; it is just a more convenient user interface and analytical engine on top of already existing data sources. I don't see anything particularly unethical here.


> just a more convenient user interface

This. The difference between a rock in a sling and a nuclear bomb can, hyperbolically, be viewed as a better UI for a rather similar goal.

Does that explain it? Do you see how quantitative differences become qualitative?


I'm sure if you had a shred of empathy in your whole body, you could think for about five seconds and realize that maybe not everyone else had the same system of values as you.


Is the work-life balance at Palantir better? I've read on HN that defense companies have much better pacing.


Most defense contracts require 40-hour work weeks. Typically a company can't ask you to work more than 40 hours without being in breach of contract (it would be unfair to the competitors who bid on the contract). So yes, when I worked on DoD contracts I was in at 7am, out at 3:45pm.


In general, yes, defense contractors are better. But they also have old-school thinking about compensation and equity, so it's a trade-off. Not sure about Palantir specifically, though.


$


Surprisingly, at least for the Bay Area, the money doesn’t even seem to be that good: https://www.levels.fyi/?compare=Palantir,Google,Facebook&tra...


It's "making the world worse" until something bad happens to someone you value.

If Obama's daughters (or Trump's if you are on the other side) were killed in some horrific racist incident, wouldn't you approve the usage of such software if it was the only hope of getting the perpetrator? Or would you say "as much as I am saddened, I cannot approve such massive invasion of privacy".


> Or would you say "as much as I am saddened, I cannot approve such massive invasion of privacy".

Yes, I would take that stance every time, and twice on Sunday.


Funny, you seem to be working on systems very similar to Palantir (twitter handle on HN profile):

https://twitter.com/FogbeamLabs/status/1086757478312960002

What safeguards do you have in place to make sure that the personally identifiable customer information held by companies using your tech is not aggregated and pulled together for nefarious uses? For example, highly targeted lead sourcing or targeted advertising?

Oh, none, you actually advertise that you are mining all the databases for:

> Prospect and identify leads

But I understand that it can be hard to see certain things when your income depends on you not seeing them.


At the end of the day, the only aspect of what we do that is really anything like Palantir is that we build a search index, using Lucene. That's it, an inverted text index. There really is no meaningful way to know that the data being put in is PII, or to regulate how the orgs that use it do so. And even if we did put in any such safeguard, all our stuff is Open Source, meaning anybody could rebuild it without the safeguard, and we'd be none the wiser.
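For the unfamiliar, here's a minimal sketch of what "just an inverted text index" amounts to, against Lucene's core API (assumes Lucene 8+; the field name and sample text are invented for illustration). Documents go in, get tokenized, and the index maps each term back to the documents containing it:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.TextField;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.queryparser.classic.QueryParser;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.ScoreDoc;
    import org.apache.lucene.store.ByteBuffersDirectory;

    public class InvertedIndexSketch {
        public static void main(String[] args) throws Exception {
            StandardAnalyzer analyzer = new StandardAnalyzer();
            ByteBuffersDirectory dir = new ByteBuffersDirectory(); // in-memory index

            // Indexing: the analyzer tokenizes the text, and Lucene records,
            // for each term, which documents contain it (the inverted index).
            try (IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig(analyzer))) {
                Document doc = new Document();
                doc.add(new TextField("body", "quarterly report on widget sales", Field.Store.YES));
                writer.addDocument(doc);
            }

            // Searching: parse a query against the same field and walk the hits.
            try (DirectoryReader reader = DirectoryReader.open(dir)) {
                IndexSearcher searcher = new IndexSearcher(reader);
                Query query = new QueryParser("body", analyzer).parse("widget");
                for (ScoreDoc hit : searcher.search(query, 10).scoreDocs) {
                    System.out.println(searcher.doc(hit.doc).get("body"));
                }
            }
        }
    }

Note the index itself has no idea whether "body" holds meeting notes or surveillance records; that neutrality is exactly the point here.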

The differences between us and Palantir then:

1. We don't pitch our software to intelligence agencies, law enforcement, etc., or encourage its use for these kinds of ends. But we can't specifically block those uses, or we'd be in violation of the OSD (Open Source Definition).

2. We've been very public with our unwillingness to embrace working with intelligence agencies and the like. See, for example: https://www.wraltechwire.com/2014/04/30/why-a-triangle-tech-...

3. Everything we do is Open Source, meaning that at least the public can take a look inside and see what's going on... modulo any changes a given end user organization makes and keeps private.

4. Our technology is positioned primarily for internal knowledge management / collaboration use inside organizations. But, again, we have no means, legal or technical, to stop somebody from using it for other purposes. And even if we did, they could just download Lucene, ManifoldCF, blah, blah, etc., and build up their own Nefarious Indexing System.

> But I understand that it can be hard to see certain things when your income depends on you not seeing them.

There is nothing in this regard that we "don't see". Taking your argument to its logical conclusion, even a worker mining sand somewhere, sand used to fabricate silicon chips, which can be used to power computers, which can be used to run privacy-violating software, is "guilty". I don't think I need to point out the absurdity of that position. Furthermore, if we really just cared about "get all the money at any cost", we would have immediately jumped at the chance to talk to In-Q-Tel and the possibility of juicy, rich contracts supplying the CIA and their brethren with technology.


> There really is no meaningful way to know that the data being put in is PII, or to regulate how the orgs that use it, do so. And even if we did put in any such safeguard

Since you advertise that your tech is suitable for lead sourcing, you obviously don't see anything wrong with mining and linking databases of PII, as long as it's done by "the good guys".


> Since you advertise that your tech is suitable for lead sourcing

We don't. The line you quoted above is from my LinkedIn profile, where it's describing my responsibilities as founder of the company. So, of course, part of what I do is prospecting and identifying leads. All companies do that; I don't think anything we do falls into the "nefarious" range. We aren't using retargeting or buying user information from data-mining companies, etc. Our prospecting basically starts and stops with Twitter and LinkedIn... not exactly spook stuff.

And while I can respect that some people take things to such extremes that they find even mundane things like advertising unethical, I'm pretty comfortable with my own sense of ethics and our attempt to do the right things. In either case, attempting to draw any parallel between us and Palantir is an exercise in absurdity. Notice that there's no HN top-page story titled "Fogbeam Lab's Secret User Manual for Cops", etc. :-)


I apologize for reading the linkedin line in the wrong key.


No worries. I understand where you're coming from. And believe me, I've spent a not inconsiderable amount of time thinking about these issues. I tend to look at raw technology as being "ethically neutral" but it does bother me that there doesn't seem to be any way to truly ensure that tech is only used for noble / beneficial ends. But I don't feel like I can let that stop me from working on tech in general... the only other alternative seems to be to turn Amish or something. And somehow I'm not quite comfortable with that.


>If Obama's daughters (or Trump's if you are on the other side) were killed in some horrific racist incident, wouldn't you approve the usage of such software if it was the only hope of getting the perpetrator? Or would you say "as much as I am saddened, I cannot approve such massive invasion of privacy".

The latter, every time. The nature of principles is that they do not change with changing circumstances. I find your example extremely unpersuasive.


Are you alleging that Palantir enables or encourages "horrific racist incidents"? Could you be more specific?

EDIT: I see what you mean. I had mistaken the meaning of OP. Please ignore this comment.


I think the reference here is to the problem of fatal police shootings of minorities in the US. The OP is attempting to draw attention to the problem through an exaggerated hypothetical in which the former first daughters are involved in such an incident.


They're saying the opposite. They're saying that Palantir is helping catch violent racists after they commit crimes.



