How People Can Get You To Do What They Want (skorks.com)
32 points by askorkin on Aug 7, 2009 | 13 comments



I completely agree - starting a dialogue about the differences between your instinct and your instructions is very healthy. The trick is to create an environment where that can happen. I'm sure a lot of managers don't appreciate employees who do things like that, even if it's not done in an argumentative way. Anyone have any thoughts on how to create such an environment?


Having done the managerial thing in a few companies, I would say that a lot of employees are reluctant to question managers, even if they are encouraged to do so.

I've got several anecdotes of trying to press my team to question technical/design/administrative decisions, but for the most part, they were quiet. Yes, there would be one or two people who would do so (to my appreciation), but they were the exception.

To generalize, I would say that most people would rather take instruction, follow the rules and not question authority - it's easier.

As far as creating an environment to encourage constructive feedback - a system that provides anonymity might increase the level of feedback.

Though in my opinion, the idea of questioning issues of instinct vs. instruction has to be an intrinsic part of a company's culture. Tough to implement if it's not there from the beginning.


I believe it all starts with the right people. If you have the right people on your team, i.e. open-minded, willing to learn and improve, etc., they will naturally tend to perpetuate the right kind of environment. It gets a lot more difficult when you have to start with people who are not necessarily open to this kind of thinking. In that situation you have to first build a whole lot of trust among the team members and management before trying to slowly introduce ideas like this. Often projects finish before such a level of trust can be established.


Experiments like the Milgram experiment would not be considered ethical today because of the mental effect they have on the test subjects afterward (feeling bad about themselves for not stopping). I'm undecided whether that's good or not - it just seems to have exposed a truth about human nature, which remains true even if you don't expose it. But if it has a lasting psychological impact on the test subjects... that doesn't seem desirable.

The point of the article is correct from a business standpoint as well as a moral one. Too often we fail to question processes and procedures because "that's the way it's always been done."


The ethical concerns about the Milgram experiment are too varied to go into much detail here (if you're interested, http://books.google.com/books?id=U44OAAAAQAAJ&pg=PA193 ) but the short, relevant, version is that the psychological impact was positive, not negative. 84% of those surveyed (out of 92% who participated) reported that they were "glad" or "very glad" to have participated in the experiment.


Very interesting. I was just going off the ethical arguments and data raised in a course I took at university, where we studied the Milgram experiment.

I guess that's what I get for just going along with someone who was an "authority" on the subject.


Who's to say it has to have a negative effect? If you told them it was fake afterwards and then made them aware of how susceptible people were to that sort of thing, they might be better able to resist the next time they were in such a situation.


You may also be interested in The Ethics Of Using Medical Data From Nazi Experiments: http://www.jewishvirtuallibrary.org/jsource/Judaism/naziexp....

Also submitted here: http://news.ycombinator.com/item?id=747561


First, I don't buy the post's statistics. 65% of the subjects administered every shock in the experiment. But

  when we believe that someone knows more than us about a 
  subject, they can get us to do what they want most of the 
  time (or 65% of the time if you can believe the experiment)
mis-parses the probabilities.

Second, I think the conclusion that "(65% of) people obey authority despite what should be their better judgment" overfits the results of the experiment. Perhaps people simply listen to scientists during experiments?


The point is that they listen to scientists because they are wearing white coats, despite the (play-acted) agonized yowls from the unfortunate victim, from which the experimental subject can clearly infer that they are causing suffering. If you think that tuning such information out is normal or sensible, I suggest some self-examination.

Note that in the Milgram experiment the scientist figure never tells the subject it's safe or makes excuses for the degree of shock administered; they just insist that an external authority, "the experiment", requires the subject to continue.


Your description doesn't quite match the one in the article:

The researcher tells you to keep going, and that the shocks will cause "no permanent tissue damage" to the Learner.


True, sorry about that - I was thinking of the specifications for the original experiment at http://en.wikipedia.org/wiki/Milgram_experiment. But still, it comes down to the question of whether you believe this abstract assertion, or the person who is (supposedly) on the receiving end of the shocks.

The disturbing issue raised by Milgram's experiment is how you can apparently persuade a majority of people to do just about anything, as long as you have an appropriate authority figure around to affirm that it's safe/legal/necessary. With things like waterboarding and suicide bombing being staples of the news in recent years, this strikes me as a serious problem.


It seems Milgram did not follow the experimental specifications exactly as given on Wikipedia. The "tissue damage" exchange is mentioned further down on the Wikipedia page and comes from Milgram's own anecdotal recollections of the experiment.

If this is the case, it seems that he improvised rationalizations as needed to get the subjects to continue administering shocks. This strikes me as rather sloppy on his part. (...not that I think the numbers would have been all that different had he not done this.)




