How Deaf People Think (todayifoundout.com)
107 points by srean on Feb 6, 2011 | 50 comments



Long-time lurker here - this is one of the first articles that I've felt compelled to respond to. I have a unique perspective on deafness and language.

I was born profoundly deaf (so deaf that audiologists proclaimed me the deafest child ever - what an honor). I learned ASL when I was six months old. To preface - I was incredibly lucky to have parents who learned an entirely new language for me (they were immigrants, so it was no easy task for them) and noticed my deafness early. I also got a cochlear implant, which enables me to hear speech fairly well. Over the years, I've learned how to speak and listen well enough to converse with most people. I've also managed to maintain my fluency in ASL - it's a very important part of who I am.

According to my mother, I was able to express my desires (I want orange juice, my diapers need changing, etc.) as early as 8 months. So, I was able to talk to my parents at a different level than hearing infants of the same age.

I have plenty of deaf friends who were not as lucky. Some of them had hearing losses that were not discovered until they were more than two years old. Some of them just had parents who didn't want to or could not learn ASL. Their performance, academic and otherwise, has lagged behind their hearing peers. I find that their thoughts are not as organized as someone who learned language early on (regardless of form), so some of their actions don't have "common sense." Frankly, the benign neglect I see with deaf children is shocking and quite depressing. It makes me grateful for what I had growing up.

This article also made me think about how different my thought process is from hearing people's processes. Even after years of using speech as my main mode of communication, ASL still influences my thoughts quite heavily. For example, I tend to think about spatial matters in ASL - i.e. how a street is laid out. Also, I tend to think about emotions in ASL, complete with facial expressions. On the other hand, with more technical issues such as the law (I'm a lawyer), I'll think in words, even spoken words sometimes.

All in all, I'm grateful that I was able to learn ASL so early and transition to spoken language. ASL really enriched my life - culturally and mentally - and I'm very glad that ASL was my first language.


Can you tell us more about the cochlear implant? I've never met someone who has one (that I know of).


You can hear what they sound like:

http://www.pbs.org/saf/1205/features/Interactive/channel1.ht...

(If you're doing something dangerous, don't play the sound - most people with full hearing find it profoundly disturbing the first time they hear it.)

There is a better link, which I can't find, that demonstrates speech, music (which doesn't work well), and various other things through them.


Neat! I've never bothered to go to these websites before.

Although I should point out that most people today who have cochlear implants have 22 or 24 channels, not one. I have 22 channels in one ear and 24 in the other. (You have to click through to get to these channels though).

As I mentioned before, I never really developed an appreciation for music. It's probably a result of never really hearing music as it should be heard. Oh well.


I don't know about profoundly disturbing myself, but the 4-channel sample sure does sound odd, almost creepy. But what I thought was neat-o was that, at 22 channels, it was a pretty decent approximation of how Alan Alda "normally" sounds.


Really? Interesting. All these years, I've been wondering "what in the hell do hearing people HEAR?" Good to know that perhaps my hearing is not as drastically different from others as I feared.


Let me try to describe the difference between 24-channel implants and typical human hearing (as heard in recordings):

You can make out what many people are saying nearly as well as someone with normal hearing can, at least in English. The experience is comparable to a particularly static-filled radio connection.

I imagine tonal languages such as Chinese would be considerably more difficult, since you miss a lot of the "tone" of the voice, as well as its movement up or down (which those languages rely on heavily). I imagine it's sometimes hard to tell from inflection alone whether people are asking a question, compared to typical hearing.

Music does sound completely different through an implant (remember, I don't have one; I've just heard recordings), especially the classical symphonic music used for emotive sections of TV shows and movies, or to manage viewer tension. Pop music seems to come across rhythm-wise, but you're often missing that same tonality, so I'm not sure you'd get the tension-and-release cycle that's key to how many hearing people enjoy music. I say this especially because you won't hear much of the dissonance resolving that is so central to the experience.

Learning what pitches your channels are set at could possibly allow you to find particular works that let you experience this, but you'd have to talk to a music theory geek to really go further with this line of thought.


Upvoted for the Alan Alda joke


Sure - no problem. Just to give you some context, I got my implant in 1991, and technology has advanced by leaps and bounds since then.

Cochlear implants are quite different from hearing aids. Hearing aids just amplify sounds. A cochlear implant directly stimulates your auditory nerve via a surgical implant, mimicking sound. Normally, sound waves pass through your cochlea and stimulate multiple hair cells, triggering different "sounds" (as interpreted by your brain). Since the cochlear implant stimulates the auditory nerve more directly, once a person is implanted, he or she loses all residual hearing. Obviously, only people with severe or profound hearing losses are allowed to get a cochlear implant.

My cochlear implant has two components - an internal portion and an external portion. The external portion is the speech processor; it looks somewhat like a hearing aid without the ear mold, and it has a magnetic part that attaches to your head. The speech processor converts sounds into electrical signals that pass through the magnet attached to my head. The signals then travel through the internal part of my cochlear implant (the implant itself) and stimulate the nerve, causing me to hear sound. I should add that the conversion process filters out most background noise and focuses on the frequencies used for speech. As a result, I can't really appreciate music: with the implant's limited frequency range, music sounds all muddled to me.
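For the curious, the "channels" mentioned throughout this thread can be simulated in a few lines with a crude noise vocoder: split the audio into frequency bands, keep only each band's amplitude envelope, and use it to modulate band-limited noise. This is only an illustrative sketch under my own assumptions (the function name, band edges, and filter choices are mine; real speech processors use dedicated coding strategies), not the device's actual algorithm:

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def vocode(signal, fs, n_channels=22, f_lo=200.0, f_hi=7000.0):
    """Crude N-channel noise-vocoder simulation of implant-style
    processing: log-spaced bands, envelope extraction, noise carriers."""
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)  # band boundaries
    rng = np.random.default_rng(0)
    out = np.zeros_like(signal)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        band = sosfilt(sos, signal)
        envelope = np.abs(hilbert(band))       # amplitude contour of the band
        carrier = sosfilt(sos, rng.standard_normal(len(signal)))
        out += envelope * carrier              # envelope drives band-limited noise
    return out

# Usage: vocode one second of a synthetic two-tone "vowel" at 16 kHz
fs = 16000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
simulated = vocode(speech, fs)
```

With fewer channels (try `n_channels=4`) the output becomes much harder to parse, which is roughly what the linked demos let you hear.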

Of course, that's the technical part. I can hear, but nowhere near as well as a hearing person. I still have a lot of difficulty hearing in noisy situations, such as restaurants and bars. I can't really differentiate between certain sounds, such as b/p, g/h and the like.

I can, however, converse with most people. I have an "accent" because I learned how to speak relatively late - 8 years old. I have met people who got their implants when they were two or three and they barely have an accent.

One thing I should also add - most people who receive a cochlear implant need special training to use it. Since I was one of the first 500 children to receive a cochlear implant after the FDA approved it for children, I underwent intensive speech-and-listening training. I had to learn how to listen, since I had no frame of reference for sound. I'm sure you can imagine how hard it is to get a six-year-old to sit still and practice listening for hours at a time. It took me two years before I could start learning how to speak. I didn't feel comfortable speaking in public until I was thirteen or so.

Of course, the process is quite different for someone who heard before they got an implant because he or she has a frame of reference. After talking to younger implant recipients, I get the impression that the technology and training methods have improved, so the whole process isn't as onerous anymore.

Hopefully this wasn't too much information and if I misstated something (probably biology-related) I apologize in advance! If you have more questions, feel free to ask!


Do you consider getting an upgraded implant someday, or is that not practical?


You know, I was thinking about this recently. Since the operation is quite expensive (tens of thousands of dollars), insurance coverage is a must. It's hard to get insurance coverage for a new internal component if mine is still functional. Also, today's technology - in terms of the internal component - is not that much better than what I got in 1991. For example, right now I have 22 channels, but the best that one can get today is 24 channels. The improvement is relatively small for the amount of hassle that I would have to go through to get that improvement - surgery, recovery time, etc.

Like all surgeries, cochlear implant surgery has risks. I don't want to take heedless risks, so I'm content with what I have for right now. Although I think I would get the surgery if the implant was 100% internal - no more external parts!


The speech and listening training sounds very similar to what I went through with my hearing aids, called auditory-verbal therapy. I've got severe/profound loss in both ears, but I never really learned ASL since I was diagnosed ~2 and started early. Thanks for sharing your story!


Here is the other link!

http://www.hei.org/research/aip/audiodemos.htm

Has the music through the implant.


I'm so deaf I can't hear much of anything, even with hearing aids. I never learned American Sign Language. I lip-read terribly. I also have an inner voice. I read at a grade level above hearing peers. I understand puns.

How did that happen?

I learned Cued Speech[1]. It's not mentioned anywhere in this article. It's a system that, summed up in a few words, uses signs for phonemes. Because of this, I could learn English within English. And when I took Spanish classes, I learned Spanish within Spanish.

I also dream in verbal words. Even after I learned ASL, I still dream verbally. The words are always quiet, a bit muffled -- this is how I heard things early in life, with hearing aids (I have a cochlear implant now). Occasionally I see captions, as if it's a TV show.

A last note. The article mentions little-d and big-D deaf; the way I learned it, little-d "deaf" refers to deaf people in general, while big-D "Deaf" refers to Deaf culture and community.

[1]: http://en.wikipedia.org/wiki/Cued_speech


Those born deaf and not taught sign language might not learn a language to think in at all. http://neuroanthropology.net/2010/07/21/life-without-languag...

I walked up to him and signed, "Hello. My name is Susan." He tried to copy that and did a sloppy rendition of "Hello, my name is Susan." Obviously he didn’t know what he was doing. It wasn’t language. And I was shocked. He looked Mayan and I thought, well, if he knew Mexican sign language, he wouldn’t try to copy. That’s not a normal thing to do, even if you don’t know the language.


Those born deaf and not taught sign language will invent their own sign language. There are no human cultures without language.

Spoken (or signed) language is natural for humans, and developed as a form of telepathy. I can get a thought that's in my head into your head over a great distance transmitted over noisy media in a short time with a few words. But the thought in my head isn't in the form of an extremely ambiguous language like English or ASL or whatever I happened to learn as a child. It's the other way around: the thoughts come first.

Check out The Language Instinct by Steven Pinker for a nice overview on the scientific understanding around this.


The book Adam's Tongue by Derek Bickerton chronicles one view of how language evolved in humans and in doing so, evolved humans along with it. It's pretty fascinating.

The book explains how our brains wouldn't really have the "thoughts" you speak of without having had language in the first place. I'm not sure precisely how this fits in to children who can't speak and who were never taught to sign, but my guess is it would introduce considerable mental stumbling blocks.


Good article. One correction, however:

One of the big differences between ASL and many other sign languages is that ASL primarily uses only hand gestures, whereas most types of sign languages, such as BSL, rely heavily on facial expressions and other physical expressions outside of hand and finger gestures.

ASL places a lot of importance on facial expressions and body position to express tense and grammatical concepts. Many signs are a combination of a mouth morpheme and a hand gesture.

If you just do the hand-shapes, fluent signers will probably be able to understand you, but it would be the equivalent of mispronouncing words in a spoken language, or constructing sentences awkwardly.


I agree. Saying ASL only uses hand gestures is like saying speaking only uses vibrations of molecules in the air. The world would speak like those strange text-to-speech animated bear videos.

ASL + Expression: http://www.youtube.com/watch?v=3OL33OEW6QA http://www.youtube.com/watch?v=QmKnQjBf8wM


Yeah, that comment didn't really make sense to me either.

I'm fluent in ASL, and I have a working knowledge of BSL. What I can tell is that ASL is more conceptual than BSL. BSL tends to track the English language more closely. For example, ASL will have one sign that expresses an entire sentence, but BSL will most likely have individual signs reflecting the individual words in the sentence. (This is not always true, of course.)

All sign languages depend primarily on gestures and facial expressions.


I had two deaf grandparents.

My grandmother lost her hearing to meningitis around age 12, while my grandfather went deaf through his work as a machinist (presumably in his 20s). Her deafness was profound - the only time I ever saw her react to sound was when a balloon burst beside her. His was a bit better, and he could interpret some sounds with a hearing aid.

Neither of them knew ASL, but both were fluent lip readers. (They met at lip-reading class!) To some extent I believe my speech was influenced by their deafness: to make myself understood, I needed to face them and enunciate.

Stories aside, this article is a bit strange:

    It is quite common for deaf people, when they are
    dreaming, to not only communicate in their dreams using 
    sign language, but also to communicate telepathically 
    and sometimes even verbally even though they may not 
    know how to speak verbally in the waking world.


I'm a deaf person with some speech training. When I dream, I rarely use sign language. I often notice myself talking verbally, and sometimes I even use "telepathy" to communicate with other dream figures. I think these dreams are a reflection of my desire to be "normal", or at least to mask the reality of my disability from myself.

I don't think it's unique to deaf people though, but we might be able to notice the difference better. How would you know whether you're using your voice or "telepathy" in your dreams?


I'm also hard-of-hearing, but went to a school for the deaf for several years.

When rooming with others (say, on school trips, which were fairly frequent), some of my schoolmates would talk in their sleep, and others would sign.

According to my college roommates, I used to do both with some regularity.

As for telepathy: you're probably correct in asking "How would we know?" Perhaps self-reporting after dreams?


I have heard that babies do not realize that their thoughts are private, and that they assume thinking by itself is a form of communication. I have always been skeptical of this and never understood how anyone would even figure it out or verify it. The paragraph you quoted gives me an anchor point and suggests it might be true.

I have a friend who was very nervous about speaking in English, but he dreamed fluently in it. We knew because he talked a lot, rather continuously, in his sleep. Extrapolating that I can imagine a deaf person dreaming verbal communication.


It's not a stretch to imagine this if you think of it not as "babies believe they can communicate with thoughts" but rather "babies haven't realized that they need to communicate". They simply neglect the step of ensuring that others know the same things as themselves. People who retain this flaw into adulthood are called programmers.


I have heard that babies do not realize that their thoughts are private and assume that thinking by itself is a form of communication.

I haven't heard this one before, but it wouldn't surprise me. Children up to age 7 or so can't appreciate the fact that other people's experiences are any different from theirs (egocentrism). This is why kids will do things like bury their faces and say "you can't see me!". They don't understand that just because they can't see you, you can still see them.

It's not a big stretch from there to them believing that if they hear their internal monologue, that other people hear it as well.


>This is why kids will do things like bury their faces and say "you can't see me!". They don't understand that just because they can't see you, you can still see them.

In the game of peekaboo (peepo, or whatever you call it) an adult pretends to be hidden by putting their hands over their eyes, they say "peekaboo" (or whatever) and reveal themselves as if appearing suddenly - this game is often played with very young (pre-vocal) children.

I wonder what the relationship is: do the kids really feel, internally, that they are hidden, or do they mimic what is presented to them? It seems straightforward, but perhaps it's an emperor's-new-clothes thing - the kid initially thinks the person isn't hidden (though their identity is), but comes to learn that they should act as if the person is hidden; that's what the game is.

Obviously it's complex: hiding your facial features from a myopic infant may make you appear no longer to be a person (or at least not the person they recognise - "Where's Mummy gone? All I see now is a blur"), and then revealing those features is a surprise. You have effectively disappeared and reappeared, just as, for an adult, someone who puts on camouflage and vanishes into undergrowth has disappeared even though they never really left your field of view.

>It's not a big stretch from there to them believing that if they hear their internal monologue, that other people hear it as well.

I think it's a huge leap.

Not that one shouldn't take it as a hypothesis, just that assuming hidden brain functions act on hidden sense data in a particular manner, based on an equivalence of senses, seems like it would need a lot of effort to demonstrate.

This is the sort of leap people make in assuming thought in animals: "that dog smiled; he must have a sense of humour, an internal conscious function that responds to humorous situations just like mine."


> I have heard that babies do not realize that their thoughts are private and they assume that thinking by itself is a form of communication.

I've occasionally noticed other adults say odd things, followed a bit later by "oh crap, I said that out loud?", or more frequently people will "think out loud". I've also noticed once or twice (I think I might have to be fairly well distracted for this) that someone would say "hi" in passing and I'd say "hi" back... and then a few steps later realize I hadn't actually said anything, just thought it at them (rather like how on a sizable fraction of conference calls, there's at least one person who forgets to un-mute their phone when talking).

...also my wife sometimes says I'm supposed to be able to read her mind on a number of topics, but I think that's different.


I think the issue is similar for bilingual people. Since I started speaking English, my thoughts regularly switch from one language to the other, especially when I can't remember the word I want in one of them. It's also a great exercise during the learning process.

But what caught my attention in the article was the link between language and self-awareness. I think it makes sense: the more complex ideas you can convey to other people, the more complex ideas you can convey to yourself too.

Maybe that's what makes us different from the other animals?


Ever dreamt in English? Until you reach that point of dreaming in a foreign language, you haven't really grasped it.


Dreaming in another language may be a sign you're still just a beginner, I don't know. I remember reading that when you have the Tetris Effect (which is awesome) it's because your brain is still trying to figure out all the rules of shape patterns, and that experts who play all the time don't have the Tetris Effect.


I disagree. In my experience, dreaming is more correlated with attempting to gain mastery than in having achieved mastery (in language and other things).

I think having one's inner thoughts in the new language, without having to mentally translate from another known language is a more relevant gauge of mastery.

Of course, there is also the level of mastery where a thought is more naturally expressed in the newer language, and you have to mentally translate it back to your native tongue...

kb


Now that you mention it, I don't remember a single instance of a person speaking in my dreams. But I can't figure out whether that was just a lack of language in general, or whether people communicated telepathically.

Either way... that's unsettling.


I've definitely had dreams with people speaking to me, but even in a language I'm learning -- the strange thing is that I can remember struggling to parse and comprehend what they were saying, as if their level of speaking ability was a bit higher than my own.


Meh. I've dreamed in French, and I'm not even close to being fluent.


I doubt that. For instance, I started dreaming in English way before becoming fluent.


I'm a native English speaker, and I don't dream in English, or any specific language for that matter. When I reconstruct the dream later, I put the characters' thoughts into words, but at the time, it's just telepathy.

Maybe I'm just weird.


There has been some very nice discussion of deafness here: http://news.ycombinator.com/item?id=2111278 - I picked up helpful tips from there.

I have limited hearing on one side and lost it fairly late (in my teens), so I do think in a verbal language; however, I also picture/visualize things and don't always speak to myself in words. This has the weird consequence that, if I am unexpectedly asked to explain my thought process, I have to struggle for words. One perennial but minor difficulty is telling time: since my sense of time is mostly visual, I do not tell myself the time. But I do not know whether this idiosyncrasy is deafness-related or not.


You don't have to be deaf to think in pictures. My oldest son thinks in pictures. He also has no sense of time and some other issues. I don't recall if these two things are specifically related. Temple Grandin thinks in pictures and wrote a book with basically that title ("Thinking in Pictures"). My son has to translate spoken words he hears into pictures to understand them and then translate pictures in his head into spoken words. This need for translation means he sometimes has the kinds of misunderstandings with people that you typically see from someone who speaks English as a second language. (And, in fact, he was a late talker -- and was basically pushed into it by mom enrolling him in preschool.)

These days, there is a fair amount of literature on things like lack of time sense and thinking in pictures. Here is what my son once said about living with no sense of time:

http://www.kidslikemine.com/b8timeblind.shtml


Can anyone confirm this last (supremely fascinating) quote at the end of the article?

" In sign languages such as BSL, you’d use your hand gestures, facial expressions, etc. to express things like that the walk was on a dirt road; it was nice out; and you enjoyed the walk, with it all expressed simultaneously. This non-sequential nature of sign languages allows for faster and more detailed communication, but has the drawback of being ridiculously hard to put into print, though attempts have been made. "


Confirmed. Adding such sensory detail to narrative descriptions is pretty easy to do in American Sign Language (I can't speak for BSL or others, though French Sign Language is pretty close to ASL, from what I understand).

I'm no longer good enough with ASL that I can do all that simultaneously (I'm a native speaker as a result of 8 years at a school for the deaf, but use spoken/written English all but one or two days out of the year).


The failure of print to convey the richness of language is not unique to sign - we're just used to the limitations.

Consider the difficulty people have with expressing rich communication via email compared with vocal conversation, and again compared to face-to-face communication.

It is fascinating that additional factual details can be conveyed in sign, but it is inaccurate to believe that print can encapsulate the richness of direct personal communication.

kb


This is very similar to what sociologist George Herbert Mead proposed in his grand theory of "Social Behaviorism" (1934):

    "Mead the social psychologist argued the antipositivistic view
    that the individual is a product of society, or more specifically,
    social interaction. The self arises when the individual becomes an
    object to themselves. Mead argued that we are objects first to
    other people, and secondarily we become objects to ourselves by
    taking the perspective of other people. Language enables us to
    talk about ourselves in the same way as we talk about other people,
    and thus through language we become other to ourselves.[16] In joint
    activity, which Mead called 'social acts', humans learn to see themselves
    from the standpoint of their co-actors. A central mechanism within
    the social act, which enables perspective taking, is position
    exchange. People within a social act often alternate social
    positions (e.g., giving/receiving, asking/helping, winning/losing,
    hiding/seeking, talking/listening). In children's games there is
    repeated position exchange, for example in hide-and-seek, and Mead
    argued that this is one of the main ways that perspective taking develops."
(http://en.wikipedia.org/wiki/George_Herbert_Mead)

He proposed that organisms first experience or see themselves as an object (or perceive themselves at all, thus stepping over the instinct-border) when they use their voice and hear their voice at the same time and see that others react to that voice too. Even though it is very theoretical it makes for good and interesting reading.


A must-read - both the article AND the discussion here.

This article and the discussions below are truly one of the more fascinating topics to be had.

It is always great to read amazing topics submitted to HN (which is a large portion of the point here, right?) and to read comments for better understanding and discussion (which is the other portion here), but to have an intriguing article matched with equally thought-provoking and explorative discussion is a complete intellectual gem.


There's a thought here that part of our fundamental intelligence is cultural, and without that language, our brains are seriously incomplete.

What is even more amazing (and somewhat relieving in its robustness) is that human communities develop language spontaneously. There's the famous case of a school for deaf children in Nicaragua who, lacking teachers, spontaneously developed their own language - apparently with complex grammar and so on. http://en.wikipedia.org/wiki/Nicaraguan_Sign_Language I find this profoundly hopeful, in what it shows about people. Also http://blogs.discovermagazine.com/notrocketscience/2010/06/2... (via sp332's link)

Even this article mentions "homesign" or "kitchensign", though I imagine it is very limited, and would tend to lack abstractions, complex grammar and so on, being used by a small (i.e. family) group, for domestic purposes, while the child is very young.


This is fascinating. Of particular note to me is the conception that sign language operates in parallel rather than sequentially. Written language is sequential.

In the computing domain, parallelism is, in my experience, best expressed in circuit diagrams and sequence diagrams. That is the picto-ideographic realm, which is the realm of sign languages.

Of course visual programming languages have a long and distinguished history of living in a niche. But I wonder if there is a radically awesome visual programming language expression out there that is sufficient.


Great book if you're interested in further reading

http://www.amazon.com/Talking-Hands-Language-Reveals-About/d...


There was an excellent episode of Radiolab about this subject. If you have the time or inclination to listen to an hour long podcast, I highly recommend "Words": http://www.radiolab.org/2010/aug/09/


Similar discussion from eight months ago here:

http://news.ycombinator.com/item?id=1505365

Extremely interesting subject, extremely interesting discussion, with many great insights.


>This non-sequential nature of sign languages allows for faster and more detailed communication, but has the drawback of being ridiculously hard to put into print, though attempts have been made.

I sense opportunity...



