Design for the Edges: Video for the Blind and Deaf
I do a lot of writing and speaking on issues of diversity, particularly when it comes to the workplace in the Screen Industry. I am someone who looks for what’s missing—I seek out and lift up those who often get left behind. Recently I realized that while diversity and inclusion initiatives seem to be increasingly popular both on and off-screen, we’re still overlooking one large sector of our community. We’re missing people with disabilities.
In thinking about this, I realized I had a few questions to answer, and I set out to explore them within my local Screen Production community.
As an able-bodied person, I didn’t want to write a piece speaking on behalf of anyone. Instead, I decided to share my journey of answering these questions, and a few of the many things I learned on the way.
According to a 2010 OECD study, globally, there are more than 1 billion people with some form of disability—that is about 15% of the world’s population (or one in seven people).
We as people without disabilities often see the physical barriers those with disabilities face—but we often underestimate the additional barriers faced due to social misconceptions and attitudes.
With so many people around me living with disabilities, my first and most logical step toward learning more would be to ask a friend or colleague.
A block from the post-production house where I work is a production company called Attitude, which specializes in creating accessible content about people living with disabilities. One-third of the staff there are disabled themselves, so it seemed like an obvious first place to go.
I knew Dan Buckingham and Jai Waite already; we generally hang out together at industry events. Jai and I can talk endlessly about technology and hot sauce. I brought donuts, knowing that Dan had just participated in the New York Marathon. Dan is tall, handsome and well-spoken. He’s played rugby for most of his life–first as an able-bodied athlete through college, then as a wheelchair athlete, winning a Gold and a Silver medal at the Paralympics. He started his television career as a researcher, eventually becoming a Post Supervisor, Producer, and is now General Manager of Attitude Live.
Producer Dan Buckingham
We started our conversation by talking about some of the things Attitude did to make their own content more accessible to a wide range of people. Dan told me that while they knew they would have to consider ways to make content for people who are blind and deaf, it was working with and creating content for people with intellectual disabilities that surprised him the most.
“Even though I’d worked in the sector for many years, I still had some preconceived ideas,” he told me, referring to when they first launched their video content on the web. During user testing, he noticed how easy it is for people with intellectual disabilities to navigate online, where they’re given the time to work things out. “We did things like make sections color-coded so people could navigate easier, used clear concise wording, and incorporated lots of iconography.”
He also noted that having lots of white space, consistency across pages, and color contrast also helped people with visual impairments better navigate the site, whether via a screen-reading device, or with partial vision.
Making Videos for the Blind
An estimated 4% of the U.S. population identifies as blind. Dan told me that blind audiences are the most overlooked by producers when it comes to accessibility, but that in fact, blind audiences are watching many of the same shows as I am. They do this through Audio Descriptions added to the soundtrack.
Across the table from me was Jai Waite. Like most editors I know, he is as nerdy as he is artistic. Right at home in a small cutting room, he’s often quiet until the conversation turns to something he can geek out about. Jai said that making content better for blind audiences is something he does in the edit.
“I like listening to audio documentaries to keep myself in that space of being aware that sound is so important. Blind people really like lots of natural sounds in a scene to help paint the picture.”
He encourages production crews to capture as much good, clean location sound as possible. He then consciously leaves spaces in the edit where he can, to allow for audio descriptions. “In the edit, it’s a balance. You want to try and leave more space, but you also don’t want to add too much dead air. But you can leave space in sequences, like musical interludes—just do what you can to make it accessible and enjoyable to all.”
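Leaving room for description is an editorial judgment, but a first pass at finding candidate gaps can be roughed out in code. The sketch below scans a mono audio signal (as a plain list of samples) for quiet stretches long enough to hold a description. The amplitude threshold and minimum duration are my own illustrative assumptions, not values from the editors quoted here, and no editor would use this in place of listening.

```python
# Find stretches of near-silence in a mono audio signal that are long
# enough to hold an audio description. A rough heuristic sketch: real
# editorial judgment (musical interludes, pacing) can't be reduced to
# a single amplitude threshold.

def find_description_gaps(samples, sample_rate, threshold=0.02, min_seconds=2.0):
    """Return (start_sec, end_sec) spans where |amplitude| stays below threshold."""
    gaps = []
    start = None
    for i, s in enumerate(samples):
        if abs(s) < threshold:
            if start is None:
                start = i          # a quiet run begins here
        else:
            if start is not None and (i - start) >= min_seconds * sample_rate:
                gaps.append((start / sample_rate, i / sample_rate))
            start = None
    # handle a quiet run that lasts until the end of the clip
    if start is not None and (len(samples) - start) >= min_seconds * sample_rate:
        gaps.append((start / sample_rate, len(samples) / sample_rate))
    return gaps
```

Run over a dialogue track, the returned spans are places where a describer could speak without colliding with the mix.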
The guys told me that it’s worth trying to watch something with the audio description on. It’s amazing how creative it can be and how much it can add to the story without being obtrusive. It struck me what a creative process it must be to write and produce these elements, so I went to see how it’s done.
Creating Audio Descriptions
Netflix and YouTube are providing a growing number of videos with Audio Descriptions. Here in New Zealand, where I currently live and work, Able is a non-profit organization providing subtitles and audio description services for content on local television. I met James Kupa, whose smooth, enunciated radio voice and bright disposition makes it clear that he is an Audio Describer, and he loves his job.
James and his colleagues describe a wide range of content for blind audiences, from live events, to crime procedurals, even The Simpsons. The day I met him, James was in the middle of describing a BBC Drama Mini-Series.
“We get the media, which is this here,” he says, pointing to his monitor which has an NLE timeline of the show itself in a single part. “This is what goes to air. We find breaks in the dialogue, and we describe what’s happening in the scene or what’s pertinent to someone who might not be able to see what’s on screen. So if you didn’t watch this,” he continues, hitting play, “you wouldn’t see any of this.”
I watch a long shot of a man driving his car down a deserted road. There’s no dialogue, but in true British Drama style, a lot is going on. James goes back to the start of the shot, and narrates the scene in a way that feels like he’s reading the story as a novel.
“Rosie and Jack are riding their bikes down a narrow road. David passes them in his car. He pulls over, and gets out.”
After stopping the recording, I turn to James and say, “This sounds like the shooting script.” On his second monitor, I see James has a script that he’s constantly writing, amending and annotating. I ask him if he ever uses the stage directions or script to make his job easier. He tells me he gets the scripts from Production, but that his script is quite different. Not only does he have to describe literally what’s on screen, but he needs to avoid describing too much of a character’s motivation or feelings. “Everybody experiences watching a show differently. So you want people to be able to make up their own minds about who’s guilty and who they should be sympathetic with.”
I watch as James continues his work. It’s interesting how much he holds back. There are places where he has time to describe, but chooses not to. He wants to ensure the blind audience gets to experience the same emotional beats, pauses and tension as the director intended. He tells me that there are rules he follows, too. He doesn’t use a character’s name until their name is used. Instead, he describes them using physical traits. He doesn’t want to give away anything in the story; but sometimes, he says, there’s a look in a character’s eye that makes him want to tell the audience, “that dude looks shady.”
Before taking up too much of his work day, I wanted to ask him one more question.
“What is the hardest thing to describe?”
“Comedy,” he says immediately. Able has audio described The Simpsons, which is dialogue-heavy, full of visual gags, and of course, weird. Describing the opening “couch gag” alone was an enormous creative challenge for their writers and describers, and the audio description script for the first episode took two hours to write.
I wanted to know more, so I went to talk to Wendy Youens, Able’s CEO. She outlined the creative process that goes into describing, as efficiently as the descriptions themselves.
“The creative process comes down to what are the key elements that are important to the story in this visual shot, what’s just happened, what’s happened since the last description, what’s going to happen before the next one, and what do people really need to know?”
She explained that the describers are constantly making judgment calls that affect how much an audience can enjoy the content. They must determine the most important thing to describe in the small gap between dialogue. I asked her, “If a picture paints a thousand words, how do you fit a thousand words into four seconds?”
“You just can’t,” she replied. “There’s only so much you can describe.”
Like Jai Waite, Wendy and James also told me that content creators need to leave gaps for audio descriptions. I’m reminded that adding accessibility features is good for everyone, especially when it enables a wider audience to enjoy the work I’ve created.
YouDescribe is a great online tool that crowdsources audio descriptions, allowing volunteer sighted describers to take a YouTube video and easily create an audio description soundtrack for it so that blind audiences can enjoy the same viral videos, movie trailers and cultural content.
Just down the hallway from the audio describers at Able, men and women were training machines to help them create captions for the 80% of deaf and hearing audiences who, for various reasons, enhance their media consumption with subtitles.
Audio for the Deaf is Video for Everyone
Subtitles, captions, timed text—we’re all familiar in some way with these terms, especially as they are becoming more common than ever in the content we create and consume. In 2014, captions became part of iOS. They’re now automatically part of most online social content, and part of the standard deliverables to streaming services such as Netflix, Hulu, Amazon and BBC iPlayer. There are laws in most countries that require certain film and television content to be captioned, and these days it’s considered best practice to do so.
Official statistics suggest an estimated 4% of the U.S. population is deaf—the same proportion as those who are blind. This encompasses those whose hearing is impaired enough that they have trouble listening to normal conversation, even with a hearing aid.
Captions are not only used by deaf and hearing-impaired audiences. They are also popular with people who are trying to learn or improve their skills in a second language, or those seeking to consume content from other cultures. They’re also becoming ubiquitous among those of us who watch videos on mobile devices—85% of videos on Facebook are viewed without sound. Captions allow for second-screen or less disruptive viewing in public environments.
YouTube allows the uploader to select automatic captions, created using speech-to-text AI, or to upload captions created manually. Amara crowdsources captioning and translation for video content, making it more accessible and affordable for productions. Subtitle Edit is a free tool for creating caption files, and of course Avid has SubCap as part of the NLE.
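These tools all exchange the same simple text formats. SubRip (.srt), the most common, is just plain text: a numeric index, a start and end timecode, the caption text, and a blank line. As a sketch of how little is involved, here is a hypothetical helper for writing one, not part of any of the tools named above:

```python
# Write a minimal SubRip (.srt) caption file. Each cue is an index,
# a "start --> end" timecode line (HH:MM:SS,mmm), the caption text,
# and a blank line separating it from the next cue.

def srt_timecode(seconds):
    """Format a time in seconds as an SRT timecode string."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(cues):
    """cues: list of (start_seconds, end_seconds, text) tuples."""
    blocks = []
    for i, (start, end, text) in enumerate(cues, start=1):
        blocks.append(f"{i}\n{srt_timecode(start)} --> {srt_timecode(end)}\n{text}\n")
    return "\n".join(blocks)
```

Writing the returned string to a `.srt` file produces something YouTube, Vimeo, or an NLE will accept.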
Able has a large captioning department, where captioners specifically work to create subtitles for a wide range of content and platforms. Wendy told me that the most important things to consider when creating captions are readability, accuracy, and placement on the screen. Make sure they aren’t too fast, that you’ve checked the spelling and grammar, and that the captions aren’t obscuring important action on the screen. She suggests you ask your viewers for feedback and follow up on it where you can.
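Wendy’s point about caption speed can even be checked mechanically. The sketch below flags captions whose reading speed exceeds a limit in characters per second; the default of 17 cps is a commonly cited guideline for adult viewers, not a figure from Able, so treat it as an adjustable assumption rather than a rule.

```python
# Flag captions that read too fast. The max_cps default is a commonly
# cited guideline for adult captions, not a standard from the article;
# tune it for your audience (children's content usually reads slower).

def too_fast(cues, max_cps=17.0):
    """cues: (start_sec, end_sec, text) tuples. Return cues exceeding max_cps."""
    flagged = []
    for start, end, text in cues:
        duration = end - start
        # zero or negative durations are always errors worth flagging
        if duration <= 0 or len(text) / duration > max_cps:
            flagged.append((start, end, text))
    return flagged
```

A check like this catches only the mechanical half of readability; spelling, grammar, and placement still need human eyes.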
Captioning has been around in broadcast content since the 1980s. Recently, technology has changed how it’s done, enabling audiences to watch much more with subtitles. Wendy explained to me that there’s some really interesting technology augmenting people’s abilities, especially for the deaf.
Hearing Loops are an older technology, yet still very popular. In a movie theater, for example, the Hearing Loop allows a person to connect their hearing aid device directly to the audio output feed. Also in participating cinemas, there are screens that plug into the seat, showing captions in sync with the movie. There are apps that use voice recognition to automatically caption on your phone what is being spoken on the movie screen; but both the app and the screen, Wendy noted, are not ideal. “It’s not great, having to have your phone screen on in a cinema. It’s disruptive and not exactly comfortable to have to look at both screens at once.”
We walked down the hall, past rows of captioners typing text and watching videos, and I asked how much of their work is done using AI. I was surprised when she told me that they use speech-to-text and Machine Learning in an unexpected way.
In a corner of a room was a woman wearing headphones and speaking to a computer as though it were alive, but just learning English. A cup of tea sat on the desk. I felt witness to a great deal of patience. “That’s one of our re-speakers, teaching the computer,” Wendy said.
We stepped out of the room, and Wendy explained that they work around the errors and limitations of speech-recognition AI by using a small team of individuals called re-speakers. Each re-speaker works with their own copy of a speech-recognition program called Dragon, taking time every day to train it on their individual patterns of speech. Then, when a live production goes to air, every time a person speaks, the re-speaker repeats the words to the computer. Because the software is so well trained on that one person’s speech, it captions automatically with very few errors. The re-speakers also speak all the necessary punctuation, getting mostly accurate text on screen with minimal delay. Wendy explained that this used to be done by specialist typists, and audiences had to accept that captions would be relatively slow and prone to errors.
As we strolled back through the office, I paused at a bookshelf across one wall. “Are these dictionaries?” I asked. Wendy smiled. “Yes, those are a blast from the past. We use the Oxford Online dictionary for all our spelling. But we like to keep the old library from before.” The bookshelf contained more dictionaries than I knew existed. There were books of slang, culinary terms, and thesauruses. The collection even contained different editions of favorite volumes, for what I imagined were simply nostalgic purposes known to librarians, linguists, and captioners.
As I was leaving, I asked Wendy whether any of their staff were blind or deaf. “My job is the only one here that doesn’t require a person to have good eyesight and hearing,” she explained. “That’s the thing about diverse hiring though. You have to consider what physical attributes really are required. Most desk jobs, especially, don’t require a person to be able-bodied.”
Wendy helped me realize that making content more accessible is easier than I thought. Aside from her practical solutions, she urged me to think about simply being a better ally to people with disabilities, using whatever influence I have to bring positive change. “The best thing you can do is influence those you’re dealing with to consider accessibility. Talk to your distribution chain. Ask yourself where is your film going, and how will it be made accessible throughout its life cycle? Ask whether screenings of your film will be accessible, and if not, why not? Do what you can to raise awareness about the importance of accessibility for everyone.”
One thing we can all easily do is consider how each production we work on can be made more accessible in some way. (See the addendum at the end of the article for ways of creating captions for the hearing impaired).
Dan Buckingham, as a producer, also acknowledged that even when the production wants to make their content as accessible as possible, sometimes you’re on a deadline and you have to make a compromise. In terms of seeing more people with disabilities in the office, I haven’t stopped thinking about how much value would be added to the team by someone who approaches everything in life from a different perspective, and who is by necessity an expert problem-solver. The changes we need to make to have more inclusive workplaces are so minimal, and would benefit the whole team. Dan told me the best thing we can do is “design for the edges”, that when you make something more accessible for someone, you make it better for everyone.
Addendum
If our content is going on YouTube, we can use the automatic captions created by speech-to-text AI, but I’m told we should always review and correct them because they can be quite error-prone. You can edit the automatic captions by going to Video Manager and clicking Edit > Subtitles and CC. In this menu, you can also click Add new subtitles or CC and either upload a caption file or type in your own.
YouTube Caption Editor
In Vimeo, when uploading a video, go to the Advanced tab in Settings, select Choose File, and upload your SRT, WebVTT, DFXP/TTML, SCC, or SAMI file.
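Since both SRT and WebVTT are accepted here, it is worth knowing the two formats are nearly identical: WebVTT adds a `WEBVTT` header and uses a period instead of a comma as the decimal separator in timecodes. A rough converter (an illustrative sketch that covers the common case, not an official tool, and it ignores WebVTT-only features like styling):

```python
# Convert SubRip (.srt) caption text to WebVTT. The formats differ
# mainly in the "WEBVTT" header and the decimal separator in
# timecodes (SRT uses a comma, WebVTT a period). Numeric cue
# indices are legal WebVTT cue identifiers, so they can stay.
import re

def srt_to_vtt(srt_text):
    lines = []
    for line in srt_text.splitlines():
        if "-->" in line:
            # 00:00:01,500 --> 00:00:01.500 (both timecodes on the line)
            line = re.sub(r"(\d{2}:\d{2}:\d{2}),(\d{3})", r"\1.\2", line)
        lines.append(line)
    return "WEBVTT\n\n" + "\n".join(lines)
```

Going the other direction is nearly as simple, which is why most captioning tools support both.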
Deafness, Blindness and Deafblindness FAQ
Everything you need to know about deafness, blindness, and deafblindness, from a day-to-day perspective
Search the blog to see if your question has been answered here
Send any unanswered questions to deafblindness at gmail dot com
6/29/10
How the Deaf/Blind Think and Dream
When I think to myself, I have a running dialogue in my head. If someone has never heard a voice before, how do they think?
How a deaf person thinks depends on what kind of exposure they’ve had to language and life in general. A deaf person who has no hearing at all or who has never heard speech in any way doesn’t know what voices or speech sounds like. Someone who has some hearing and can hear parts of speech does know what voices and speech sound like, but their idea of speech and voice is different from a typical hearing person’s.
For deaf people who have never heard speech, they think in the same language that they use. A deaf person who uses sign language will think in sign language, and will «see» the signs in their mind’s eye the same way hearing people «hear» a voice in their mind’s ear. Similarly, a deaf-blind person who uses tactile sign language «feels» the signs in their hands in the same way a deaf person «sees» the signs in their mind. A deaf person whose main exposure to language is in its written form might visualize their thoughts like print on paper or handwriting on paper. Many deaf people also say they think in images or just concepts in general, independent of language. Of course the details of how a particular individual thinks varies, but deaf people can and do think independent of sound (and language when they’ve had little or no exposure to language).
When I think of a word, I imagine it written on paper in my head. How does a blind person visualize the spelling of a word?
Depending on the writing format they use the most, a blind person might also visualize a word or its spelling in print. If the blind person uses braille too, they can «visualize» the braille in their heads, in the same way as print. Some blind people «feel» the braille under their fingers in the same way a sighted person «sees» a word in their mind.
There’s a lot of variety in how people think so the ways mentioned above aren’t necessarily the only ways people think.
How do deaf, blind, and deaf-blind people dream?
People’s dreams generally reflect their reality. Whatever extent of hearing or vision loss a person has in real life, they will generally experience the same in their dreams. Whether a person was born blind or deaf, or went blind or deaf a long time ago, also makes a difference. Someone who is born totally blind cannot visualize images. Someone born totally deaf cannot conceive of sound.
Both totally blind and deaf people can conceive of sound and images in different ways though. Some deaf people have visual associations with sound (such as moving lips) and also an association with vibrations. Blind people can «visualize» through their sense of touch, where their brain forms an «image» from tactile information. A blind, deaf, or deaf-blind person may also experience smells and tastes in their dreams more often.
People with progressive vision or hearing loss may either still be able to hear or see in their dreams, or their dreams may reflect their progressive loss with a delay in time.
Sometimes deaf or blind people mention having a sense of «knowing» where things are in their dreams even if they may not know where things are in real life. Some deaf people also mention a sense of telepathic communication or communication free of language in their dreams. Some deaf people also mention that their dreams come with automatic «closed captions» for all the dialogue in their dreams. Deaf people who use sign language often experience the people in their dreams signing, regardless of whether the person knows sign language in real life or not.
Because of the variety in how people think there might be other ways that deaf, blind, and deaf-blind people dream that haven’t been mentioned here.
How do totally blind people find braille signs?
I know that signs for bathrooms, exits, elevators, etc., have the braille under the regular writing, but how does a blind person find the braille in the first place, especially when they don’t have someone with them? (If they did, you wouldn’t need the braille anyway; you could just ask the person what it says.) That always confused me.
They’re always in the same place, either on the door or right next to the door at shoulder-height. Blind people learn to sweep the door and the wall with their hand to find the sign and then the braille on the sign.
1/16/11
Inner Sound, Inner Sight
Anonymous
If you do a simple Internet search of the terms «dark and silent» or «a world without sight or sound,» you’ll find many results about deaf-blindness. The combination of deafness and blindness is often referred to as being in a world of darkness and silence. But there is absolutely nothing dark or silent about deaf-blindness. If you have ever spent a few days in complete silence, or closed your eyes even for a moment, you will find that you hear sounds and see things that aren’t there. You may even be fooled at first, thinking what you’re hearing or seeing is real. But if you try to confirm with another person, you will soon realize that those sounds are all in your head. That is deafness and blindness. If you can’t see or hear your environment, your brain fills in the gaps for you.

I hear phones ringing; I see people out of the corner of my eye; I hear people call my name; I see colors flashing in my field of vision. But given the medical facts of my vision and hearing loss, I know those sounds and sights can’t be real. They’re out of the range of my sight and hearing. All I can see is light. All I can hear is very loud sounds near to me. I’m as close as you can get to totally deaf and blind without being totally deaf and blind. I’m one step away from «total» in both cases. In fact, most doctors would consider me total because I’m so close to it anyway. Apart from knowing when the light is on, I can see nothing. I feel vibrations before I hear the sound, and sometimes my brain converts vibrations I feel into sounds, so it’s hard sometimes to know if I heard something or felt it and think I heard it. Yes, I can hear a train passing me by, but I feel its weight under my feet and I feel the vibrations from the sound before I hear them.
After your vision and hearing have been gone for long enough, your brain sort of translates «vision» and «hearing» to your hands and body. I use the terms «vision» and «hearing» loosely here. I’m not talking about the image falling on your retina or the sound waves reaching your eardrums, but rather the images and sounds your brain constructs from your fingertips and body. I feel the braille characters of this essay I’m writing under my fingertips and «see» them in my head. I feel the music playing ever so slightly in the table when I rest my hand on it. If the music is loud enough, I feel it resonating in my ribcage, in my hair, and even in the area around my ears. Though my ears themselves don’t work, the shape and structure of the ears evolved to be conducive to vibrations, so I can feel the sounds resonating in my skull and the cartilage of my ears, even though I don’t hear them. Sometimes I can even feel the change in air pressure if a sound is loud and booming.
The world of deaf-blindness is far from a dead one. The world is constantly full of vibrations and smells and changes in temperature and air pressure. Many people seem to believe that without ears and eyes, the world becomes unmoving and still, but this couldn’t be farther from the truth. The world is very much alive. And that’s not even considering the non-physical aspects of life, such as relationships and fortunes and misfortunes. In fact, I am as easily startled by a powerful vibration as I used to be by a loud sound. A strong smell can be as distracting as a flashing light in one’s peripheral vision.
People’s hands take on personalities. Just as a man may have a sharp-looking face, he can also have sharp and pointed hands. Someone with a stiff facial expression carries that stiffness in their hands as well. A soft and gentle person has hands that are soft and gentle too. To communicate, I use sign language. Rather than «listening» with my eyes as the sighted deaf do, I «listen» by putting my hands lightly over the speaker’s hands. Just as a person’s words can be articulate in speech, so too can their hands: they form the hand shapes clearly and distinctly. Just as a drunk person mumbles in speech, a drunk signer mumbles through their hands as well. Some people have a «loud voice», their hands flying through the air and taking up a large space, while others are «soft-spoken», signing in a small space and changing their hand shapes as little as possible. Since I have to track people’s hands, I prefer whisperers to shouters. (My shoulders get quite sore if I have to follow someone who signs big.) And just as you might remember the boom of a former teacher’s voice or the high-pitched voice of your sister, I recognize the different styles of people’s signing. Likewise, the same person can speak gently into my hands by making their signs tender and slow, or yell into my hand by practically slapping it around and signing violently.
Just as some people wear bright colors and others prefer earthy colors, some people wear strong perfumes or use powerfully scented lotions or soaps, while others have a more subtle scent or a natural musk to them. Some people choose scents that blend well with their body scent while others choose ones that clash, just as some people dress tastefully for their body shape while others wear whatever’s in style without regard to how it appears on their particular body type.
The world of deaf-blindness is far from the barren image many people seem to have of it. It’s a different world, but it is nevertheless a world. I see through my hands and I hear through my body. My world is far from dark and silent. It is a world of inner sound and inner sight.
Understanding Assistive Technology: How Do Deaf-Blind People Use Technology?
Understanding digital accessibility challenges is easy if you know people with disabilities. But what if you’ve never seen a person who is blind use their computer or smartphone? We’re here to help you understand a little bit about what it’s like to use the internet if you have a disability. Check out our other articles in this series:
What do we mean by “deaf-blind?”
A person is considered deaf-blind if they have some combination of hearing loss and vision loss. While there are some people who are totally deaf and totally blind, the majority of deaf-blind people have some amount of residual hearing and/or vision. Some will have more vision than hearing; others will have more hearing than vision.
It is important to note that if you see someone using a white cane or a guide dog, that does not mean they are totally blind. It means they have sufficient vision loss that having a dog or cane makes it easier and safer for them to navigate independently.
Assistive technology used by deaf-blind people
Assistive technology (AT) is a broad term that refers to hardware and software that enable people with disabilities to access technology. Those who are deaf-blind use a combination of AT for blind users and AT for deaf users, depending on their individual needs.
AT for visual disabilities
For more detailed information about assistive technology used by people with visual disabilities, check out our previous two articles about users who are blind and users who have low vision.
AT for hearing disabilities
For more detailed information about assistive technology used by people who are d/Deaf or hard of hearing, check out our previous article.
The importance of braille for deaf-blind users
For some deaf-blind people, braille is the only way they can read. For others, braille is the fastest way to read.
Documents and transcripts
If documents and transcripts are formatted with accessibility in mind, they can be translated by braille devices for a person who is deaf-blind.
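To see why well-structured text matters, it helps to know that a braille display ultimately renders text as patterns of raised dots, one cell per character in uncontracted braille. The sketch below maps lowercase letters to Unicode braille patterns (Grade 1, uncontracted, letters only); real braille translation involves contractions, capital and number signs, and language rules, and is handled by the device or a translation engine, so this is purely illustrative.

```python
# Map lowercase letters to Unicode braille patterns (Grade 1,
# uncontracted). In the Unicode Braille Patterns block, dot n of a
# cell corresponds to bit (n - 1) above U+2800.

# Dot numbers for the first decade of letters, a-j.
BASE = {'a': '1', 'b': '12', 'c': '14', 'd': '145', 'e': '15',
        'f': '124', 'g': '1245', 'h': '125', 'i': '24', 'j': '245'}

def _cell(dots):
    code = 0x2800
    for d in dots:
        code |= 1 << (int(d) - 1)
    return chr(code)

DOTS = dict(BASE)
for src, dst in zip('abcdefghij', 'klmnopqrst'):   # k-t: a-j plus dot 3
    DOTS[dst] = BASE[src] + '3'
for src, dst in zip('abcde', 'uvxyz'):             # u,v,x,y,z: a-e plus dots 3 and 6
    DOTS[dst] = BASE[src] + '36'
DOTS['w'] = '2456'                                 # w falls outside the historical pattern

def to_braille(text):
    """Render letters as braille cells; pass other characters through."""
    return ''.join(_cell(DOTS[c]) if c in DOTS else c for c in text.lower())
```

Production systems use a full translation engine (Liblouis is a common open-source one) rather than a letter map like this, but the principle of text flowing through to tactile cells is the same.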
Phone communication
Each state has a program for people who are deaf or deaf-blind to make and receive phone calls. In Level Access’s home state of Virginia, the program is Virginia Relay, and it is available 24/7.
A Virginia resident who is deaf-blind can make phone calls using TTY (text telephone)-to-braille with the help of a Virginia Relay Communication Assistant (CA). The CA serves as the interpreter between the two parties and types out the messages so the deaf-blind person can read them using their braille device. Virginia Relay’s services are also available in Spanish for in-state calls.
Remote Conference Captioning is also available for deaf-blind users who participate in conference calls for work.
Free equipment for deaf-blind people
The federal program iCanConnect offers free technology and training for those who are deaf-blind. iCanConnect is the easier-to-remember name of the National Deaf-Blind Equipment Distribution Program (NDBEDP—say that three times fast!), which was established by the FCC as part of the CVAA (the 21st Century Communications and Video Accessibility Act).
The mission of this program is to give deaf-blind people access to the technology they need to stay connected with the world. This includes braille devices, computers, tablets, smartphones, vibrating alert devices, accessories, and software.
Accessibility barriers for deaf-blind users
Here are some accessibility issues that restrict access for people who are deaf-blind. Since each deaf-blind person has a different degree of hearing and vision loss, accessibility issues can come from both sides!
Accessibility issues for visual disabilities
Accessibility issues for hearing disabilities
The bottom line: Design to include deaf-blind people
You can design your websites, software, and hardware with people with disabilities in mind and you can retrofit existing technology to be accessible. It’s a win-win situation for your organization (more clients, more revenue, more contracts) and people with disabilities (less confusion, less frustration, less isolation). Some fixes, like ensuring your video and audio files have accessible transcripts, are quick to implement and make a big impact on the user experience.
WORKING WITH PEOPLE WHO ARE DEAF-BLIND
Image: The author at the 2004 Seabeck convention in the State of Washington with his friend Willie Wilkinson and others.
The iris unfolds like a flower. The pupil (the hole formed by the iris) widens or narrows depending on how much light the eye needs to receive. The cornea is a transparent layer over the iris. The lens focuses light on the retina at the back of the eye. The macula is an area with a dense population of light sensors. The optic nerve sends signals from the retina to the brain.
TYPES OF DEAF-BLINDNESS
You don’t need to memorize these types, but understanding them will help you voice for Deaf-Blind people when they discuss their Deaf-Blindness, and will help you understand how and why to modify your interpreting to fit their needs.
Usher syndrome
Sometimes misspelled «Usher’s syndrome». Usher syndrome is the most common condition that affects both hearing and vision. A syndrome is a disease or disorder that has more than one feature or symptom. The major symptoms of Usher syndrome are hearing loss and an eye disorder called retinitis pigmentosa, or RP, which causes night-blindness and a loss of peripheral (side) vision through progressive degeneration of the retina.
Image: Usher Syndrome inheritance: 1 affected, 1 unaffected, and two carriers among the children.

The retina is a light-sensitive tissue at the back of the eye and is crucial for vision. As RP progresses, the field of vision narrows, a condition known as tunnel vision, until only central vision (the ability to see straight ahead) remains. Many people with Usher syndrome also have severe balance problems. There are three clinical types of Usher syndrome: type 1, type 2, and type 3. In the United States, types 1 and 2 are the most common; together, they account for approximately 90 to 95 percent of all cases of children with Usher syndrome.

Macular degeneration
Macular degeneration is a medical condition, predominantly found in elderly adults, in which the center of the inner lining of the eye, known as the macula area of the retina, suffers thinning, atrophy, and in some cases bleeding. This can result in loss of central vision, which entails an inability to see fine details, to read, or to recognize faces. According to the American Academy of Ophthalmology, it is the leading cause of central vision loss (blindness) in the United States today for those over the age of fifty.

Cataracts
A cataract is an opacity that develops in the crystalline lens of the eye or in its envelope. «Cataract» derives from the Latin cataracta, meaning «waterfall» (you can’t see through white water). Cataracts develop for a variety of reasons, such as advanced age or the secondary effects of diseases like diabetes and hypertension. They are usually a result of denaturation of lens proteins, similar to the way clear egg white clouds after cooking.

Glaucoma
Although raised intraocular pressure is a significant risk factor for developing glaucoma, there is no set threshold of intraocular pressure that causes it. One person may develop nerve damage at a relatively low pressure, while another may have high eye pressure for years and never develop damage. Untreated glaucoma leads to permanent damage of the optic nerve and resultant visual field loss, which can progress to blindness.

Diabetic retinopathy
Over time, diabetes affects the circulatory system of the retina. The earliest phase of the disease is known as background diabetic retinopathy. In this phase, the arteries in the retina become weakened and leak, forming small, dot-like hemorrhages. These leaking vessels often lead to swelling, or edema, in the retina and decreased vision. The next stage is known as proliferative diabetic retinopathy. In this stage, circulation problems cause areas of the retina to become oxygen-deprived, or ischemic. New, fragile vessels develop as the circulatory system attempts to maintain adequate oxygen levels within the retina; this is called neovascularization. Unfortunately, these delicate vessels hemorrhage easily. Blood may leak into the retina and vitreous, causing spots or floaters along with decreased vision. In the later phases of the disease, continued abnormal vessel growth and scar tissue may cause serious problems such as retinal detachment and glaucoma.

Hemianopia (or hemianopsia)
Hemianopia is blindness or a reduction in vision in one half of the visual field (hemi as in hemisphere + opia as in myopia) due to damage to the optic pathways in the brain. This damage can result from acquired brain injuries caused by stroke, tumor, or trauma. A stroke occurs when a group of nerve cells in the brain is damaged, often due to interrupted blood flow caused by a blood clot or a leaking blood vessel. Depending on the area of the brain that is damaged, a stroke can result in coma, paralysis of one side of the body, speech and vision problems, or dementia. The vision may be gone in the right half, left half, upper half, lower half, or outer half (periphery), in one eye or both.
Behaviors that may indicate a visual impairment
Because most Deaf-Blind people have Usher Syndrome, and its symptoms often only become obvious around adolescence, you may well interpret for people who do not realize, or are only coming to realize, that they are deaf-blind. For this reason it is good to know the behaviors that indicate a person has a visual impairment. One semester, on the first day I interpreted for someone in an ongoing assignment, I was already seated house right and he went over and sat house left. I asked him if he wanted me to move and he told me to stay where I was. This told me that he probably had a scotoma (blind spot). Does the consumer bump into objects? Move hesitantly or walk close to the wall? Search for objects or touch them in an uncertain way? Tilt his or her head to see? Request additional or different kinds of lighting? Hold books or other visual material close to the face? Drop objects or knock them over? Show difficulty making out faces or the numbers that designate rooms or floors? Act confused or disoriented, for example, walking into the wrong room by mistake?

Deaf-blind, deaf/blind, or deafblind?
In 1991, Salvatore Lagati in Italy began a crusade to gain international acceptance of the single word «deafblind» in place of the hyphenated «deaf-blind». The single word would indicate a distinct, unique condition, one in which the impact of the dual losses is multiplicative rather than additive. This proposal faces an uncertain future in the United States, where terminology has been a hotly debated issue for some time and political correctness seems to have greater influence than in many other countries. Recent synonyms have included «dual sensory impaired», «auditorally and visually challenged», person «with deaf-blindness», etc.
Editorial policy for Deaf-Blind Perspectives (Reiman, 1993) requires the language «person who is deaf-blind», and this usage seems to have general acceptance in the U.S. Perhaps, if Salvatore Lagati keeps up his crusade, «person who is deafblind» will one day have global acceptance. The term «Deaf-Blind» implies a deaf-blind person who is culturally integrated into the Deaf-Blind community, similar to the distinction between «deaf person» and «Deaf person». I will use the capitalized form. People with Usher Syndrome often prefer to be called USHER-SYNDROME PERSON ([U], PO away, FO up, touches DS temple, then [S], PO away, FO up, touches the DS of the chin) rather than DEAF BLIND.

MEETING AND NEGOTIATING NEEDS

Image: Leader Dogs for the Blind: Deaf Blind Program.

Because Deaf-Blind people have lessened hearing and sight, they depend more on their other senses, including smell. Try to avoid both strong natural body odor and strong artificial odors: bathe well, take care of oral and body odor, and avoid heavily perfumed products. Also wash your hands frequently to avoid passing your own germs to Deaf-Blind people, or germs from one Deaf-Blind person to another.

When you first meet, if the person is standing, touch their shoulder while standing in front of them to let them know you are there. If they are seated, touch the back of the hand, and when they raise it, slide your hand underneath theirs. If they are speaking with someone when you arrive, touch their shoulder from the front and wait until they finish their conversation; some people keep a hand on the shoulder as a «reminder» that they are waiting. Don’t tap from behind, since they won’t be expecting it. Be flexible with your communication method: your client may be new to Deaf-Blindness or have special needs (we will discuss communication options in a moment). State your name, name sign, and business first, then chat.
Even a casual conversation (not an interpreted event) may lead to a need to guide the person to the bathroom or to find someone, so we will explain guiding procedures soon.
For an intensive introduction to all aspects of Deaf-Blind work, see Smith, T. B. (October 1994). Guidelines: Practical tips for working & socializing with Deaf-Blind people. Linstok Press. ISBN: 1881133060.
ACTIVITY 2: Although we have not discussed how to guide a Deaf-Blind person yet, I want you to experience what it is like to be guided by someone who doesn’t know how. Hopefully your guide will be kinder than the one in the ad on the right. Find a partner; the two of you will take turns being the guide and being the Deaf-Blind consumer. Find an environment where you will need to pass through doors, take stairs, and, if possible, walk on both carpeted and uncarpeted floors. After you both are done, discuss what you liked and disliked as a guide and as a consumer. The easiest way to make a blindfold, if you don’t have one, is to take a sock and safety-pin it around your eyes.
Notice if consumers seem to move their head as if working around blind spots in their visual field; you may need to adjust where you are sitting or change your signing space. Clothing contrast is still important: dark green, brown, or grey is best for light complexions, and light shades of those colors for dark complexions. Many Deaf-Blind people are even more sensitive to bright colors than sighted Deaf people. Begin slowly with a new person until they are used to you and you see how best to communicate. The lighting should be bright without glare; don’t face the Deaf-Blind person into the sun or a major light source. If you are using your voice, hard-of-hearing Deaf-Blind people may need you to speak toward their better ear and to sit away from noise.
Close visual (CV)
Determine the best distance at which to position yourself. If you see consumers moving their heads, you may be signing too large. Be aware that although the consumers may see you, they may not see other visual information (the board, the PowerPoint, who is speaking). Vision in the same person may change depending on health, sleep, and emotions: what people see one day, they may miss the next.
Tracking & tactile signing
With tracking, the Deaf-Blind person will hold onto your wrists, partly to limit your arm movement and partly to reinforce how the hands are moving. Two short squeezes (or pats on the hand) usually means, «Yes, I understand what you are saying.» One long squeeze usually means, «I didn’t get that.» It could mean, «Oh my goodness!» Judge by facial expression.
If there is a pause, you can put your hands down, but it is best to maintain contact by resting your hands and the client’s on your knee or lap. Negation made only with the head is hard to see; add NOT, #NO, NONE, etc. Questions may need a QM (question mark) or D-I-D. If the person uses both hands to receive ASL, it’s best to sit facing each other with knees interlaced; if one-handed, sitting at 45 to 90 degrees is best. Sign with energy and clarity, despite feeling restricted, but don’t be wild.
Don’t duck your head to make signs like MOTHER, since that obscures the difference between signs that differ only in location, such as MOTHER, FATHER, and FINE. Hunching also makes your signing space smaller. Information about affect that is usually conveyed by facial expression should be added with extra signs, such as FOR-SURE, DOUBTFUL, QUESTION-MARK, and so on.
If you are signing, pause slightly before fingerspelling a word and slightly afterwards to check for comprehension. When Deaf-Blind people put out their non-dominant hand, they are requesting (back-channel) feedback; respond with YES, OH-I-SEE, WHAT-CAN-I-SAY, and so on. Touch is especially important for Deaf-Blind people: it is their link with the world, and it can show whether you are nervous, withdrawn, friendly, tired, or bored.
You may be uncomfortable «holding hands» during pauses, but it is best to wait for the Deaf-Blind person to break contact. It keeps that link and makes it easier for the Deaf-Blind person to get your attention. Later, when you know the person better, touch will also include an occasional squeeze, stroke, pat on the back, walking close, or a hug of greeting and farewell.
Try to think of ways to communicate through touch to make up for smiles and frowns (a pat on the hand, #HA-HA, a gentle nudge). Don’t tease by poking, tickling, or jostling, even though your intentions are friendly; sighted people can see such things coming, but Deaf-Blind people can’t and are startled. If the person’s hands feel heavy, it may mean they are tired or having difficulty understanding, so be aware of a need for a break in the conversation or interpretation. If Deaf-Blind people start a private conversation that you sense they may not want to share with others, remind or inform them if other people may be watching; they may have forgotten, or the people may have shown up after you first began chatting.
If someone interrupts your conversation, tell the Deaf-Blind person what is happening and interpret or allow access to the conversation. Don’t leave a Deaf-Blind person waiting during a lengthy conversation. If a hearing person is busy with a Deaf-Blind person, another hearing person can say hello to the first hearing person, who can respond without looking away. When a Deaf person is busy with a Deaf-Blind person, they must break their eye contact, and therefore their concentration, so it’s better to wait until the Deaf-Blind person is free to look your way. With two Deaf-Blind people chatting, it is even more of an interruption, and you definitely should wait until there is a lull in the conversation. Help other people who are new to the Deaf-Blind world learn to communicate with Deaf-Blind people, and don’t be surprised if people, even Deaf people, are reluctant to communicate tactilely.
If the Deaf-Blind person needs all fingerspelling, pause slightly between words. Some people use Braille abbreviations. If you will be doing this often, you may want to learn them. Give yourself and your consumer rest breaks when possible.
Print on palm (POP)
Print on palm means to draw the letters of words, one by one, on the palm of the Deaf-Blind person. Some Deaf-Blind people need this for numbers only, drawing a «7», for example, instead of spelling out S-E-V-E-N or signing it. There is a preferred stroke order, which is illustrated above. HKNC recommends: «With your index finger, print your message in the palm of the hand of the person who is Deaf-Blind. Use capital letters only, except for the letter ‘i’, which is lower case. Print only in the palm area. Do not connect letters. Pause after each word. If you make a mistake, ‘wipe’ the palm, then print the correct letter. If the person has speech, he or she may say each letter and word aloud as you spell it. This is a good way to know that your message is being understood.»

Walking: In general, the easiest way to guide a Deaf-Blind person is to have them hold onto the back of your upper arm and walk slightly behind you. In this way, they can feel when you turn, stop, or step up or down to a different level. Pause slightly before any change in motion (turning, stairs, a change in walking surface) to alert your partner, but you don’t need to explain every little change. If you will stand still for more than a moment, explain what the holdup is.
Crowded spaces: Put your leading arm slowly behind your back to indicate that your partner should move directly behind you if the passage will be narrow for a brief time. If it will be narrow for more than a moment, you can put your partner’s hand on your shoulder and guide them in this way until the passageway clears up.
Stairs: Some people want to be told that there are stairs; some just want you to pause, step down or up onto the first stair, and then proceed. If there is a handrail, I usually place my partner’s hand on it, and then they can continue to hold onto me with their other hand or use their cane if they have one.

Seating: The guide should place the partner’s hand on the back of the chair where the blind person will sit. This assumes that there are no obstacles to the seat, such as theatrical seating that is not on the aisle.
ACTIVITY 4: Now that we have taught you the standard ways to guide for these kinds of activities, you can repeat the guiding activity with some improvements. Pick a pathway with your partner that includes doors, stairs, and the bathroom (try to pick a partner of the same gender so that you can go in with them). Guide your partner along the route, take them back to their seat, have them remove the blindfold, and switch roles. If you are more adventurous, you could read about escalators, crossing the street, and other situations below and try those, too. A good place to practice these skills is a hotel or a college building.
Elevator: Inform your partners that you are waiting for an elevator. While you are waiting, ask which floor they need if you do not know, to make for a smooth entry and exit. Pat their hands when the doors open and when it’s time to exit.
Escalator: Notify them of what’s ahead. Walk onto the steps with them and guide their hand to the handrail. Continue to watch behind you for loss of balance. If they decide to separate from you, they will notice that the end is near with their cane and from the falling away of the handrail. If you remain together, walk off the escalator without pausing.
Food line: Read the menu to your partners while in line. Give them a tray with utensils if necessary, and get a tray and utensils for yourself as well. Announce each option as you come to it. Your partners may carry their tray with one hand and put the other hand on your shoulder; in this way you can get food for both of you at the same time. After seating them as described above, describe where the food is on their dish by using a clock face: «Your vegetables are at 3 o’clock, your meat is at 6 o’clock, and your potatoes are at 9 o’clock.»
Sit-down/banquet meal: You could read the entire menu to the D-B person, but it is better to tell them the broad categories: «appetizers, seafood, meat», etc. Inform the D-B person when the food has arrived (FOOD, move their hands). Later, when the server asks if everything is alright, don’t answer first, because the server will leave before your partner has an opportunity to answer. Say, «Just a moment please while I ask.» After your partner has answered, you can ask for whatever you need.
Guiding outdoors: If there is an overhang, such as tree branches or hanging decorations, and you can’t easily go around it, pause, hold the overhang out of the way while you pass with your partner, and continue. While crossing a street, pause, inform if necessary, step down, make sure the Deaf-Blind person «follows», and continue. Don’t cross against the light, even if it seems clear. Some Deaf-Blind people love extreme terrain (rocks, sand, sticks); help them do it safely, if you have an adventurous spirit.
It’s proper to let Deaf-Blind people speak for themselves about their culture. Here are some links to videos that will show the human side of Deaf-Blind life. Watch these and then consider the questions below them.
The story is told in a melodramatic way (that’s part of American news style), but there is a «tragic» element for parents and for those who had vision or hearing before. What are the frustrations and disappointments? Some of the forms of denial for newly-aware deaf-blind people include refusing to use a cane, to give up driving, or to use tactile ASL. Is acceptance possible?
What are some of the things she will miss? What responsibility does that place on us as interpreters to include such information in the settings we interpret?
This is an example of a well-adjusted and productive Deaf-Blind person. Intervenors (intervention) = SSPs. Why is there a need for intervention services? What are the communication barriers? What are the misconceptions and discrimination faced by the deaf-blind? Why do deaf-blind people need to get out more in their community?
«Na l’ga’at» (the name of the group) means «Please touch.» What is the role of art in general? How can Deaf-Blind people use art to teach Hearing and Deaf people what their perspective is?
What are some of the emotions that Coco experienced in her poem? How can interpreters/SSPs help as friends and as professionals?
To read written material, there is Braille. (A) It uses a pattern of two columns and three rows of raised dots to represent individual letters. (B) One finger remains still to hold the place while the other scans over the bumps to read words letter by letter. (C) This makes for very bulky reading materials, so Grade 2 Braille has abbreviations (contractions) to make materials more compact. (D) Braille may be punched out by hand using a template, (E) typed out using a Perkins Brailler, or (F) input by keyboard and punched out later by a computer onto Braille paper. (G) CCTV (Closed Circuit Television) has a camera that projects a magnified image onto a screen, allowing a person to read any printed matter or look at pictures. (H) ZoomText is a computer program that magnifies the image that appears on a computer screen. There are various other methods of magnification using lenses.
Image: panels A-I illustrating the reading technologies described above.
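The six-dot cell described in (A) can be modeled as a simple encoding. Below is a minimal sketch in Python (my own illustration, not from the source) that maps a few letters to their standard dot numbers and renders them as Unicode braille characters, which reserve one bit per dot starting at U+2800.

```python
# Minimal sketch: uncontracted braille as a 2-column x 3-row dot cell.
# Dots are numbered 1-3 down the left column and 4-6 down the right.
# Unicode braille patterns start at U+2800, with one bit per dot number.

# Standard dot assignments for the first ten letters (a-j).
LETTER_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
}

def to_braille(text):
    """Render lowercase text (letters a-j only here) as Unicode braille cells."""
    cells = []
    for ch in text:
        dots = LETTER_DOTS[ch]
        # Each dot d sets bit (d - 1) of the offset from U+2800.
        codepoint = 0x2800 + sum(1 << (d - 1) for d in dots)
        cells.append(chr(codepoint))
    return "".join(cells)

print(to_braille("bad"))
```

Contracted braille adds many-to-one abbreviations on top of this letter-level encoding, which is what makes printed braille materials more compact.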
(J) For people who have difficulty reading a regular TTY, there are large-display TTYs. This technology is on the way out for sighted Deaf people, but some Deaf-Blind people use it for phone calls, as well as in the presence of someone who can’t sign. (K) For those who can’t see even a large-display TTY, there is TeleBraille, which translates the output of the sending TTY onto a grid that pushes up little nubs in Braille patterns. The person receiving can go at their own pace, because the Braille stays there until they refresh it by pushing a button, at which point the next series of Braille patterns appears.
Mobility and orientation
Canes are an old technology that allows blind people to feel the environment in front of them. (L) Folding canes allow for compact storage when they are not needed. (M) Radar devices can be attached to a cane; they beep or vibrate before the person has actually touched the obstacle. (N) Guide animals (dogs and small ponies!) can be trained to guide a blind person away from danger and even crouch to warn of overhangs!
(O) Alerting technology used by deaf people can be modified for Deaf-Blind people by having it vibrate or send a signal to a wristband that vibrates. These receiver-transmitters are used for alarm clocks, doorbells, fire alarms, and so on.
Image: panels J-O illustrating the technologies described above.
OTHER COMMUNICATION SYSTEMS
«Tadoma is a method of communication used by Deaf-Blind people, in which the Deaf-Blind person places his thumb on the speaker’s lips and his fingers along the jawline. The middle three fingers often fall along the speaker’s cheeks, with the little finger picking up the vibrations of the speaker’s throat. It is sometimes referred to as ‘tactile lipreading’, as the Deaf-Blind person feels the movement of the lips, as well as the vibrations of the vocal cords, the puffing of the cheeks, and the warm air produced by nasal sounds such as ‘N’ and ‘M’. It is sometimes used in the United States, but it is rather rare. In some cases, especially if the speaker knows sign language, the deaf-blind person may use the Tadoma method with one hand, feeling the speaker’s face, and at the same time use their other hand to feel the speaker sign the same words. In this way, the two methods reinforce each other, giving the deaf-blind person a better chance of understanding what the speaker is trying to communicate.» In addition, the Tadoma method can provide the deaf-blind person with a closer connection to speech than they might otherwise have had. This can, in turn, help them retain speech skills they developed before going deaf and, in special cases, learn how to speak brand-new words.

Image: Patrick Dowdy and Robert Smithdas using the Tadoma Method.
«LORM» is not just a random aggregation of abbreviations; it is a name, the pseudonym of a Deaf-Blind poet, philosopher, and combative journalist, Heinrich Landesman (August 9, 1821 – December 3, 1902), a native of the South Moravian city of Mikulov, who due to persecution used the pseudonym Hieronymus LORM. After a sudden loss of hearing at the age of 15 and a slow worsening of his sight, Hieronymus LORM created one of the hand-touch alphabets of the Deaf-Blind, the so-called Lorm alphabet, which is still in use in the Czech Republic as well as in many other countries.
Image: The Lorm Deaf-Blind Manual Alphabet.
for B, strike downwards along his index finger from the tip till just above his palm (so don’t touch the palm).
for C, touch the middle point of the lowest part of the palm of his hand.
for D, strike downwards along his middle finger from the tip till just above his palm (don’t touch palm).
for E, touch the tip of his index finger.
for F, gently squeeze together the tips of his index finger and his middle finger.
for G, strike downwards along his ring finger from the tip till just above his palm (don’t touch palm).
for H, strike downwards along his little finger from the tip till just above his palm (don’t touch palm).
for I, touch the tip of his middle finger.
for J, gently squeeze the tip of his middle finger.
for K, touch with all your tips together in the middle of the palm of his hand.
for L, strike downwards with three tips along his index, middle and ring finger from tip till just above his palm (don’t touch palm).
for M, touch with three tips the higher part of the palm of his hand in a horizontal way.
for N, touch with two tips the higher part of the palm of his hand in a horizontal way.
for O, touch the tip of his ring finger.
for P, strike upwards on the outside of his index finger.
for Q, strike upwards on the outside of his hand (on the side of the little finger).
for R, drum slightly with several of your tips in the middle of the palm of his hand.
for S, make a circle in the middle of the palm of his hand.
for T, strike downwards on the outside of the thumb.
for U, touch the tip of his little finger.
for V, touch with one tip in the palm (left side) between the basis of the thumb and the index finger.
for W, touch with two tips (in a vertical way) in the palm (left side) between the basis of the thumb and the index finger.
for X, strike horizontally (from left to right) over the wrist.
for Y, strike horizontally (from left to right) over the center of all the fingers.
for Z, strike horizontally (from left to right) over the center of the palm of his hand.
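Since each Lorm letter is a fixed touch on the receiver’s hand, the alphabet can be thought of as a simple lookup table from letter to stroke. The sketch below is my own Python illustration of that idea (the function and table names are assumptions, not part of any Lorm standard); it covers only a few letters from the list above.

```python
# Minimal sketch: the Lorm alphabet as a lookup from letter to touch instruction.
# Only a few letters from the list above are included, for illustration.
LORM_STROKES = {
    "c": "touch the middle point of the lowest part of the palm of his hand",
    "e": "touch the tip of his index finger",
    "i": "touch the tip of his middle finger",
    "o": "touch the tip of his ring finger",
    "s": "make a circle in the middle of the palm of his hand",
    "u": "touch the tip of his little finger",
}

def spell_lorm(word):
    """Return the sequence of touch instructions for each letter of the word."""
    return [f"{ch}: {LORM_STROKES[ch]}" for ch in word.lower()]

for step in spell_lorm("use"):
    print(step)
```

In practice, of course, the sender performs the strokes on the receiver’s hand rather than reading them aloud; the point here is only that Lorm spelling is a letter-by-letter substitution, like fingerspelling.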
Tactile Fingerspelling in England
British fingerspelling uses two hands, but for tactile fingerspelling the receiver holds out a flat palm and the sender uses a modified system in which the letters are spelled onto the receiver’s palm. See a comparison of the two below. I was surprised to learn from a Deaf-Blind Brit at an American Association of the Deaf-Blind conference that although there are some Deaf people who used to sign, once they become Deaf-Blind, they all revert to tactile fingerspelling.
For more information on Deaf-Blindness in America and around the world, see my Resources for working with Deaf-Blind people.