TETN 20442 O&M and Assistive Listening Devices This video is posted online with the following chapter markers: Chapter 1. Introduction - Introduction of moderators Ruth Ann Marsh, Statewide O&M Consultant for the TSBVI Outreach Program; Robbie Blaha, Educational Consultant for the Deafblind Team; and Lisa Sutherland, Educational Audiologist. Chapter 2. Types of Hearing Loss - Brief overview of the most common types of hearing loss: conductive, sensorineural, and mixed. Chapter 3. Assistive Listening Devices - Brief discussion of how digital assistive listening devices are programmed to enhance sounds, especially speech. Chapter 4. Collaboration - The importance of communication and teamwork between the family, O&M specialist, teacher for the deaf/hard of hearing, and the audiologist. Chapter 5. Cochlear Implants - Brief overview of the technology of cochlear implants and the increasing numbers of children who are being implanted. Chapter 6. FM Systems - FM Systems or Bluetooth devices could be used effectively by O&M specialists to give students direct instructions through a compatible assistive listening device. TETN 20442 O&M and Assistive Listening Devices Transcript [ Slide start: ] Chapter 1. Introduction Ruth Ann Marsh: Okay. I'm Ruth Ann Marsh. [ Slide end: ] I'm the state-wide O&M consultant for the Texas School for the Blind Outreach Program, and I want Robbie Blaha, who's also from Outreach, to introduce herself. Robbie Blaha: I'm Robbie Blaha. I'm here today as the ‑‑ as the teacher of the deaf/hard of hearing and also a member of the Texas deafblind team. I wanted to quickly say, about my experience with the deaf/hard of hearing: I taught two years at Texas School for the Deaf, I taught regional day school for three years, and then I was an AI itinerant in two districts. That's my background as a teacher of the deaf/hard of hearing. Marsh: Okay. 
The focus of this TETN is about teaching identification and use of environmental sounds when you are working on O&M skills with students that are deafblind. Robbie and I watched a TETN last year and it was done by Susan Tiggs, about hearing technology and how hearing works and everything, and we started talking and thought it would be good just to kind of tag on to that previous TETN. Blaha: I think another person we need to introduce is Lisa Sutherland, she will be on the videotapes that you see. Lisa is an Audiologist and she's an Educational Audiologist which is a field of specialty. And has worked for regional day schools and ‑‑ in the Central Texas area. Marsh: We were really fortunate to get her to participate in this. I personally learned a lot from her and I think she learned something from us because we ‑‑ we really come from totally different perspectives and that's one of the things that both of us learned. So we're going to go ahead and start out with the videos. Chapter 2. Types of Hearing Loss [ Video start: ] Marsh: Hi, I'm Ruth Ann Marsh, the state‑wide O&M consultant for the Texas School for the Blind and Visually Impaired, today we have Lisa Sutherland who is our guest speaker and who is going to help with a lot of information. Lisa, you want to tell us what you do? Lisa Sutherland: I'm an Educational Audiologist. I have a company that contracts with school districts to provide support for kids with hearing loss. [ Slide start: ] There are three main types of hearing losses. [ Slide end: ] There's conductive and there's sensorineural and there's mixed. And I think probably -- well, the most important thing for people to understand about the differences between those is that the conductive hearing losses are typically due to a problem about the sound being blocked somewhere either in the ear canal or in the middle ear space. 
While sensorineural hearing losses are due to a problem either in the organ of hearing, the cochlea, or the nerve as it travels to the brain. Many... many conductive hearing losses can be treated medically, either with surgery or medication if you have an ear infection or you had wax blocking the sound. And sensorineural hearing losses are typically not treated with surgery or medicine. Conductive hearing losses, if it is deemed that medical intervention is not going to help resolve the hearing loss, they may be fit with hearing aids, although the majority of people that use hearing aids have sensorineural hearing losses. The conductive hearing losses, assuming the sound is loud enough and it can travel through that middle ear system, it will get to the cochlea and the nerve and travel to the brain and it will be beautiful and clear, it's just a matter of it being loud enough. The difference between that and a sensorineural hearing loss is that those losses you can make them loud, and pull those sounds into an area that's audible for the person that has hearing loss, but they are not always clear because of how the cochlea or the nerve is sending that signal up to the brain. So while we see benefit with hearing aids, the expectation of how clear that sound is is going to be very different between someone with a cochlear hearing loss and a conductive hearing loss. Folks with conductive hearing losses that use hearing aids typically do really well; it's clear, it's beautiful once they have the hearing aids on. But the sensorineural hearing loss folks may still have difficulties, typically still have difficulties discriminating the sounds that they are hearing because the sound is being distorted by the system. So you can have that purely conductive, you can have a sensorineural or mixed is just a combination of those two. And then we talked earlier a little bit about central auditory processing, or auditory processing disorders. And I just wanted to touch on that. 
That is kind of a fourth type of hearing loss that you don't always see listed among the types of hearing loss, but it is out there and you see it more and more lately being diagnosed. And that's how the central nervous system processes that auditory information once it gets to the brain. So that's another type of hearing loss that we often see, and don't typically use hearing aids with, but may use other types of assistive listening devices to help them understand speech in noise. Marsh: Okay. So in an audiology report it would say what type of hearing loss it is? Sutherland: It should, it should. In the state forms that have the A and B, there's a section in both Part A, the physician's part, and Part B, the audiologist's part, where it will say type and severity, and type is where they should be saying if it's sensorineural or conductive. [ Video end: ] Blaha: There were a couple of things that I wanted to comment about at this point. When you have an orientation and mobility instructor working with an individual who has a hearing loss and uses assistive listening devices, these children are considered deafblind, of course, and one of the things we know about deafblindness is that when people say a conductive loss can usually be treated with amplification and the hearing is very clear, or someone has a mild hearing loss, all of this information is based on people with sight. They have the ability to visually compensate for any problems that they might have in receiving sound. So ‑‑ kind of be careful when you are looking at an audiological report: if it says a mild or moderate loss, those are functional terms and they are based on hearing impaired people who have sight. 
And one thing that they have found by doing brain scans on deaf people who use sign language is that if they have a profound or severe hearing loss, the parts of the brain that process sound ‑‑ primarily located on the upper sides of the temporal lobe ‑‑ take on visual jobs from the occipital lobes. So some deaf people really have super vision almost, and are very attuned to very slight nuances visually that other people aren't. So please keep that in mind: if someone says this child has a mild hearing loss, the assumption is that it's mild if you can see, and see really well. And when you are low vision or blind, that's not the case; you aren't very efficient in compensating at all. The other thing that I wanted to mention: they talked about a central processing disorder. And usually that is caused by some diffuse damage to the areas in the upper temporal lobe, and it's usually considered bilateral. And that's caused by infection or anoxia and things that we associate with CVI. You can also have a stroke or a lesion in one side, but you see more damage when both sides are involved. That's a central auditory processing disorder. And I wouldn't be surprised if kids with cortical visual impairment also have this condition, because what causes one can cause the other. Then there's another kind of loss called auditory neuropathy that can sometimes be referred to as central, but it's really about the 8th cranial nerve. The damage is to the nerve, and so you have auditory neuropathy. The signal is going through the middle ear, going through the cochlea, but the damage is to the auditory nerve. And so any of these things can really make it difficult and challenging for orientation and mobility instructors, who really, really [ Slide start: ] count on kids being able to use what they hear to be safe and oriented and successful travelers. 
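Robbie's caution about "mild" and "moderate" labels is easier to see against the scale those labels come from. As a rough sketch ‑‑ the exact dB HL cutoffs vary somewhat between clinical sources, and the function name here is ours, not from any report form ‑‑ the severity terms on an audiological report map onto pure-tone thresholds roughly like this:

```python
# Approximate audiometric severity categories (dB HL thresholds).
# Cutoffs differ slightly between clinical sources; these follow a
# common convention and are illustrative only.
def severity(threshold_db_hl):
    """Map a pure-tone average in dB HL to a severity label."""
    if threshold_db_hl <= 25:
        return "normal"
    elif threshold_db_hl <= 40:
        return "mild"
    elif threshold_db_hl <= 55:
        return "moderate"
    elif threshold_db_hl <= 70:
        return "moderately severe"
    elif threshold_db_hl <= 90:
        return "severe"
    else:
        return "profound"

# A "mild" label assumes the listener can fill in the gaps visually;
# for a student who is deafblind the functional impact is greater.
print(severity(35))   # mild
print(severity(75))   # severe
```

The point of the sketch is that the labels describe audibility thresholds, not functional travel skill: the same 35 dB "mild" loss carries very different consequences for a sighted student and a student who is deafblind.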
[ Slide end: ] Chapter 3. Assistive Listening Devices Blaha: Oh, yes. Okay. The next clip is going to be about hearing aids and assistive listening devices that, as an O&M instructor, you might encounter as you serve these students. [ Slide start: ] [ Slide end: ] [ Video start: ] Sutherland: With hearing aids -- I know we were going to talk a little bit about hearing aids and the different programs that they do have. And most hearing aid technology now is digital. I think it's about 90 percent, from what I was able to find out. And that digital signal allows us to manipulate the sound in ways that traditionally we have thought are helpful for speech understanding. So with most of the kids, certainly, you're going to see digital hearing aids, and a lot of times those have different programs on the one device. So you have your hearing aid, but you have a button that you can touch and it flips how the sound is being processed to a different way, and those might be programs. You might have your normal program, your program for listening in noise, your program for listening to music or using your FM system in school, and those are all different programs that can be set by the audiologist and are not visible. Even if you were familiar with one type of hearing aid, you could have the same exact model of hearing aid, but the programming would be completely different. That's why we usually like them, because we can change it around, or if the hearing changes, we have more flexibility. It used to be your hearing changed and you might not be able to use that hearing aid that you just got. Now we have more flexibility in being able to alter how the sound is being processed to try to make changes and accommodate different settings. [ Slide start: ] [ Slide end: ] Marsh: The programs that you just listed are all related to speech. Sutherland: Yeah. Marsh: What about environmental sounds? 
Because that's what we are helping them learn how to use in order to make judgments about traffic. Sutherland: So, it's been so interesting for me to talk to you about listening for environmental sounds, because as an audiologist, we've always been focused single-mindedly, it seems, at least I was, on trying to make speech audible. I mean, that's what it's all about for audiologists, right? [Laughter]. Trying to make speech audible. Marsh: Speech is important. [Laughter]. Sutherland: A lot of the strategies that we employ treat environmental sounds as the enemy of the audibility of speech. It wasn't until we started to talk that I began thinking about all of these programming strategies that we have employed that try to cut out environmental sounds. Because we do know that having background noise makes it very difficult for people to listen, certainly people with hearing loss even more so, and kids with hearing loss the most of all. So we've done all of these things ‑‑ we make assumptions about which sounds we don't want to hear. We don't want to hear that low frequency hum in the background, because that's not speech, and we start trying to employ different processing strategies in the hearing aid to cut that information out. So it wasn't until we started talking and I understood a little bit more about how O&M specialists are trying to help people use auditory information in the environment that I realized that with the way we are traditionally setting hearing aids ‑‑ if we are not thinking about these other kinds of things that we want people to be able to hear ‑‑ we are really doing a disservice and actually making things much more difficult by using the hearing aids. One of the examples that I'm thinking of is directional microphones. The idea behind directional microphones, traditionally, is that usually what I want to listen to is in front of me. 
So the person is talking, my head is turned towards that person, and the microphones are providing like tunnel hearing. I'm trying to focus on you, and the person that's talking over here I don't want to hear, and the thing that's back behind me I don't want to hear. And so we use that strategy a lot to try to help understand speech in noise. But it seems to me... like that is exactly counter to what you are needing ‑‑ [Laughter] ‑‑ when the kid or the adult is out in the world and we're trying to use 360‑degree auditory information to help them orient themselves in space. And so it's really got me thinking about not getting rid of some of these strategies that are trying to help with speech in noise, but having separate programs, so that the second program is trying to help people use that information ‑‑ not cutting out those important environmental sounds, but allowing the environmental sound to remain so that it can be used to help negotiate the world, or be aware of the car behind you. A lot of people don't like using directional microphones with kids, certainly little kids. I mean, they realized that having directional microphones on a baby might not be the best scenario in some cases. The microphones are not necessarily facing the direction that we want them to, and we want them to get incidental information that's coming from places that are not directly in front. But I think as people get older, there's more of a tendency to use directional microphones, because of the benefit that we have seen with speech. But I think it needs to remain in an optional program so that when you are out in the world, you are getting that information. Marsh: That's absolutely vital for a child that has no vision and can't take in incidental information visually ‑‑ that they have the ability to be able to hear all around them and learn how to use that information. [ Slide start: ] Localization is a really huge factor that fits into this. 
[ Slide end: ] Being able to know where a sound is coming from, and then being able to track moving sounds, is really, really important in O&M. In my experience working with people that have hearing aids, localization is not an easy skill for them. Sutherland: It's not. I mean, the first thing that always comes to mind with localization for me is having binaural hearing, or hearing in both ears. So, we know that having the time difference and the intensity difference of the sound as it gets to your ears is how your brain helps you figure out where a sound is located. So if you have one ear that hears better than the other, the information that you are getting is going to be confusing for localization, or if you have one hearing aid. You know, you sometimes say, oh, you know, one is fine, you're okay, you've got the one. But having that binaural hearing helps us both to be able to pull out speech in noise ‑‑ because I can't let go of that [Laughter] ‑‑ but also to be able to orient yourself in space, and that's why many years ago you only saw cochlear implants in one ear. And now you're starting to see binaural cochlear implants. Marsh: How interesting. Sutherland: Yeah, it's because we realized how important having those two were, not just because somebody might be standing on the side where you don't have an implant, but also because it helps you localize, it helps you discriminate in noise. So that localization is a piece that I can understand how important that is, but it's not something that I ever thought about when I was fitting hearing aids. We never tested for it. We never really even thought about it, or I didn't think about it. There may be people out there who are doing a better job thinking about it. When I was fitting hearing aids I really wasn't. I was so focused on speech understanding. 
So the binaural fitting is really important, and with the directional microphones, it's thinking about at least trying different settings, because after looking at some research ‑‑ I thought, oh, I've got this figured out, directional microphones are not good in this situation ‑‑ there was one study that showed that people actually did better with localization with directional microphones on than they did with omnidirectional microphones, which surprised me. So I think we're still learning. And it may be very individual, too, how people use this information. One of the things that they mentioned that seemed important, too, to keep in mind was that people, not surprisingly, learn how to use this information. So once there's been a change in a hearing aid setting, there needs to be some time for you to get used to it and learn how to use this information. Or to compare the two different programs ‑‑ not just, oh, you've got omnidirectional now, you should be fine ‑‑ but people are going to have to try to learn how to use that, and practice with it. Marsh: Well, that brings up a really important point when you are working with someone that's visually impaired and has hearing aids or cochlear implants. Because these are not skills that have had programming set up for them. I have requested programming for these, and then it takes a long time, lots and lots of practice, lots of trial and error, learning how to use the information so you can localize and you can follow a moving sound. And for some people -- I've had some people that just never quite were able to do it. So it's extremely individualized. Just like vision is very individualized, hearing is very individualized. Sutherland: Yeah. I also read some research that showed that people with sensorineural hearing losses have more trouble with localization. So even if you have good settings on the hearing aids, that's going to be a challenging thing for people to be working on. 
They should be working on it. But there shouldn't be the expectation that even if we have good hearing aids, and they're set just right, their ability to do that localization is going to be equivalent to somebody who has hearing within the normal range. And that may even be with hearing that's similar in both ears. When you have asymmetric hearing loss, where one ear is different than the other, then that information is going to be challenging. They still may be able to learn how to use the auditory information to localize, but it's going to be a difficult task to work on. So, I think these are all things that we need to think about. Besides the microphone array and the two hearing aids, [ Slide start: ] another strategy that audiologists like to focus on is compression. [ Slide end: ] And so what we've decided as a profession is that what we want to try to do, generally speaking, is make soft sounds a lot louder so you can hear them, but loud sounds not a lot louder so they don't get too loud. So you are taking a whole range of sounds. For someone who has normal hearing, or hearing within the normal range, this might be the softest sound that I can hear, and this might be the loudest sound that I can tolerate. It's pretty big, that dynamic range. But for someone who has a hearing loss, the softest sound they can hear might be right here, and the loudest sound they can tolerate is still at the same level. Because people that have hearing loss are still sensitive to very, very loud sounds ‑‑ sometimes even more so, sometimes their tolerance is even lower. So you have to take this whole range of information and try to cram it into this very small dynamic range. And to do that, you need to compress the sounds. So you have to make the very soft ones a whole lot louder and the very loud ones just a little bit louder. Which sounds initially like a good thing ‑‑ until I started to think about what we were talking about. 
And how we use the intensity of sound to help us judge distance. And I started to wonder ‑‑ and I don't have the answer ‑‑ what is happening when we take a sound that would normally be very soft, because it's distant, and we've now made it louder, and the thing that's a little bit louder, we haven't made that much louder, and how confusing that would be in terms of trying to judge distance. And so one of the things that I read suggested that we leave, if we can, the low pitches, where a lot of environmental sounds happen, as linear as possible in the hearing aid, meaning we're adding this much loudness to the soft sound and that same amount, this much loudness, to the little bit louder sound, and do that through some of the low pitches. And then where people have tolerance problems a lot of times ‑‑ or even the greatest degree of hearing loss, which is often, though not always, up in the high pitches ‑‑ we can compress up there some. But try to keep it more linear in the lows. And I don't know if that's the answer, but it's an interesting thought, and I think something worth trying with people to see if that helps with the localization. [ Video end: ] Blaha: I think kind of my takeaway on the conversation with the audiologist, in terms of best practice for individuals with deafblindness who are receiving orientation and mobility training ‑‑ one thing is, as an O&M, to let the team know that you really need binaural hearing. That this child really needs to be aided on both sides. In deafblindness you really need the additional information, because you cannot use your vision to compensate and tell where things are located, which side they are on. If you are deaf you're going to look and know where the intersection is in front of you, the band is to your left, you know, because you can see. When you cannot see, then you need hearing aids on both sides, or amplification on both sides. 
If you don't have that ‑‑ for example, I have a hearing aid in my right ear and nothing in my left. So this side is picking up noises, but everything sounds like it's on my right side. That's where the sound comes in. Even if the band is on the left and the cafeteria is on the left, everything sounds like it's on the right. So they can't localize without binaural amplification. And the only way they're going to get it is if people advocate for it and say, "For this skill, which is critical..." ‑‑ because I think O&M is a critical skill for anyone with a visual impairment, but for a child with deafblindness, I think that the orientation and mobility training is hugely important. And it's a field of study that's just as important as speech, as developing speech. And so I think that while one hearing aid might be fine for speech, for acquiring speech, for travel it's a disaster. And so best practice for children with deafblindness is binaural amplification. The second thing she said is working with the audiologist to have programs set up. To say, yes, this one program is set for the classroom, and all of the environmental sounds are gone and they can hear the teacher perfectly. This is great for the science lecture. But when I go out, I need another program set up for me ‑‑ the orientation and mobility instructor would need another program set up so that the child could gather environmental sounds, because that's so much of what you do. So asking the audiologist to put in additional programs: this is a possibility, and I think that it's really essential if you're going to provide effective services. Then, training for the student. The student is going to need a lot of auditory training, and don't assume that this is happening typically, just because they have hearing aids. Auditory training is pretty simple for children who are deaf. 
When I was ‑‑ I had elementary school children as a teacher of the deaf. We did auditory training because they didn't know that any sound you hear, there's a reason for it. They don't know that when you are little. So you teach deaf children: if we heard it, let's go see what it is. And you instill this responsibility in them. If you hear a sound, let's check it out. And they start learning what the different sounds are. And it's ramped up pretty early in their education, and they're good at it. Too good at it. Sometimes if there's a sound, everybody is up and out the door in the classroom. But for children with deafblindness, the training is essential. They can't pick it up incidentally; it has to be specifically trained. So it's going to be very important to get auditory training goals and objectives in the IEP. Don't assume they're going to be in there, even if the child is, you know, coded AI and has a teacher for the deaf and hard of hearing. Don't assume they're going to know to do auditory training, because all of the training I had is targeted on very young children, and the skills that you keep building as the child matures are based on speech: speech, receptive speech, expressive speech, monitoring your speech, listening to your speech. Not environmental sounds. And so getting it in the IEP for training is the second best practice. Binaural hearing, getting it in the IEP, working on programs ‑‑ getting different programs set up in the hearing aid ‑‑ are going to be really important. And this compression thing is a big deal. So use that term when you talk to the audiologist, and please talk to the audiologist. Like Lisa said in the tape, if you tell the parent, "Tell her I want a program with, you know, noise in it," the audiologist doesn't know what O&M is. They don't know what that is. So they're not going to understand the need for it. 
And it's a critical need, and you know it because of your training. But an audiologist doesn't work with O&Ms. Okay. So on the compression piece, what she was saying is: with typical hearing, you have a huge array of sounds that your cochlea can interpret and send off to the auditory nerve. If you have a damaged system, you have only got this much good function in your cochlea and your auditory nerve and your middle ear, and so everything has got to go in there. So in one program they're going to raise soft sounds ‑‑ and again, thinking of speech, many of our speech sounds are soft sounds ‑‑ they're going to raise those soft sounds and make them as loud as possible. But soft sounds also ‑‑ to you all, to O&Ms ‑‑ mean distance. If it's a faint sound, it means the car is far away. And so with compression, that car is not a block away anymore. It doesn't sound that way. It sounds much closer. And so they are getting, in terms of localization, distorted information. So ask for a program. Say, "I need traffic sounds to have a different type of compression. I need them to have a range of softer to louder." And ask that the compression be altered in one of the programs. And they will be so amazed that you know about compression ‑‑ as this audiologist was; she was fascinated. She kept going, "I have never thought about it this way." And she made a comment one time; she said, "As an audiologist, we don't test for localization; that's not part of an audiological." And so if you don't raise that issue, it's not going to be compensated for with the assistive listening device. Marsh: Compression was totally new to me. I had never, never heard that term about hearing ability. It's just enormous for O&M, because we use the softness and the increasing loudness, or the decreasing loudness back to softness, for knowing what's going on around us. And I had no idea that it was artificially adjusted for many people that have hearing aids. 
And that it could be something that we could have a program set to take away that artificial adjustment. I wish I had known that when I worked with many of my students because I thought that they just couldn't make that discrimination between a sound that was close and a sound that was far. But it very possibly was a hearing aid setting and I didn't know that. Blaha: Because you are working with a damaged system, there is a point to which -- you're going to hit the end of your ability to improve their situation, but you never really know where that is on these kids because of the training issue. You don't know -- are they maxed out and there's nothing we can do at this point? Or do they need more training? And so that's why the training piece is so important and to give them extra time. But a hearing aid will not restore hearing to normal. It does not. And at some point they're going to reach a wall that ‑‑ that they simply can't get any more information or use it any better, because of the damage to the ear and the limits of amplification in 2014. So ... Marsh: But before you decide they've reached that wall, check with the audiologist first, to see if maybe there's a different program that would make a big difference. Okay? We'll go on now. Chapter 4. Collaboration [ Video start: ] [ Slide start: ] So as an O&M specialist, I would talk to my AI teacher, or maybe even directly to the audiologist, or have the family talk to the audiologist and explain what I'm trying to have them listen for -- that traffic from far away is much softer than traffic that's close by, and ask them to make a program where the softer sounds are not made really close in intensity to the louder sounds, is that right? Sutherland: Yeah. Marsh: Okay. [Laughter]. Sutherland: I think it would be. You know... in a perfect world, I would love ‑‑ I want the family to be part of that conversation. But I would love if there was direct communication too between the O&M specialist and the audiologist. 
Because I feel like... sometimes the translation, in both directions, once it goes through multiple people ‑‑ I can imagine myself being at the office and having the parent come in and say, "The teacher said he wants my kid to be able to hear noise." And me going, "What?" You know, "I don't understand, why would you want to do that?" So I think being able to share the goal... and even asking ‑‑ there may be lots of other programming strategies that I haven't even thought of when I've been trying to troubleshoot or think about how to address these issues. But I think if the O&M specialist, or the family, or both can go to the audiologist and say, "These specifically are the sounds that I want my kid to be able to hear," the audiologist can then translate that: okay, all right, what kind of sound is that? You know, where does it lie in the frequency range? How loud is it? And how can I play with those hearing aid settings to try to increase that? It might be specific to that child's environment. I mean, there are some things that seem fairly universal, right? The traffic noise. But there may be other sounds ‑‑ I don't know, I'm trying to think ‑‑ a specific doorbell that you are orienting to, or some other auditory landmark that is specific to that person, like we know that this thing that's generating sound is located in this one location, and we really want them to be able to hear that specific thing. I think that information would really help guide how the audiologist is setting the hearing aids. Marsh: It would be fantastic if O&M specialists could go with the child to the audiologist. That isn't always possible, and I think that's one of the reasons why it's really important to have a relationship with the AI teacher, or deaf and hard of hearing teacher, because it seems like in some ways we work at cross purposes with the settings -- the normal settings of the hearing aids. And also we kind of speak a different language, too. 
So if the ‑‑ if you have an AI teacher to go to, they can help with talking to the audiologist or maybe even explain things themselves. Sutherland: And I think that AI teacher is a really important piece in this, too, in that some of the things that you're talking about fall into what the O&M specialist would be working on, and some of those things could be goals for if the student qualified for services, specific IEP goals addressed by the teacher for the auditorially impaired. In an ongoing way that's not possible in a clinic setting with the audiologist. And I know that one of the questions had to do with like, "Well, who teaches the family about the different settings?" and who ‑‑ and I'm going to say the audiologist usually does. But it's really fast. You've just got your hearing aids, I'm going to show you all of the things your hearing aid does, how to change the battery, what the different programs do, all of that stuff. And then you're going to be out the door because the next person is coming in. And so you've got very little practice with it and certainly nothing in the real world. But I think the AI teacher can be the person that helps work on some of those skills with the student. Marsh: And teaches the O&M specialist how to do it. [Laughter]. Because we have them out in the community where we can't run down the hall to the AI teacher or even the classroom teacher who probably knows how to make some of the adjustments. It's really important for us to know at least some real basic stuff about it. Like changing batteries. The child needs to learn to be responsible for the device themselves. But... when they are really young, you know, and we have them on ‑‑ let's say out on the far campus somewhere, working on a skill, that we can realize that the battery is not working or needs to be changed or something. Sutherland: I know that self advocacy is always -- or frequently, always should be a piece of what the child is working on. 
But yeah, they're little, and so it's an ongoing -- even with some of the older kids I work with, it's an ongoing process of working on both taking care of the hearing aids, but advocating when things aren't working, mentioning it. Sometimes they'll just go okay, "It's not working now," and they may not say anything. And ‑‑ and knowing ‑‑ knowing again got me thinking about you are using all of this auditory information to make your way through a dangerous environment and you don't think to bring extra batteries for your hearing aid, and now you've got to make your way back. So those are things that, you know, we might say in the clinic. "Oh, make sure you have an extra battery," and that's about the extent of it, of how much we've discussed it. But... having that -- working on listening checks -- we like to teach teachers how to do listening checks with equipment, because a lot of times somebody will say to the kid, "Is your hearing aid working?" And the kid is like "Uh‑huh." [ Laughter ] And you know, you don't really -- "Is the FM on?" "Uh‑huh, yeah." Or, I always liked, "Did you -- can you hear me?" You know, I'm sitting like -- "Can you hear me?" And they go, "Yeah." So... so we like to try to both work with the kid but also with the teacher to use the battery tester, use a listening stethoscope, where you can listen to their hearing aid, do comprehension checks with the student that aren't like, "Did you hear everything that I just said?" and they say "uh‑huh, yes I did." But asking them a question where, "What is it that you are supposed to be doing right now?" instead of "Did you hear me?" You know, and so all of that is stuff that the AI teacher can be working with ‑‑ that can be reinforced by O&M, too. And ultimately benefit the student when they start to take that on. Marsh: Certainly would be important for ‑‑ to make sure that hearing aids were working before we took them out on a lesson.
So knowing how to do a listening check and use a stethoscope, I have no idea how to do that. [Laughter]. Sutherland: But you'd have to have access to one, right? So it would maybe be through the AI teacher and that regional day school program or if they are not connected, you know, that audiologist, "You know, we need a listening stethoscope at school." You know, that's not an expensive thing to get. [ Video end: ] Blaha: I think this collaboration piece can be -- it's real important and it can be really tricky. So this is a piece that I would like to talk about. When children in Texas are coded deafblind, by law you have to have a TVI and a teacher of deaf and hard of hearing at the ARD meeting. So these kids have a teacher of deaf and hard of hearing that you can consult with. And I would like to talk a little bit about ‑‑ when I was an itinerant AI how it was helpful for people to work with me. Okay? Because I think the itinerant model is a little different than the self contained regional day school for the deaf model. We have kids in both settings. But most of our kids are out in districts with an itinerant who may or may not be a part of the regional day. I wasn't. I was part of the regional day self contained but once I became an itinerant I was a private contract itinerant for the county. [ Inaudible ] But it can be either way. But I think that -- first I wanted to say, you hear AI teacher -- teacher of the AI and teacher of the deaf/hard of hearing. The reason you hear those two terms, culturally the deaf and hard of hearing do not like to be called auditorially impaired. It's to them an insult, because they don't perceive themselves as impaired, they're deaf or they're hard of hearing so they believe they can name their disability. Because they have it. And in Texas, what you hear is deaf/hard of hearing because it's considered rude to say an AI teacher.
So yesterday I was in a meeting at the school for the deaf and we had two specialists for deaf ed from two service centers there, and they introduced themselves as being the DHH specialist at the service center. And so you'll see these terms interchangeably. In federal law it says auditorially impaired, that's why all of your paperwork that comes off of the software is AI, but you usually refer to the teacher as the teacher of the deaf/hard of hearing or teacher of the deaf. I think that... to have a good collaboration with a teacher of the deaf, the first things that you are going to need to do, as an O&M, this is something that you have learned, you have to be able to give somebody a short description of this is what I do for a living, this is what I'm going to do for the child. Because, the teacher of the deaf has not typically worked -- now they probably have seen someone with a cane... travel, but really have not worked with an O&M typically. You are going to have to say, "This is what I do," first of all. Then I think it's important to say these are the three skills I'm going to be working on with this student: localization, discrimination, and identification. Because that ‑‑ you're speaking their language now, because they do absolutely know about auditory skills. And then explain that you are working outside, and not so much on speech but on environmental sounds. And that you need your help to do -- their help ‑‑ you need their help to do your job. And one of the questions I would ask the teacher of deaf/hard of hearing is to explain the hearing loss to you -- functionally. Say, so, given -- does this child have a conductive loss, sensorineural loss, how efficiently does he use his hearing in class, does he use his aids, are they well maintained, does he have any, you know; and talk about the device with them and then ask them, "What was this child's hearing loss? What would be the effect on them learning how to identify sounds?
What do you think about localization? How good do you think his discrimination can be, given just the hearing loss?" And then you might want to double it, in terms of complexity, when you add the vision loss, because we always say, "It's not deaf plus blind, it's deaf times blind," because what they ‑‑ when they're talking about, well, typically a person with this kind of hearing loss can do these things, they're talking about sighted children who can speech read, who can read facial expressions, who in a crowd of people can shift gaze, they can shift gazes all around, trying to see who is talking. They can also identify ‑‑ auditory sounds, speech reading, facial features and work in a group. Those are the four skills that you usually see coming in to compensate for hearing loss. There are a few others, but those are the big four. And I look at those big four, and I know a lot of kids that I work with who do not have the vision to do those four things. In fact there was one study that said if your visual acuity was 20/100 you were not a candidate for speech reading; and they were talking about speech reading when you're sort of sitting next to someone speech reading. They're not talking about someone who is 12 to 15 feet away at the blackboard. And so, talking about how complex the job is going to be for this student to do these three skills and apply them to environmental sounds and to travel. Another thing that I would do is ask for training on the child's equipment. I would say, "Will you show me how to do a sound check?" Because you don't want to head out on a lesson without having done a sound check on that hearing aid. If you do not know how to do one, if you do not have the equipment, which is a stethoscope -- they're not expensive. It's not the type that you listen to your heart with.
It's got a little suction cup on there, where you place the aid on it, so you can listen to see if it's working at all, or if there's static, or from one time to another, suddenly there's no gain, and that it's broken, and it's not really amplifying, or working and it seems different than before. All of these things -- the child can't make these calls. I mean, they have a hearing impairment, so they may not be able to tell all of the things that can go wrong. I would ask to be shown how to do a listening check. And just as a fun fact to know, for the next party you go to, there is a federal law in IDEA that says that if an agency has a child with a hearing loss and they have a cochlear implant, the external device has to be checked. So it's a law, that the school has to be doing daily listening checks on these kids. Marsh: Daily. Blaha: Daily! [ Laughter ] And they encourage teachers of the deaf, when you have your own class, usually you do one in the morning, you do one after lunch, because something always happens on the playground, you know, or at lunch where, you know, they drop their aid or it goes into a beverage. You always check twice a day on the aids. And you need to be able to troubleshoot it. I know it's the child's responsibility to learn how to manage their aids -- hearing aids and hearing devices. That's part of the deaf Expanded Core Curriculum, that under auditory they have equipment management and from the get‑go, from young children you give them increasing responsibilities. But I don't know where that child is in terms of his ability to be responsible and where I have to step in to be responsible for that, you know, device, which could cost a cool $5,000 each, when you are taking the child out for an hour. I think that knowing how to troubleshoot the aid and use it is important. When it comes to adjusting the aid itself, that is not something that a teacher of the deaf/hard of hearing can do.
They know basically in terms of amplification devices, we maintain and train. That's what we do with the devices, we make sure that they are working, we know whom to call when they are not. But we can't really change any of the adjustments at all. That has to be the audiologist. So I would say, "Who is the audiologist for this child? I need to talk to them," and get a release signed. You know, when you are doing assessment anyway on these kids, get a release to talk to the audiologist, get them on the phone and go through your spiel again: "I am an O&M instructor. This is what I will be doing. I want to hear environmental sounds -- or the student needs to hear environmental sounds. I'm working on these three skills, one of them is localization." This is something audiologists don't usually touch, but they can; they know quite a bit about it. They're kind of sound geeks; they get real excited when you want to talk about localization of these things, they really enjoy talking about the dynamics of sound and learning to use them, learning how to use the device to help with that process. So talk to the audiologist and go through those four things again: "I want binaural hearing; we may need an additional aid on the student -- a student with deafblindness -- it's best practice, we need two. I would like to know if you are using a directional mic or an omnidirectional mic, because I prefer omnidirectional when I am out in the community. I would like to know if you could set up some different programs for me, because I understand the classroom one is set up, but I need one set up for when I'm doing the O&M lesson, and I need to know how to shift programs on this device." So when you are working as part of this initial step in assessing and developing a training program for the child, I think it's very important for the O&M to talk to the audiologist. I think -- I believe in collaborating with teachers of the deaf/hard of hearing, take them with you.
Because the more informed they are, the more collaborative they are with the audiologist and what's being changed in that aid, the better they can maintain it when you are not there. It's not a piece of collaboration that I see very often, and I'm really excited about the possibility of O&M instructors sitting down with audiologists. The one we talked to, Lisa, after talking to Ruth Ann and really kind of getting the demands, the things that were important for this child, she got on the phone to two of the hearing aid brands that are most common with children, Oticon and Phonak -- Oh my gosh, how could I forget that -- Phonic Ear. Anyway, got on the phone and said, "How come -- how come this -- I've never heard this? Do you all hear this? Do you all know if you have individuals with deafblindness this is information an audiologist needs to bring up... at IEP meetings. You all need a DropBox or something, you know." I really appreciated the fact that she really got on it. Because audiologists are the ones that have to talk to the device -- the people who develop devices. But you can bring some really cool new information to the audiologist. Okay. Ruth Ann, do you have anything that you want to say about collaboration? Marsh: One of the things that we didn't get into in the video was that for young children, with some of the hearing aids you can even have a remote that can change the programs. So if a child is really too young to change the programs themselves, you can, when you get out into an environment, change the program for them, and then they can ‑‑ they should understand that what they are hearing now is different from what they would be hearing, or how well they would be hearing it, in a different environment. Eventually you want them to learn how to change programs and use the device to its fullest extent on their own. But if they are very young, they are not going to be able to do that.
And I have been seeing -- just recently I saw a little girl who was six months old who had cochlear implants. I would think the earlier that you started working with a child that's deafblind that has cochlear implants, the sooner they would start being able to make sense of environmental sounds and possibly develop localization skills. I would love to do a research project on how young they could learn localization, especially with cochlear implants. Blaha: Well, that's the next thing that we're going to talk about. This is a game changer; cochlear implants are having as big of an effect on individuals with deafblindness as the rubella epidemic decades ago. Let's watch this. Chapter 5. Cochlear Implants [ Video start: ] [ Slide start: ] [ Slide end: ] Marsh: If it's sensorineural, is that the people that get cochlear implants? Sutherland: Yeah, that's a good question. Yeah. If you have a certain degree of sensorineural hearing loss, where we're not seeing as much benefit as we would like with hearing aids, you might be a candidate for a cochlear implant. It would also depend not just on the hearing loss, but also kind of the ‑‑ the anatomy of the cochlea and the nerve, so that's part of the evaluation to determine if you would be a good candidate for a cochlear implant. It used to be that you had to show almost no benefit at all from hearing aids in order to be considered a candidate for a cochlear implant. But that line has changed somewhat. So we do see, now, people that show benefit with cochlear implant ‑‑ with hearing aids, pardon me, but still may be falling into that range that we would consider for a cochlear implant. The difference between how a hearing aid and a cochlear implant work is that the hearing aid is sending that sound through that middle ear system to try to get to the cochlea, while the cochlear implant -- there's two portions to it.
There's the external processor and there's the internal implanted piece and that has an electrode that actually is fed into the cochlea, so what we're doing is taking the auditory signal and we're converting it to an electrical signal that is directly stimulating the cochlea through those electrodes. And we see people be very successful with cochlear implants, as with hearing aids as well, but I think... what happens is that people frequently think that it is a fix for a hearing loss, and that once the cochlear implant is implanted that your hearing returns to normal or is restored to normal, and that's just not the case. The ‑‑ you're basically just stimulating ‑‑ well, on the most commonly used cochlear implant there are 22 electrodes, so there's 22 places on that cochlea that are being stimulated with the implant as opposed to this whole cochlea that has all of these different little discrete areas that are being stimulated. And so you're kind of taking this wider band of information and you are fitting, and you're just using what that range that the cochlear implant is ‑‑ you're just stimulating across those electrodes, in that one little portion of the cochlea. And it doesn't sound like what it sounds like when you are hearing through the cochlea directly. And there's actually stuff on the internet where you can get on and get kind of a sense of what a cochlear implant sounds like. But the best description that I keep hearing is that it sounds very mechanical sounding at first. And the interesting thing that I have also heard, is that people who I have known who have had hearing and lost their hearing and got a cochlear implant, said that it sounded odd or mechanical or like buzzing at first. And then eventually it didn't sound like that anymore. Eventually it started to sound like what they thought of, remembering as their hearing before. Marsh: Okay. What about our child that's visually impaired? 
So they don't have vision to see where the sound is coming from and they don't have any reference for that sound. Sutherland: I think that's an excellent point, both in terms of our expectations about what they're going to do with this auditory information, and also I start to think, too, about the whole process of when you get a cochlear implant: the surgery is one step, a relatively fast step compared to the next thing, which is the programming of this device, because it's a computer that needs to be set. And so that audiologist that's sitting down to set the maps of the implant, or how it's processing sound, is trying to get information from the individual on how to set all of those levels. How soft, how loud do I make all of these different things? If the ability of that person to provide that feedback is somewhat limited, then they have limited information on how to set the maps. And so ‑‑ so I think while our expectations for progress -- we might need to step back a bit and kind of follow and see what the child is doing, I think information from the family and from the O&M specialist and the teacher about how this child is responding to sound would be really critical for the mapping audiologist, to be able to use that real world information that they don't see when they are sitting in a clinic, and try to apply it to the mapping strategies in the implant, to try to create a map based on as much real world information as we can. So I think, yes, step back, but also if you are not seeing the kinds of responses that you were hoping for, I think sharing that with the family and with the audiologist would help guide that fitting process. Because the programming of that cochlear implant is an ongoing thing. It's not like you get implanted and we flip it on and you go on your way. It's an ongoing process over the months and years that someone is using an implant that would change based on what's happening in the real world.
[ Video end: ] Blaha: I wanted to say that in terms of cochlear implants, we looked at the census data that just came in on children with deafblindness -- the 720-ish, I think, maybe 723 something kids who are deafblind in Texas, and 201 are now implanted. And of the babies, 36 under five are implanted. That's up from 7 last year. So this is a huge number now that are being implanted, and what is happening is that if a child is perceived as having normal hearing because they're implanted, then they are not being coded deafblind. And you are not normal -- you do not have normal hearing just because you've been implanted. So, now that the law is out about every child with a visual impairment being evaluated by an O&M instructor, it's not unlikely that the O&M instructor is going to be the first one that goes, you know, I understand this child is implanted, but I'm really concerned about what this child is hearing, and how they are doing with the therapy; which by the way -- the auditory verbal therapy that is required as a follow‑up to an implant, so you can understand what you are hearing, is very visual. And I think it would be pretty challenging for a child with visual impairments to really benefit completely from the therapy without some oversight from an educational team. [ Slide start: ] [ Slide end: ] [ Video start: ] Chapter 6. FM Systems Sutherland: FM systems were designed to help overcome two things that make listening difficult: background noise and distance. So what happens is that the teacher wears a microphone or an O&M specialist wears a microphone, that signal is sent from that microphone and transmitter directly to the student's hearing aid. It could be a little receiver coupled to the hearing aid or it could be a loop that the student wears that sends the signal to their hearing aid. There's a couple of different ways to do it.
But what happens is, we keep those, typically we keep those environmental mics on because we want the students to be able to hear the other things going on around them and hear their own voice. But we want the primary thing to be that they hear the teacher. So in a classroom setting -- if we were using an FM system right now, it wouldn't do all that much; it's quiet, we're sitting right next to each other. But in a classroom, if I was further back and there was a lot of noise, it helps tremendously. What I started wondering about... about the use of it with the O&M specialist is that, let's say we've got a setting where we set it up and we're trying to get as much environmental noise as we can, because we want that information, but we also know that's really going to make listening to speech difficult and we want the student to be able to listen to your instruction. An FM might be ideal for that situation, because we could leave -- we could set them to a program that's bringing in all of that auditory -- all of that environmental sound, but then, when you want to give instruction, your voice comes over as the primary signal, even if you were at a distance from that student. Which would actually, maybe, with what little I know about O&M, the work that you are doing with O&M, be helpful, in that you can allow them to get a little further away from you, safely, knowing that they can still hear you, but they don't have to be necessarily within, you know, three feet of your voice or arm's reach. And I could really see how that might be a beneficial thing. Marsh: That would be very useful. Initially when you are working with a young student, you are very close to them. Because you want to be able to reach out to them right away. But as they are gaining skills and confidence and stuff, you do need to back off a little, be farther away from them; not far away, but, you know, not within arm's reach.
But at the same time it's really important that they can hear you if you need to give them information. So I can see that would be very useful. I've actually used walkie‑talkies somewhat that way, but it's... Sutherland: Well, I mean, if you were wanting that person to be able to respond, then the walkie‑talkie totally fits that need. But if it's more about you just giving instruction, then I think an FM would be great. Marsh: Probably be easier to use. Sutherland: Yeah, than having a big ol' walkie‑talkie, yeah. And... sometimes you can use other technology. FM is really the way to go in a classroom setting, but I'm wondering, too, about using some of the Bluetooth technology that is available with hearing aids, out with the O&M instructors. And the Bluetooth is in a lot of hearing aids now. Primarily the focus is helping people listen sometimes to the TV or to your phone or to your iPod. So there's different kind of devices that you can use to couple your hearing aid directly to another sound source. So I might be wearing a loop type thing, a Bluetooth streamer, and what is happening is ‑‑ there's a transmitter that's sending the signal like from the television set or from my phone to the streamer that's then going directly up to my hearing aids. So that's great for the computer and music and your phone and all of that, but I could also see using -- there's a fairly inexpensive little microphone that people sometimes use -- moms might use it with their kids when they are driving, the kid is in the back seat, mom is in the front seat. And it sends that signal, kind of like FM directly to the hearing aids. I could see that if the ‑‑ if the family already has this Bluetooth technology and a streamer that ‑‑ that that could also be used. Marsh: So it would be the family's responsibility to buy it? What about FM systems? If the FM system is being used in the educational setting in the classroom, the school district provides it; is that correct? 
Sutherland: If that student shows an educational need for the FM system, the school district or the regional day school they are part of typically provides it. It gets a little trickier if the student is not showing an educational need for the FM -- they are being very successful without the FM -- then it's harder to get the school system to want to invest in it, because they are not showing that they need to have it to have equal access. So in those situations, I ‑‑ I have seen families purchase FMs privately. I have seen service organizations step up occasionally and help. I don't -- I honestly don't know, if you are showing a need for it with O&M, what would happen, I don't know. Marsh: We have a new law in Texas about the Expanded Core Curriculum and how all children who are visually impaired have to have IEP goals that relate to that. So I think we could probably make a stronger case. O&M is one of the Expanded Core Curriculum areas, but again it would be really important to have a relationship with the audiologist, right? Because certainly the audiologist would have to be supportive of the idea that the child needs this in order to be able to be successful. Sutherland: Yeah. And a lot of times hearing aids aren't always FM compatible, so that would be something to know before the school stepped in and said you have to provide this. You would want to know about that particular hearing aid; or that audiologist might need to set a program for FM. Marsh: Uh‑huh. Sutherland: So I'm frequently in contact with audiologists regarding FM system issues. Trying to remind them -- I mean, it's the same, maybe the same sort of issue with FM as it might be with some of the O&M stuff -- I'm frequently communicating with community audiologists, the ones that work with kids a lot and understand the needs in the classroom, but occasionally we'll get a kid who has gone to an audiologist who doesn't fit kids that much.
And so they're not thinking about some of the issues in the classroom, like FM compatibility. And so it's important for me to have a relationship with that audiologist, to let them know what the needs are in the classroom, just like it might be for you for outside of the classroom. [ Video end: ] Marsh: Okay. Thanks for participating. And I hope that you learned something new. I learned quite a few things that were new to me from the process of putting this together and working with Lisa Sutherland. It was very interesting and educational for me and I think it was for her, too. And Robbie, thanks so much for coming up with this idea and for participating. Okay. Bye‑bye. Have a good evening. [ Music ]