October 4, 2016, Location Literacy webinar. ****************DISCLAIMER!!!**************** THE FOLLOWING IS AN UNEDITED ROUGH DRAFT TRANSLATION FROM THE CART PROVIDER'S OUTPUT FILE. THIS TRANSCRIPT IS NOT VERBATIM AND HAS NOT BEEN PROOFREAD. THIS IS NOT A LEGAL DOCUMENT. THIS FILE MAY CONTAIN ERRORS. THIS TRANSCRIPT MAY NOT BE COPIED OR DISSEMINATED TO ANYONE UNLESS PERMISSION IS OBTAINED FROM THE HIRING PARTY. SOME INFORMATION CONTAINED HEREIN MAY BE WORK PRODUCT OF THE SPEAKERS AND/OR PRIVATE CONVERSATIONS AMONG PARTICIPANTS. HIRING PARTY ASSUMES ALL RESPONSIBILITY FOR SECURING PERMISSION FOR DISSEMINATION OF THIS TRANSCRIPT AND HOLDS HARMLESS TEXAS CLOSED CAPTIONING FOR ANY ERRORS IN THE TRANSCRIPT AND ANY RELEASE OF INFORMATION CONTAINED HEREIN. ***********DISCLAIMER!!!************ >> Chris: Hello and welcome to the Location Literacy webinar. Thank you very much for joining today; we're looking forward to sharing some information about Location Literacy. Hopefully all of you had a chance to look at the downloads that are available through the pods on the screen, and you may have also received those through the email in which you received a link. Those are available for download as either the regular PDF or the accessible Word document. Along with that, the beginning code is 151. That's 151. The remaining four digits of the code we'll give at the end of the program, but make sure you keep track of that one. You will be receiving an email with a survey, and at the end of the survey you will put in your code to prove that you attended the seminar. So thank you very much again for joining us, and we're going to go ahead and get started. We're going to be starting with some questions, and in the pods that are part of Adobe Connect there's a place to enter your question if you have one. It would be great if you could put those questions out there; a lot of times it helps to create some conversation during the webinar.
But just to get things started I have some questions on the slide. What do we mean by literacy for orientation and mobility? The literacy component is really the language of orientation and mobility, so I'm going to assume for the sake of the webinar that most folks who are listening have gone through orientation and mobility training, have been working with students or learners for a while, and so you speak the language. Often we've developed those skills through schooling, through travel, and through the general environment, but how do we translate that into having the information available for our students? So that's what we're going to talk about: that language that we're not usually referring to in that sense, the literacy that we're going to use as travelers, and where we find it. It's difficult to know what you might be thinking, so feel free to jump in in that pod so we'll have the input from the audience. When we're traveling in the community we have information all around us. We have information on billboards. There's information through audio sources like the radio, sometimes public address systems. We're bombarded by information through all our travels, and we do some selective filtering to determine what's relevant and what isn't. How does it help us in a bookstore to hear children's music? We're probably in the children's section. That helps us to know where we are, but we have to know how to connect the information that's coming into our environment with where we are. So we're going to delve into some of those details a little bit. We're going to continue on past that question for just a moment and talk about some things that we're going to find commonly in our environment. 
A lot of times, mobility lessons will take place in malls, because that's where many of our students will be spending time, either as adolescents shopping with their families or, as they become older, transition-age students looking for clothing for themselves, possibly going into an Apple store to find an iOS device, and so we need to help them develop the ability to travel independently there. In order to do that they need to be able to access that information and understand that information. So the photograph that's on the screen at the moment is the mall directory. Often these are displayed as big boxes in the middle of a walkway. Sometimes we can take a brochure with us that has a directory in it, but it's a combination of a graphic representation of the mall, usually with different colors for different areas of the mall. Sometimes we have different levels that might also have different color variations. There's generally going to be a lot of print verbiage on the sign, and symbols. So all those are different types of languages that we have to help students develop. Often these are not going to be introduced in their common core curriculum classes, the general curriculum. So we have to help them develop these in different settings and generalize to new settings. It isn't necessarily always the mobility specialist who will be doing this. Sometimes it's travel with parents, travel with friends, but we need to help the students be able to extract the information from their environment so they can use it for independent travel. So how does a learner access this information? Obviously if we walk up to that picture displayed on the directory, it doesn't have much tactile representation to it. So if it's a nonvisual student, they might have to use something like an app. There are apps that work both on the iOS phones or the Android platform. There are some, like a CNS reader, that will work well and let the student save the information to refer to later. 
There are other apps that read in real time, so that you can actually hold the phone camera in front of the text and it will read to you what's there; wherever you move the camera it will be reading the next amount of text. The other thing we can do as mobility specialists is provide a student who is perhaps a low vision traveler with an enlarged copy of this map. We can download it from the Internet, and they can have it printed on, instead of an eight-and-a-half by 11, maybe an 11 by 17 piece of paper. We can also have them use an iPad or other device, maybe a closed-circuit television, so they can preview and have some general sense of where they're traveling within the mall. We're just giving them access to the information that is readily available to other shoppers. If this is a student who is nonvisual, we're going to work with them to find a tactile or auditory representation of the same information. If they bring up a directory on their phone through a website, for instance, they can use the screen reader, whether it's VoiceOver or TalkBack, depending on if they're on an iPhone or Android, and they can get that same information. They can scroll to the part of the store that they're looking for, so if they're looking for the Apple store compared to something like Macy's department store, they can scroll to the correct letter and hear the announcement. The anchor stores are going to be their best point of reference, so we can use that for interpreting where they are in the mall. The other portion that comes into this is using directory information that we also solicit from other shoppers. So it might be that we ask another shopper, could you tell me where something is? And they're using the directory to help describe it. And we have to help the learner to understand how to use the language that's being provided to them in the directions. Other signage that our students are going to be accessing includes things like store signage and maps. 
And the next picture we're going to be bringing up is a photograph of a grocery store or super store  >> [Off mic]. >> We're going to do some description of what that might be. In a grocery store we have ceiling-mounted or ceiling-hung signage. We have aisle numbers. We have information about the food items or categories of food that are in each section; so along with the pasta you might also find the pasta sauce, and with the cereal you might be finding things like nonperishable almond milk. We need our students to be able to access and understand that information. It might be with equipment like a monocular scope, it could be that they're using their phone to zoom in and take a picture, or, like we described with the mall directory, it could be using a text-to-speech app. So those are different ways that students can access the information in their environment. When students are getting directions from other shoppers, excuse me, we want them to be able to frame their question in a way that helps them to get back information that they can use. Some of our learners are very advanced in their language skills and will know how to speak the cardinal direction language. If they're given directional language like "the produce is on the east side of the store," and the student hasn't been given the ability to use cardinal directions, it will be very difficult for them to carry out the instructions. But if they ask, is that to my right or to my left? they've provided the context within which the person providing the assistance can deliver information that they can understand. So those are just some examples of how, with signage and maps, we can go about doing that. 
So if you'll picture for a moment in your mind a large super store. The largest I've been in so far is a super grocery store that had 60 aisles, which is a very significant number of aisles. It's not likely that our students are going to be able to count all 60 aisles as they're traveling through the store, so they will need other clues along the way, whether that be soliciting information from fellow shoppers or using big landmarks along the way, such as the freezer or refrigerator section; once they pass that, they know that at that point they might be into an area that has things they might use for toiletry needs. Okay. So we're going to just consider a couple of questions. How comfortable are they using adaptive equipment in public, and how can we assist with that? Often bringing two devices along, so that the student has a monocular telescope and the mobility specialist has another monocular telescope, can make the student much more comfortable. Encouraging families to use it during regular outings makes it seem more routine, and the student is more likely to have practice in different environments. The use of the phone is generally something that most students feel very comfortable with because it's a mainstream tool. Some grocery stores might ask that you not photograph within their store because they're concerned about competing stores, but if you let them know that you're using it as an assistive device, it seems to be a reasonable accommodation within the ADA. We're going to talk a little bit more about stores. Sometimes you'll go into large stores, grocery stores or other large department stores, that actually have maps for shopping within that particular store. 
These again are things that can be downloaded and, preceding lessons, can be made into tactile surfaces with tools like swell paper, which is a photo-reactive or light-reactive paper; once it's printed on with a laser printer, you can send it through a tactile image enhancer and each area that has ink on it will puff up. That way it becomes tactile, either for a nonvisual student or for a visual student to confirm the information tactilely. We also have the ability to use equipment like a CCTV or just a photocopy machine to enlarge images. The nice features of the iPad, for instance, will allow you to reverse the polarity, so that if there's a lot of glare on the screen the student might be able to have it in reverse polarity to access the information. So we're going to move from signage at a grocery store into signage that we might find beyond the built environment and more in the pedestrian environment. The types of signage that are in the pedestrian environment are a combination of those that are meant for vehicular traffic as well as those that are meant for pedestrian traffic. Sometimes there's an overlap. Many of our young people who have typically developing vision are generally going to be beginning driver's training around age 15 or 16, and they're going to be learning about signage. The same should be true for students with a visual impairment. I'm going to see if  oh, we have another map back up there. So we're going to jump back a step just for a moment; this is an example of the grocery store map. We have the dairy section towards the back of the store. We have the checkout stands towards the front of the store. These are tools that a student could use to orient themselves, either while they're in the store or before going to the store, and also afterward to talk about the type of travel that they've had. 
You can talk about concepts that relate to geometry, you can talk about concepts that relate to math, so that you're also overlapping with the general curriculum in helping to support what students will be learning in their math class and in their science classes. We have many other concepts that can be reinforced in this area. We can also have students begin to play some concept games, we'll say, of where they think they would find certain items. It isn't necessarily going to be something that they can always access from the aisle signage. They'll have to use some common sense, as it were, to be able to know what's there or to speak the language of that particular shopping environment. So we'll go ahead past that section on to the next if we can. And this is a collage of pictures. We have pedestrian pictures here; we have one that's mounted on an accessible pedestrian signal pole with basic instructions on how to cross the street. We have the crosswalk lines with actual directions on a stub pole for how to cross the street and when to cross the street. There's a pole with cylinders that are holding pedestrian activation buttons, and each cylinder has an arrow for a parallel or perpendicular crossing. And then there's a sign that says cross only at crosswalks. That print sign does not include Braille, and it's also probably mounted at seven or eight feet, so you would have to be rather tall to even reach Braille there. The first sign, the accessible pedestrian signal sign, does include Braille. It's also in large print. These are just, again, types of language that students have to develop literacy for. Some of the literacy will be written in print or Braille or both. Some of the literacy will be things that we have to interpret from the environment. So what do those crosswalk lines mean? It's something that eventually becomes intuitive to most adults because they've been exposed to it over and over and over again. 
On the photograph of the street crossing, we have a different patterned surface with a different terrain texture for the pedestrian area within the street, but to be able to interpret that we have to speak the language of the traffic engineers who design it and the builders who install it. So those are things that we have to work on with our students to develop an understanding. The orientation and mobility instructor will likely be the only person, possibly along with the teacher of the visually impaired, who will ever really be talking to them about these surfaces. So they need the opportunity to learn them and generalize them in different areas. Depending on when an area was built, there will be different surfaces and textures that were employed under the ADA and building codes, and so we'll have to help them to understand that not every location will have that specific type of material, but how they can learn to generalize the type of material compared to its contrasting materials within the context of each intersection. We're going to go ahead and move on to the next picture. And we're into some transit signage. Transit signage is going to vary from system to system. Generally each transit system will have its own style, color coding, and different representations for symbols, but there are some similarities that we can help our students to develop. The first question that's on the screen is "Is it accessible?" We have to have a student who can either use a magnifying tool to get to that information, to make it large enough for them to read visually if they are low vision, or, if they are nonvisual, get to it tactilely, either through Braille or raised print if they know the print system. And does it make sense? We're going to be looking at some other pictures as we go through the webinar today that may not make sense, either visually or tactilely, based on what the symbol representation is. 
So if we think about the types of symbols that we might find, that will give us an idea of how things need to make sense. Just as an example, we have now another set of images on the screen. We have two restroom signs: the sign for the men and the sign for the women. If our students have not encountered the symbol for a handicapped or disabled individual in a wheelchair, that might be very confusing for them. It's certainly not the traditional stick figure, because of the curved wheel underneath. The two pictures that are represented here have something a little bit different from our usual stick figures, where the man has two legs represented and the woman has the triangular-shaped dress. These figures have a little bit different shape. So we have some students who have a little bit of, we'll say, functional fixedness. They're used to one representation, and we have to help them wherever possible to be flexible in their interpretation. Just like going from being a route traveler to being a more dynamic traveler, we want learners to have the dynamic ability to be flexible with signage and do some problem solving. Can they make sense of what they're being presented with? The way we do that is give them multiple presentations in multiple areas, and encourage other members of the team, the family included, to have opportunities to explore. So when the family goes on a vacation, whether they're at Disneyland, whether they're in New York City, or they travel to a foreign country, take some time to explore these things so that the student has the ability to generalize in new ways, because they have more exposure, more experiential knowledge to draw from. And we'll go on to another slide on symbols here. What's on the screen now is a little more complex, and this will be something that I'll give you a location where you can download it from. I know that the print on the screen is a little bit small. 
But this is a symbol language that might be used on maps. Many maps have some sort of standardization so that symbols that are represented on one map will be similar to another, just like restroom signage. If you see a tent, for instance, that might be an indication that that's a camping area on a traditional map. The symbols that were represented before on the image were developed at the University of Oregon and could be considered a standard for tactile information. They were able to determine through some studies what works best for individuals who are accessing maps tactilely. This is not included in the BANA guide for tactile graphics, but it is available on the Internet, and again I'll show you as we move along toward the end of the webinar where you can download that for yourself if you would like. But these are images that can be added to different documents, can be used in different art programs, or could be used in Microsoft Word to create graphic images, and then can be printed out on that swell paper so that you can have a reproducible map for a student. So where might we use certain maps like this? It could be large college campuses. It could be state parks. It could be areas that have hiking trails or other locations. The Texas School for the Blind, for instance, is a large school campus where we have different types of terrain. We have large parking lot areas. We have grass. We don't have any bodies of water on campus, but we do have different terrain surfaces. So on the right-hand side of this graphic we have area features. And those might be like a dot pattern or a diagonal line pattern that could indicate different types of terrain. These are, again, map skills that many times are taught in a geography class, and we can be reinforcing them through our orientation and mobility. 
We might be working collaboratively with the geography teacher in order to help them develop maps that our students can access, and then generalizing their use into our lessons so that students have the real-world experience of what it is like to use a map like this to navigate. So we might be making a map of their playground area. We might be making a map of a park they visit with their family. By being able to use these, those skills will advance and develop so that as adults they will be able to independently traverse larger areas, such as a large college campus, whether it's something like the University of Texas or a smaller college. The skills will be the same, but we have to begin somewhere. So even with very young children these skills can be introduced, because we need the simple abilities of following a line and discriminating different textures. They could also be considered pre-Braille skills, because the same tactile discrimination features of pre-Braille skills will be used in accessing our map information. So these can be represented nonvisually, through a tactile source, and they can be represented to individuals who are low vision. And there are many things that are becoming reality. Through 3D printers we have the ability to make tactile representations of maps through computer technology that weren't available before. And in five to 10 years, who knows what will be available in the technology? Presently APH is developing a system with refreshable pins, refreshable like Braille, that will allow us to access tactile information from a screen, from a display, as well as to draw and share on it. So all kinds of things are heading into the public market for our students, who will become the adult consumers. So we're going to go ahead and move on to the next screen. And on the screen we have some images of different types of maps. We have a tactile map represented in the first image on the left. 
And it's a floor plan of a hotel for a conference. Below that is a print fire evacuation map. Again, this would be very difficult to access nonvisually as well as with low vision. Even if a student were able to get very close to it, it still may not be at a size that's large enough for them to access visually. There is a permanent installation of a tactile map of the Texas School for the Blind. This is basically a metal map of the campus with different area features, again, with the different terrains. And on the far right side of the screen is just an image, a very simple line drawing, we'll say, of a clock face that also includes Braille and also includes cardinal directions in print and Braille, just as a teaching tool that could be used either visually with simple print, or tactilely on the raised line paper, or as a raised line drawing through the tactile image enhancer, so that the students can begin to develop the understanding that the basic clock face is very similar to our compass face, and how they can begin to develop that. It isn't necessarily required that you use something like a tactile image enhancer. You could use a paper plate with just basic pieces of paper coming from the center to approximate a clock face, and start from that. So it doesn't have to be things that are very expensive. It doesn't require a major investment. It just takes a little bit of creativity; a trip to your local hobby store, like Michaels crafts, can be a way to get things that the student feels comfortable using and that we can use to further develop the concepts. So we're going to go ahead to our next slide and  oops, we're going to go into some LiveBinders here just to talk about where we can access some of this information. So what we're going to do now, this is kind of a combination for the webinar. 
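The clock face/compass analogy described above can be made concrete with a little arithmetic: if 12 o'clock is treated as north, each hour mark on the clock face corresponds to 30 degrees of compass bearing (360 divided by 12). Here is a minimal sketch of that mapping; the function and constant names are purely illustrative, not from the webinar:

```python
# The eight cardinal and intercardinal points, clockwise from north,
# spaced 45 degrees apart.
CARDINALS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def clock_to_bearing(hour):
    """Convert a clock position (1-12) to a compass bearing in degrees,
    assuming 12 o'clock is treated as north (0 degrees)."""
    return (hour % 12) * 30

def bearing_to_cardinal(bearing):
    """Round a bearing to the nearest of the eight compass points."""
    index = round(bearing / 45) % 8
    return CARDINALS[index]

# 3 o'clock faces east; 6 o'clock faces south.
print(clock_to_bearing(3), bearing_to_cardinal(clock_to_bearing(3)))  # 90 E
print(clock_to_bearing(6), bearing_to_cardinal(clock_to_bearing(6)))  # 180 S
```

So a student told "the door is at your 3 o'clock" can translate that into a 90-degree turn, or east, once they know which compass direction they are facing.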
We're going to be going into some recorded video that I'm going to describe, and then after that we'll be going through some pictures and talking about what's there, but I wanted to make sure that I shared where you could access some resources, both for your own understanding as well as to provide tools that you might use with students or with teens and families to help them understand. So on the screen at the moment there's a link, which is for LiveBinders. It's just a shortened link, http://bit.ly/liveBinderOM, and below that is a QR code. If you like, you can scan that with your phone and it will take you to the same page. So we're going to jump into that LiveBinder now and walk you through the process of how to find some of these resources. If technology is working with us, and it looks like it is, there are two search boxes when you get to LiveBinders.com. Going up to the search box, if you click it, it opens up both of those. The first is to search all public binders. The second is a more specific search, and we'll go with the second one. It has a dropdown list, and rather than the name, which would be the name of the binder, we'll choose author, so you're searching on the name of the author of the binder. Type Tabb, and then when you press enter it's going to bring you to the shelf with all the LiveBinders. Lots and lots of resources here for orientation and mobility. There is information about paperwork basics, the expanded core curriculum, CVI, all sorts of things. We're going to focus mostly on the Location Literacy binder for the moment. When you click on that binder it's going to open up so that you can view its contents. There are three sections here. Off to the left side we have the table of contents with the subheadings underneath. So we have the location literacy for travel, GPS, and creating accessible maps. Over on the right-hand side it lets you know what's inside those subcategories, so it lists each of the components. 
So literacy for everyday travel, for instance, has Location Literacy and O&M, and the clock face that we talked about before. You click on it, it opens the file, and if you wanted to download that you can. For the other sections, when you click on the section heading it's going to bring up just a brief description of what's contained there, and on the left the tab opens up into those subparts again so that you can access each one of those. The last tab, the accessible maps, is actually a tab that goes to an entirely separate binder, because there was so much information for creating accessible maps, and we'll get into that soon. But for the literacy for everyday travel, there is a collection of resources that are available here. There are articles, blog posts, and different tools like the clock face that you can download. We'll go into some examples here. So the first one is a post from Paths to Literacy. And it gives you some of the examples that we've already talked about as well as some new ones. It talks about how you can access information traveling through your day. It talks about using things like the map key, and there's a link there to download that as well. And so again, that's where you could access that PDF file. Up at the top is the actual URL to visit the web page. If you wanted to go to the web page itself, you click that and it opens a new tab within your browser so that you're now at the actual article. You're no longer viewing it within LiveBinder; it's its own tab. So we'll jump back into the binder and choose another one here to give you an example. This one is a little more research based, or scholarly. This is some information about using literacy in your travel. It gives you map information, how you might tie that into some of the general curriculum that students are already using, how we can collaborate with our science teachers, our geography teachers, geology, math. This is even more academically based, building these map skills. 
And these are skills that are going to be appropriate for both our nonvisual students and low vision students, as well as students who do not have any visual challenges. It talks about wayfinding, talks about going beyond the classroom, different ideas, so you will have some students who get really excited about this and want to do some of these activities, and it gives you some suggestions there. This one takes a little bit to download into the window because it's such a big file. This has more to do with the signage; there's a complete collection of signage here, the standard highway signs. So if you've got a rainy day with your student and you want to be able to talk about different types of signage, what it means in different environments, how we can use it, what it relates to, this is going to help you to go through each and every kind. You can talk about the different color patterns, why they're used, where they're used, and when they're used. And you can talk about that with students if they have questions about things, for instance, why would a visually impaired person sign be used in one area and not in another? You can find some of that information there. This is the tactile library website. All sorts of resources here that are available for free, from different sources. This is GPS 101, so we're getting into a little bit more technical side of Location Literacy and understanding where you are in terms of latitude and longitude, whether it be in decimal form or in degrees, minutes, and seconds. It can be conveyed in different ways. This is a wonderful resource for the mobility specialist to be able to develop some of their own knowledge, so that when students ask questions you have it, or you can discover it together. This is a helpful grounding point for beginning to introduce those skills into the lessons with the student, as well as GPS travel with the apps on their phone. Okay. 
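The decimal versus degrees/minutes/seconds distinction mentioned for GPS 101 is just a base-60 conversion: the fractional part of a degree times 60 gives minutes, and the fractional part of a minute times 60 gives seconds. A minimal sketch in Python; the function names and the sample Austin latitude are my own illustration, not from the GPS 101 resource:

```python
def decimal_to_dms(decimal_degrees):
    """Convert decimal degrees to a (degrees, minutes, seconds) tuple."""
    sign = -1 if decimal_degrees < 0 else 1
    value = abs(decimal_degrees)
    degrees = int(value)
    minutes_full = (value - degrees) * 60   # fractional degree -> minutes
    minutes = int(minutes_full)
    seconds = (minutes_full - minutes) * 60  # fractional minute -> seconds
    return sign * degrees, minutes, seconds

def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees, minutes, seconds back to decimal degrees."""
    sign = -1 if degrees < 0 else 1
    return sign * (abs(degrees) + minutes / 60 + seconds / 3600)

# Austin, TX is near 30.2672 degrees north latitude (illustrative value).
d, m, s = decimal_to_dms(30.2672)
print(d, m, round(s, 1))                   # 30 16 1.9
print(round(dms_to_decimal(d, m, s), 4))   # 30.2672
```

A GPS app may report either form, so being able to move between them lets a student compare a reading on their phone against a coordinate printed on a map or sign.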
So again, just going back through some of these materials, you might need to develop that idea of what is latitude and what is longitude, and how are we going to do that? We have different things that we can make. These are just styrofoam balls from Michaels with rubber bands that go around the balls to make them into a tactile representation. So, jumping back to creating those accessible maps and the different LiveBinders, we're going to go ahead and open that LiveBinder. This again is just a collection of tools and resources so that you have those available. Some places are very fortunate to have a collection of tools that you can utilize, whether it be a 3D printer or a tactile image enhancer. They're not all required. You can find different ways of producing and making those materials with inexpensive items. And so that's our video component. We're going to talk a little bit more about what you might find in the world of travel with your student. So we're going to take some time and go through some pictures, and we're going to talk about what they are and what you might use in terms of literacy for each one. So I guess as a little bit of a rhetorical question, we have on the screen a photograph of an elevator control panel. So there's literacy that's here, but what types of literacy can you find on this particular elevator control panel? Probably you've all noticed that there are letters. So we have print, and generally in newer buildings we're going to be finding the Braille. We also have symbols. We have the star for the main floor, the bell for the alarm, and the symbols to open and close the doors. We have different symbols. The pictures aren't always going to be the same as the word, or the word in Braille for that matter. So we have to help students to understand that there could be three different descriptors that all basically mean the same thing. So often there will be the number 1 with a star that visually might mean the main floor. 
The Braille might actually say main, or it might say 1. So those are things to go through with our student, doing a little bit of discovery, allowing time within our lessons to explore the environment and how we can use it. This is all again a component of literacy. With state testing, many times schools are asking: why do you need to take the student off campus when what they need is to be developing their academic skills? This is another way that you're developing their academic skills, in a real-world setting. They're having the opportunity to experience new things and to apply them, and they're more likely to be able to recall and utilize them during the testing. So when we have our IEP meetings, or during your consultation time, it's important to convey to your teams that these are the things happening during our lessons. If you have a video or photographic release, you can bring the camera to demonstrate both to educational team members and to the family what's happening on your lessons: how are they able to utilize some of the skills that they're learning in school? When do you encounter Braille during mobility? The TVI might ask how that reinforces what they're doing, for instance trying to reinforce new contractions, and you can show where the student is experiencing grade two Braille in the community. So we'll jump to the next picture, and we just have a simple image of a print compass. We have some students who are low vision who will be able to access cardinal direction information from a print compass. That's going to be what's most appropriate for them. We have other students who might find that that's just not something they can access. So we have other options that we'll get to as we move along in the photographs, but just keep in mind that we have to think about what's going to help that student in different settings.
Obviously there are going to be some scenarios where the compass would be very difficult for them to use. We have some indoor environments with so much electromagnetic interference that it's difficult to get an accurate reading. We have the paper plate example of how we can make a simple compass for a very young student, just to develop those basic ideas. We'll jump on to our next picture here. So this is just an example of accessible compasses, but they have their own language: the auditory compass and also the Braille compass. Braille compasses are a little harder to find at the moment; hopefully the manufacturer will begin producing them again, or a new manufacturer will step up. But with the Braille compass you do not have to know how to read Braille to be able to utilize it. All that's required is that the student be able to feel where the arrow is. As long as they know where the arrow is, they know where north is. But they have to be able to speak the language of their location in order to understand that north and south are opposites, east and west are opposites, and so on. With the auditory compass, not only do they have to know how to hold it in a traditional fashion, they have to be able to understand what it's saying. They have to have the auditory skills to listen carefully enough, or keep it near enough, to be able to hear it in different environments. It can be very difficult to use, for instance, on a noisy bus. But teaching them how to interpret their environment with the use of the sun and different environmental inputs can also be very helpful to confirm what they believe they've heard from the auditory compass. And we'll jump into our next one. And then once students have that basic understanding, we're going to be moving into new types of information. We have the Trekker Breeze, now the Trekker Breeze+, which is going to be providing all sorts of information about their environment.
This is a tool that students can use even though they may have a smartphone; the battery life of a dedicated handheld navigation device can be a big improvement to someone's independence in the community. For students who are having a difficult time using an iOS device for navigation, you might find that as soon as they step on a bus and turn on this device, they're able to get information about every single street they're crossing without having to do anything but move the switch to "On". And that can be enough for their awareness of their location, awareness of when to pull the cord to exit the bus, which direction they're heading, all sorts of different information. We're going to be moving into some pictures that talk about other things, like the clock face. So we need to think about, in terms of time and in terms of cardinal directions, when is that taught? There are things that seem to be disappearing from our general curriculum; some schools are doing away with cursive writing, for instance. The skill of telling time on an analog clock is not always taught, and analog clocks are not always present in the community. Many people are taking their phones and putting them into a pocket or purse or backpack, so it's kind of a lost art, but one that can be very helpful for our students. Not only because it can be a very precise way of orienting to their environment, but also because when people provide directions for them in the community, they might be providing those directions in terms of the clock face. So our students have to learn that language and need to be able to interpret it: if someone tells me to head toward 11:00, what does that mean on an analog clock face? If they've never been exposed to that, and the teachers are indicating there's no time to do it because they have to move on to other activities, then that's an orientation and mobility area that can be addressed during lessons.
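Since each hour mark on an analog dial covers 360 / 12 = 30 degrees, the clock-face language maps directly onto degrees. Here is a minimal sketch of that mapping; it is my own illustration, not part of the webinar materials, and the function name is hypothetical:

```python
# Illustrative sketch (not from the presentation): translate a
# clock-face direction into a bearing and a turn from straight
# ahead (12:00). Each hour mark is 360 / 12 = 30 degrees.

def clock_to_turn(hour):
    """Return (bearing_degrees, turn description) for an hour 1-12."""
    bearing = (hour % 12) * 30  # 12:00 -> 0, 3:00 -> 90, 11:00 -> 330
    if bearing == 0:
        return bearing, "straight ahead"
    if bearing <= 180:
        return bearing, f"turn {bearing} degrees right"
    return bearing, f"turn {360 - bearing} degrees left"

print(clock_to_turn(11))  # 11:00 is a 30-degree turn to the left
print(clock_to_turn(1))   # 1:00 is a 30-degree turn to the right
```

So "head toward 11:00" means a modest turn to the left of straight ahead, which is exactly the interpretation we want students to be able to make instantly.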
As with the compass, even if they're not using a compass, having that general understanding of north, south, east and west, and of those cardinal directions as absolutes rather than left and right, is very important. It won't matter which direction you're facing; north is still north. So having that as an option for students is very important. We'll jump to the next picture. An example of an inexpensive way to have a representation of a compass: this is just a pie plate with a magnetic arrow cut out of magnetic sheet. It doesn't have to be anything fancy, but it can make for an example for students who need something larger than a small compass that fits in the palm of the hand. Now they can hold it in two hands and begin to explore and talk about: which way would north be? Can we turn to east, turn to west? So they begin to understand the interconnected relationships of those directions. On this screen we have a series of apps. This is just a picture of a folder from an iPhone. Within this folder we have apps that are meant for GPS. We have BlindSquare, which is something that many people who are visually impaired use. Seeing Eye GPS. There are different transit apps. Each one has its own language; each one has its own way of referring to the location of things. So some apps, whether it's those built in to the phone, such as Apple Maps or Google Maps, will announce different streets in different ways. If you have an abbreviation of "Dr", some apps will pronounce that "doctor"; others will pronounce it "drive". So we have to help students to have fault tolerance, as it were: the ability to understand or correct what the app has said so that they can make sense of that information. That's helpful to do at least initially during relaxed practice time, so that when students are under pressure, for instance traveling independently for the first time, they don't have to be worrying about interpreting language.
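As a hypothetical sketch of that fault tolerance idea (the suffix table and function are invented for illustration, not taken from any of the apps named above), the ambiguity of an abbreviation like "Dr" can be modeled as the simple lookup a traveler performs mentally:

```python
# Illustrative only: a street-suffix abbreviation such as "Dr" is
# ambiguous, and different apps speak it differently ("doctor" vs.
# "drive"). This table and function are hypothetical examples.

SUFFIXES = {
    "dr": "drive",
    "st": "street",
    "blvd": "boulevard",
    "ln": "lane",
}

def expand_suffix(word):
    """Expand a street-suffix abbreviation if it is a known one."""
    return SUFFIXES.get(word.strip(".").lower(), word)

print(expand_suffix("Dr."))   # "drive", even if an app says "doctor"
print(expand_suffix("Main"))  # unknown words pass through unchanged
```

The point is not the code itself but the habit it represents: the student keeps a small mental table of suffixes and corrects the app's speech against it.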
That can be done in the preceding lessons, so that they have that ability already. The transit information apps have to use multiple forms of language within orientation and mobility. There are bus routes, and different systems use different terms for how they indicate a north-south route. Sometimes it's even and odd numbers; sometimes it's letter combinations. Some systems will use different number combinations for local routes, for express routes, or for what might be termed a flyer route, those that have minimal stops. There will be different designations for rail stations than for bus stations, so we have to help our students begin to understand the representation of that both in the print transit information and in the spoken information, so that if our student is using an accessible system like VoiceOver to get the information from the app, they know how to interpret it. So we'll go ahead and jump to our next slide. On the screen at the moment we have a compass from the iPhone. And though the text on the screen here with the captioning is covering it up a little, at the very bottom of our picture are the latitude and the longitude expressed in degrees, minutes and seconds. So that's something our students might hear, and it might not make sense. So if a student asks, or if we ask our student, what does that mean, we have to help them understand that it represents a pinpoint in the ground, so to speak, of where we are, and how that changes. If we walk 150 feet, which numbers change? When do they change? How far do we have to move to notice a change? Many times our students will believe that apps can get us exactly where we want to be, and talking about this in some depth, helping them understand the language of GPS, of latitude and longitude and coordinates, will help them understand how their movements change the listing of their present location.
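As a rough sketch of how those coordinate numbers behave (the sample coordinate is a made-up example, and the distance uses the common approximation that one degree of latitude spans about 364,000 feet), the decimal and degrees-minutes-seconds forms relate like this:

```python
# Sketch: convert decimal degrees to degrees, minutes and seconds,
# and estimate how far a traveler walks per second of latitude.
# The coordinate below is a made-up example, not from the webinar.

def to_dms(decimal_degrees):
    """Convert a decimal-degree coordinate to (deg, min, sec)."""
    sign = -1 if decimal_degrees < 0 else 1
    d = abs(decimal_degrees)
    degrees = int(d)
    minutes = int((d - degrees) * 60)
    seconds = round((d - degrees - minutes / 60) * 3600, 1)
    return sign * degrees, minutes, seconds

# One degree of latitude is roughly 364,000 feet, so one second is:
FEET_PER_SECOND_OF_LAT = 364_000 / 3600  # about 101 feet

print(to_dms(30.2672))  # an Austin-area latitude as deg/min/sec
```

Since one second of latitude is roughly 101 feet, walking 150 feet north or south moves the seconds value by only about a tick and a half, which helps answer that question of how far you must travel before the listing noticeably changes.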
So just above the latitude and longitude that are now on the screen, we have the degree reference, which is different from north, south, east and west and the subpoints in between. North-northeast is going to have a much different representation when we talk about it in a numerical sense. Or south: if we asked our student to turn to 180 degrees, would they know what that meant? If we asked our student to make a 180-degree turn, would they know what that meant? All these concepts again relate back to science, geometry, different math concepts. So we're working in parallel with what's being introduced for our students in their general curriculum classes. So when we think about literacy, it isn't typically tied to orientation and mobility, but there is a language that they're using all the time, and we can help them continue to develop it, and it will again apply toward their state testing, so that we can help the teachers, the family, the administrators know that these are all things going into that. Many times parents and teachers and administrators are very concerned about the state testing, the SAT, getting into college. Helping the team understand that what we're doing is reinforcing those concepts, as well as the skills students will need once they go to college, is very important. Part of our job is as salespeople, helping them understand what skills students are developing in orientation and mobility. Many administrators are still very unaware of what orientation and mobility is. They might know that you're part of special education, they may know you're part of the visual field, but beyond that they're not really sure. We'll jump to our next slide here. Again, this is just another image to represent time.
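A minimal sketch (my own illustration, not from the slides) of how that numerical degree reference lines up with the eight main compass points a student already knows:

```python
# Illustrative sketch: map a bearing in degrees to the nearest of
# the eight cardinal and intercardinal compass points.

POINTS = ["north", "northeast", "east", "southeast",
          "south", "southwest", "west", "northwest"]

def bearing_to_point(degrees):
    """Return the nearest 8-point compass name for a bearing."""
    index = round((degrees % 360) / 45) % 8  # each point spans 45 degrees
    return POINTS[index]

print(bearing_to_point(180))  # south: turning "to 180 degrees" faces south
print(bearing_to_point(225))  # southwest
```

Notice the distinction the transcript raises: turning "to 180 degrees" is absolute (you end up facing south), while making "a 180-degree turn" is relative (you end up facing the opposite of wherever you started).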
So this is a picture of an Apple Watch clock face, which could be digital or analog, but helping our students know that if they're going to head toward 1:00 or 11:00, which way to turn from 12 is very important. This might even be reinforced while you're standing at a street crossing, ready to do your alignment for the crossing. We're going to have an easier time describing to the student that they want to aim toward 1:00 than taking their shoulders and physically motoring or moving them into that position. So when we can help them have that sense of orientation, that sense of space, to know that they're turning within a 360-degree pattern or, if need be, 1 through 12, that gives them some sense of where they are within that circle. Jump to our next slide, please. Thank you. So this is just a picture of a globe of the earth. Again, for most of our students this is not going to be something they can access as anything more than a sphere; it's just a ball. But if we are able to utilize something that is tactile, they can use it to represent the lines of latitude and longitude, the lines of the globe that are on the screen, and to talk about what's above the equator, what's below the equator, and how the lines of longitude change depending on where they are on earth. We'll jump to the next slide. These are just examples of things you could use to make a tactile representation of this. When we talk with students about what latitude and longitude really are, when they first hear them they're just numbers. But when we can help them make sense of what they are, perhaps starting with something a little simpler for them, like the yardage across a football field, and we have a tactile representation of any kind of a court or a football field, they begin to get that sense of where the 50-yard line is compared to the 20-yard line.
And then we can generalize that to something a little more complex, like our lines of latitude, moving in then to our lines of longitude, and we can talk about, combined, where different spots are on the earth, what's above the equator, what's below the equator. We can put a pin where they are and talk about how something can be on the total opposite side of the world. That would be difficult for a student to conceptualize when we're using a traditional globe, but for less than five dollars, by getting a foam ball at Michaels and some rubber bands and pins, we're able to help them develop that sense of space. Many of our students will become world travelers. Many of our students have come from other places in the world, so we can help them understand those geographic locations, help them speak the language of places rather than just "here" and "there". Okay. On the screen is an example of a teaching tool from APH. This is Tactile Town, set up to represent a large football field area at a high school campus with a road that goes around the perimeter. These again are things we can use to talk about positions in space. We can use a tactile surface, a tactile model, to begin to develop what's parallel, what's perpendicular, what's near, what's far. These are all useful language for describing our position. We have to help our students move from what many people begin with, such as "it's over here, it's over there", to utilizing words more precisely, so that they can accurately describe, but also accurately request, information from the general public. This is another example of an intersection using Tactile Town. We have a slip lane, or right-hand turn lane, sometimes called the pork chop island, there.
Those would be very difficult concepts to describe verbally to some students, so having that tactile representation can help them take the words and begin to understand how the intersection is shaped like a box; to understand where they are on a corner; to understand that the northwest corner of the intersection also happens to be the southeast corner of the block, something like that, so that they understand that depending upon your point of reference, the way you describe something could be two totally opposite things. And so using different manipulatives allows our students the opportunity to develop those concepts, and then to generalize them into the real world. On the screen now we have a map. This is another image of a tactile map that was generated originally on a computer, with something like a basic drawing program; this one is actually from more of a graphics program. Basically you're drawing lines or circles or ovals and adding in Braille, which you can type from the keyboard as a separate Braille font. You don't need to have a Perkins Brailler or an embosser. But think about how we represent things on the map. For instance, on this map there are some areas where there are four or five stairs, so those parallel lines might represent stairs. This goes back to the image we had seen before from the University of Oregon for the symbols that can serve as map symbols. On this particular map we can have labels for what the map is for. We might think about using a map key so that those symbols could be described; it could be on the same page, or it could be on a separate page. But again, how can we represent tactilely what's provided in print or visually? We also need to develop the student's ability to understand that information. This is again another image of a map.
We have a directory on the right-hand side of the map that has both print and Braille, so that the information is the same whether it's being read in print or in Braille. On the bottom left of the screen we have a key, and there's a clear division, two bold lines along the key, dividing the map key information from the map information itself, so that a person isn't confusing what's in the key with what's actually on the map. There are streets that are labeled. There are building outlines. There are different terrain features such as grass or parking lot area. There are actually individual parking spots represented on the map, so that if you were talking with a student about how big a parking lot is or what a parking spot is, those are things that are kind of difficult to describe unless we're down on the ground feeling the actual painted surface. But this gives a student the idea that there are different cubicles, for lack of a better word, that the cars pull into, and that might help them finally connect the concept of what's happening when we talk about a parking spot, because they traditionally aren't going to be able to feel the spot where cars are parking. Okay. And so again, these are our map features; we'll go a little more in depth here. The point features are features that describe specific places. We might have some outlines of buildings, just as a general reference. We might have elevation changes; often if we want to know where we are, we know it by our altitude. Those are things that can be represented by apps on the phone, for instance, or on a handheld device such as some of our GPS equipment. We have intersection features, so on a map we have a circle that represents no traffic control.
If we wanted to know which intersections have four-way traffic control, that's represented basically with a plus inside the circle, or a one-way control following the center line. And the one-way control... I'm having a harder time reading that print. Basically, we can indicate what type of traffic control is within the intersection, so that if we have traffic control on the north-south street or on the east-west street, we can differentiate that on the map. So if you were learning a new downtown area because the student was going to be traveling there for a summer program, you could talk with them about the alternating traffic controls, so that in this residential environment every other north-south street has the stop sign, and east-west the opposite. When we think about different stairs, different travel environments, we're going to have different representations for those as well, as we saw with the hotel. The next column over is line features. Oops, if we could jump back for a second? I don't know if that's possible. Thank you. We want to talk about the streets, whether they're two-way streets or one-way streets; whether it's a fence or a railroad. Those are things that have to be known ahead of time so that our student can correctly understand what's being presented in that information. Again, this relates to the state testing, because our students have to be able to work with keys of information, whether it's a math problem or a social studies question that talks about different terrains, different areas, different communities, and understand what that graphic information is. So we are supporting all those different areas when we're working with our students on these topics as well. And now we'll jump to the next slide. Thank you. So this is one tool that you could use to measure or get a little more information about a tactile map.
This is called a Braille caliper, or tactile caliper. Traditionally, when we're feeling textured surfaces on our paper, the lines can only be a certain distance apart; otherwise our fingers won't be able to discern one dot or one line from another. What the tactile caliper allows us to do is get even closer: it gives us a measurement down to 1/16th of an inch, because as you open and close the caliper, the Braille pattern changes with each of those 16ths of an inch, so that we can get a real sense of which building is larger, or how far one building is from another. And this is available for, I believe, about $20. In the LiveBinder, as well as in some of the blog posts from Paths to Literacy, there are links to the National Braille Press, where this can be purchased. So there are all sorts of options available today above and beyond our traditional ruler with tactile markings on it. And we'll jump to the next slide when you're ready there. This is just a very close picture of the swell paper, that specialized paper that puffs up when run through the tactile image enhancer, so that we would have that sense of what it would feel like under our hands to feel the raised line, the dot pattern for the Braille, as well as the dashed line pattern to represent a surface area. So in this particular representation, for instance, the dotted line pattern might represent a roadway for vehicles, while a smooth area might represent a pedestrian walkway. Okay. So now we have the picture of a grocery store or a Walmart, different types of areas again that are going to have ceiling-mounted signage. So again, we talked about how you would access this information. If you were working with a student who has low vision, you might be able to bring certain magnification devices. Sometimes students feel very comfortable holding their phone up and using the camera.
They can use the video portion of it, pause the video, and bring it in close. They can snap a picture and zoom from there. There are some students who would be able to use a text-to-speech app to basically photograph what's there and have the app itself read the text aloud. Those are things we can work on with our students: being able to use the assistive technology. Again, there's a lot of overlap with expanded core curriculum areas here, but we also want them to understand the basic concepts of odd and even, numbers getting bigger, numbers getting smaller. Where are we in that general sense, so that we can begin to orient ourselves with the language of direction based on the numbers of the aisles? So that if our student with low vision doesn't have any device with them, but they're able to discern the large numbers, we can work with them on understanding their relative position in the store based on the numbers. This is another picture, of a mall directory. Again, helping them understand that the categories are based on different types of stores, and that these are alphabetical; again, we're reinforcing those basic literacy skills they might be learning in English class. There are also codes, and finding codes on a map, which is going to overlap with lots of their other classes, science for instance. If they're having a difficult time with colors, they might be able to use the letter instead of the color, or vice versa. So if we have a student who is colorblind or who is monochromatic, they might be able to use different types of information to get the same answer. On the bottom right of our mall directory we have a collection of symbols. So just think for a moment, if you can, of the types of symbols that might be represented there, and which might make sense to your students and which might be confusing. Restroom signage might be the stick figures, but then we have things like an elevator or an escalator.
Will they understand the difference between a stair symbol and an escalator symbol? Will they understand the difference between an ATM symbol and an information symbol? So the whole idea of using symbols becomes an enormous concept all on its own, along with when they might encounter those symbols. All along the walkway of the map, on both floors, there are symbols for different areas within the mall, whether it's the kiosks with the separate stores, or where they might find the information desk, the Coca-Cola vending machine, the restroom, all sorts of different things along the way. And we'll jump to the next one here. This is an example of the fare box at the MetroRail station in Austin, Texas. This particular box has lots of different ways of accessing the information, and there is a lot of information available. Just from a distance view here, there's ticket signage above that's in print. We have a lot of written words on the fare machine. We have some symbols, such as the dollar sign for money. We have the symbol with the person in the wheelchair to indicate that it's accessible. We have the symbols for the credit cards on the right-hand side of the machine, all sorts of different information available to us. And if we jump to our next slide, this is a close-up of where the transit rider would make their purchase. They're following along in the order of the arrows that go from audio to cancel to coins to cards to PIN, and down at the bottom on the right, the ECC, or electronic change card. And so it's meant to be understood that you're flowing in a pattern from left to right. That's something that would have to be interpreted; it's a language of its own. We have the numbers represented on the left and right sides of the screen with raised print as well as Braille. This would take some instruction. Some students might come to it very intuitively; others might need a lot of assistance with this.
There are voice-guided prompts. If we have a student with hearing loss, that might not be an option for them, so helping them interpret that tactile information can be very, very important. Along the street, along the sidewalk, we have lots of signage, but are our students able to access this information? Or if they know that it's there but they're not able to visually discern the letters because of acuity, how can they access that information? How is it relevant to them? We also have symbols. On the bottom right of the sign we have a symbol for the bus and a symbol for the train, which for those individuals who are visual will give very clear information that Highland station is meant for both bus and rail. If our students are nonvisual, that's information they don't have available to them. Using an application like the transit company's app, it can identify where they are and provide that auditorily, through a representation on their screen, or on a refreshable Braille display if they have it connected to their iPhone, for instance. So these are different types of information that we need to help our students become comfortable with, and learn how to access if they need to access it in a nontraditional manner. So this is another type of transit station that we have. This is Sunshine station. And this particular station in reality is an accessible station, because as you approach it there is a button, and when you push the button, just like an accessible pedestrian button, it will provide information to you about when the next bus is arriving, as well as the direction of travel. So you would be receiving the same information auditorily that other individuals might be receiving visually. Okay. This is on the bus. This again is not something that's provided tactilely.
And unless you were using an app to do text to speech, it would be very difficult to know what this information was unless you were soliciting it from another passenger. But it's just to let folks know who the seats at the front of the bus are intended for, or should be reserved for if someone boards. And banner information: this is our MetroRail train in Austin, Texas, heading downtown. I don't know many individuals who would want to get that close to the train to read that sign, so have that monocular telescope available. This train is shaped the same from either end; it's kind of a push-me-pull-me design. So knowing even the basic shape of the train isn't going to give you information about which way it's headed; it would be the signage on the front. So utilizing some other source of information, whether it's pushing the button at the station to hear where it's headed, or using an app with text to speech to read the signage on the front, would be the way to go in that case. This is again the printed information. It would be very challenging to get this information even with a handheld magnifier, so researching this beforehand, or having that app read the text aloud, would be very helpful. Then there's the question of whether it's safe to cross at a certain intersection, and when it's safe to cross. At the moment, there isn't any reliable way for our students to know if they're in a city that has an exclusive pedestrian phase, when the cars do not travel and only pedestrians are moving, or a leading pedestrian interval, where pedestrians get to go first. That information might be provided visually, but not auditorily, not tactilely. So having other ways to access that information is very, very helpful, as is helping our students know that just because you're at an intersection, it may not be designed the way the previous intersection was designed.
This was just a fun sign, but again, who is it intended for? Would our pedestrian know that the cars might not be able to see them from each particular area? Going through some of that signage information with your students, those who are low vision, or those who are using a text-to-speech app and know that there's a sign, would be able to get some of that information: to know that they might need to be more of a defensive pedestrian, for lack of a better phrase, because the drivers will not see them as well due to obstructed views. Our pedestrians might not know that a particular crossing is a new crossing, and that drivers might not be expecting a pedestrian crossing there now. So this information is made available to drivers, and whenever we can help our students understand the drivers' information, it gives them one extra layer of safety to keep themselves going where they want to go in a reasonable manner. So this is an example of the pedestrian hybrid beacon, sometimes called a HAWK signal. Most drivers who have been licensed for many years will not be going back to the department of motor vehicles or public safety to learn new types of traffic signals, new types of intersections. We hope that there are public information campaigns, but that isn't always the case. So these are things people have to learn either on the fly, reading the sign quickly as they drive under it, or just through repeated exposure. Pedestrians might not be aware that drivers don't know about all the new signal types coming out, so there's a language to these new signals. This is also a language that our pedestrians, our students, can learn, so that whether they're traveling with their families, traveling on a bus, or walking as a pedestrian, they can anticipate at least in some way what they expect the drivers to be doing at each particular intersection. This is another form of an accessible pedestrian signal.
They have to know what the arrow, the raised arrow in this case, is indicating, and that they're actually going to be pushing the button below the arrow rather than the arrow itself. Again, we're talking about cylinders for the accessible pedestrian signals: can they interpret the arrows and which direction they're representing? Sometimes the arrows have been removed through weathering, or maybe somebody picked them off, so we have to use some of our problem-solving skills to interpret which street each particular button addresses. And this one was just for humor: this is a pedestrian with wings. I can only imagine; we hope that everyone is able to safely cross at any intersection. And again, sometimes the information from our environment, the language it's speaking, won't be in print and won't be auditory. We have to interpret from tactile, kinesthetic, and proprioceptive information what is intended to be relayed at that intersection. So we have our truncated domes with a ramp. A busy corner. We're just about done with our time here, so I will have us move along, and I'm going to give our closing code. Thank you again, everyone, for joining us today, and we'll hopefully have this recorded and up on the website for you. Our closing code is 006. Again, the closing code, 006. Thank you very much, everyone. Have a wonderful day, and I hope you get to practice some literacy with your location understanding. Thank you.