TRANSCRIPT Virtual Reality for Visual Impairments: A Study of Object Perception 10/27/25 >>Donna: This last May I completed my research for my doctoral dissertation in virtual reality. My dissertation is in virtual reality but my study is in special education. So what I was looking at was whether or not VR was feasible for our students to use for visual learning. So we're looking at students with low vision and seeing if we couldn't maybe mitigate some of the issues we were having with visual learning. Small detail, intermediate viewing, and distance viewing: those concepts were rather difficult. So this study was just to give us a baseline of whether or not we could foster inclusivity for students with visual impairments and maybe enhance some spatial awareness. There's been a lot of research in virtual reality for the use of canes and the training of cane travel. But not a whole lot in actual concept development, specifically for students with visual impairments. There's quite a bit on students with ASD and other categories of disabilities. But not necessarily students with visual impairments. So I wanted to see where we could go with that. What sparked my interest in this is I worked with a student who wanted a very expensive magnification device, a wearable magnification device, to be specific. And the product designer said, no, their acuity is outside of our recommended success rate. I was like, I'm not so sure about that, but let's give it a try. The magnification device was built like a virtual reality headset, and so my student and I trained on a VR headset for a while to see if we could make it work. And it did work for her, and she was able to then receive the device and actually perform at a higher rate than what was expected based on just her acuity. Theoretically, in a historical framework, we do have a shortage of trained professionals to offer direct services to our students. 
So I also wanted to look into whether this is a feasible intervention to support practice between visits from our teachers of the visually impaired. We have seen a historical development of educational games since the beginning of computers in schools. Those of us in our Gen-X mode played Oregon Trail in elementary school. Those games started it, and we've just taken off from there in how we use computers within schools for educational purposes. So does VR have some of the same potential as those games did? VR does have the ability to give feedback and practice that maybe we could not have before. There are newer studies out now with vision and haptics as well as olfactory, where we're adding smells to the mix, which is really pretty interesting. So the purpose of my study really was to examine the potential of VR, and whether or not our students could use VR. If you don't know much about VR, it's a headset you wear, as in some of the pictures we've got going here. And this cutie patooty with the big ole headset on. The screen that projects the image is very close to the eye, usually about where your glasses sit on your face. And if any of you have worked with a student with low vision, you know that we have some kiddos that put their nose on their items to see them. So I was thinking, hmm, if our screen is already up here where some of our students are putting their items, could we use that in a more positive way than having ink on our nose to convey information? So that really started my research questions. Did the perception in concept development improve when using VR? Did the visual acuity have a limit? Was VR able to be used with students who had a higher acuity loss? Did a field restriction have an impact? And was there an improvement, a decrease, or no change in actual concept perception when using VR? So we won't do a whole lot of this. 
This is the really down-and-dirty: what kind of method did I use and what was the setup. The basics are that the students viewed an in-person natural environment. The student was seated in a chair, represented by this green circle. And then they viewed items placed on a presentation board, like the one in the center on the bottom there, at 6 feet, 8 feet, and 10 feet. So we were able to get a baseline of where their natural acuity lay. This was in a classroom-type setting, so it was classroom lighting, not clinical lighting. What we controlled was the external input and output. So there were no extraneous sounds, noise, or anything that would interfere with what we were doing. These other two pictures on the side are examples of the items we used so that we could get a baseline acuity. In the virtual reality environment, I produced the same type of activity. I used the HTC Vive headset. It's an older headset, but it had a much better resolution and it was easier to work with. We had tripods set at 16 feet with the transmitters, and I used a Dell gaming laptop with a Ryzen 5 and a beefed-up processor. All that information is in your handout, if you want to check it out. I used a virtual environment that allowed me to virtually measure our 6, 8, and 10 feet. On the floor was this great scale that I could have the students stand on. And as they looked down in the environment, they could actually step forward or back, and it made them feel like they were actually moving around in the environment. So we asked them to repeat the same type of activity at a virtual 6, 8, and 10 feet. Inside that virtual environment, we had what we're used to seeing. So we were able to allow the student to stand at their 6, 8, and 10 feet, which did change the virtual position. So it did feel like they were moving, but the projected image was still in that same location in the headset. 
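[Editor's note: the baseline step described above, presenting items at 6, 8, and 10 feet to estimate natural acuity, rests on standard Snellen geometry: a 20/20 optotype subtends 5 arcminutes of visual angle, and a 20/N optotype is N/20 times as large. A minimal sketch of the letter sizes that geometry implies, as an illustration only, not the study's actual materials:]

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians
FT_TO_MM = 304.8               # millimetres per foot

def optotype_height_mm(distance_ft, snellen_denominator):
    """Height of a Snellen optotype presented at `distance_ft` that a
    viewer with 20/<snellen_denominator> acuity can just resolve,
    using the standard 5-arcminute rule."""
    distance_mm = distance_ft * FT_TO_MM
    # A 20/20 letter subtends 5 arcmin; a 20/N letter is N/20 as large.
    return distance_mm * math.tan(5 * ARCMIN) * (snellen_denominator / 20)

# Letter sizes for the study's three presentation distances, 20/70 line:
for d in (6, 8, 10):
    print(f"{d} ft, 20/70 line: {optotype_height_mm(d, 70):.1f} mm")
```

[For a sanity check, the formula gives about 8.9 mm for a 20/20 letter at the conventional 20-foot test distance, which matches published chart dimensions.]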
And I'm not seeing any questions, so we're going to just jump on along and look at what happened during the study. Our participants in this study were students 8 to 17 years old, identified as students with visual impairments with an acuity of 20/70 or greater following their corrective measures. We had five participants, so it was a small data set. If you know anything about research, we were not able to claim statistical significance because of the small sample size. So we're looking at expanding that sample size. Each of our participants did give us some data on how the VR worked. We had one participant who refused to do the VR portion, or at least part of it. Our participant 1 was a male student who was on or near grade level and had a natural perception or natural acuity -- again, not clinical acuity. This was something we would get like in our FVE or LMA. And I just noticed I have a typo on this slide. It was a 20/600 acuity on his natural perception at 6 feet, and a 20/700 at 8 feet and a 20/700 at 10 feet. Once we got him under VR, this is what really surprised me. I thought there was going to be a little bit of an improvement; that was my hypothesis. But I found that at 6 feet, he was able to read the 20/70 line, whereas before he was at the 20/200 or the 20/600 level at that distance. And at 8 feet, where he was 20/700 in the natural, under virtual he was 20/200. We still did have an improvement at both the 8 and 10 feet. We did do a satisfaction survey afterwards to see if they liked doing it, and they did. Most of the students did enjoy being under the VR and wanted to play with it more, and did find that they could interact with the items more. Our second student was again a male student, middle elementary age and on or near grade level. Again, this is not his clinical acuity. This student specifically was a student with CVI, so we were able to get his natural acuity being 20/25, 20/20, and 20/30. 
I was thinking we're not going to get much improvement there, but we did. Under VR he was 20/10, 20/05, and 20/15. So he was still able to improve the detail he was able to see in those items. I did see a question come up: is the acuity natural or with their lenses? It is natural acuity using their lenses, and all of the lenses did fit in the VR headset. So they were wearing their lenses under the VR headset, which is one of the reasons why I chose that VR headset, because you can wear your glasses underneath it. There are some headsets that have prescription lenses that you put in. The HTC Vive and one other do have prescription lenses that you can put in, and they're rather inexpensive. I don't know who's in right now, but I have heard from a couple of folks that the new Apple VR headset specifically does not work as well with our students and their acuity adjustments as, say, the HTC does. It has up to, I believe, a 5 diopter adjustment. Great question. I have pretty thick glasses and I'm able to wear them under the HTC, because it is a little bit larger of a headset. Great question. Our third student, another male student, high school, was functioning at the mid-elementary level. And we got great results in the natural environment, with him being at 20/300 across the board. He was not too receptive under VR; I think with his functioning level and more exposure we could have gotten more data from him. But we were able to get that 10-foot measurement. He was doing a 20/300 at 10 feet naturally, and under VR we were at a 20/40 perception level. So we definitely got some improvements on that one. Our fourth participant was a high school female who was functioning at about the upper elementary level. She was 20/60 at 6 feet, 20/100 at 8 feet, and 20/200 at 10 feet. She thought the whole process was lovely. It was great. She loved getting into the VR and really exploring. 
And you'll notice on her 6 feet she went from a 20/60 to a 20/05 perception. She was able to drill down on small details. At 8 feet, 20/100 became a 20/10. At 10 feet, 20/200 went to 20/15. We were getting well above a 20/20 acuity perception for her pretty much across the board. And from our last participant, we did not get any response. He was on the edge of our functioning level for the study. He participated beautifully in the natural measurements, but we got the VR headset on and he was having none of it. So we had to end our session. So, unfortunately, I have no data on that student. But across the board, all of our participants who completed the VR segment -- and unfortunately, with the small data set, we can't call it statistically significant -- did show an increase in ease, clarity, and immersion. They really did love using it, most of them, and wanted to have it more. They did complain because the HTC is a tethered VR device. That means it has a cord running from the headset to the computer. It's a bit more powerful because we can have it connected. They complained a little bit about the weight and the pull of the cord. So that's something that we would possibly change in the future. So we had our research questions, and our data suggests that VR technology could enhance the object perception of our students. So if they're doing, say, a science experiment or a social studies activity where they've got the intermediate and distance concepts. Like my student before this study who got to experience a snowcap. She completely had the vocabulary to speak on a snowcap and knew it was winter and at the top of the mountain, but didn't really have the visual concept of a snowcap in the distance. Under VR, she was able to express, oh, okay, that's what a snowcap is. So it's those varying distances past the handheld range where this possibly could help our students. Unfortunately, due to that small sample size, we couldn't get a definitive acuity threshold for our students. 
That is, a point in their visual acuity where VR would either stop being effective or start being effective. But the overall findings indicate that there is an improvement in perception across most acuity levels; we would definitely need to do some further research. Most of the participants praised the clarity and the immersion. So we didn't lose any interactivity with their vision loss. We did have one participant that had a field restriction, and it did not affect his interaction in the VR setting. And they expressed their excitement on finding more. They really could see more under VR, and they wanted more personalized home setups. So we assumed in the study that VR would impact the perception. Our limitations were that the headset was an older version; it was tethered and there was some weight. The sample size was small, so we could not draw the statistical significance that we would like. And any further research could explore more prescriptive lenses. We were using the one headset, so we could not tailor it toward each student, other than the native adjustments. So our findings did support that immersive technologies are effective in overcoming some of our sensory limitations and promoted inclusivity in those practices. Integration into education has the potential to enhance accessibility and provide students with a more equitable learning experience. And future research could allow for more confirming of that finding and exploring of scalability, once we move out into the mainstream. So all in all, it underscores the potential of VR as an enhancement tool for our students for practice. And I went very, very fast. We are only 30 minutes in. Do we have any questions? >>Kaycee: This is Kaycee. While people are typing their questions, I have a question. I have zero experience with VR. I've never even looked at or touched a VR headset. And I'm having trouble visualizing what it means for a student's responses at 10 feet, at 8 feet. What does that mean? 
Because it's so close to their face. Does it have a camera that's picking up something 10 feet away and it's giving those results? What does that mean? >>Donna: That's a really great question. It's a simulated distance. Under the headset, in the VR environment, things still appear far away because you have perspective, since you're immersed in the environment. You have the perspective that it is that far away. It's hard, if you've never had a restriction of your vision, to kind of express how that looks. I mean, this is a flat picture on the screen right now. But that is simulated from this picture at 10 feet away, if we're standing on this 10-foot mark. So when you're in virtual reality -- because you'll hear people say, oh, I was on a roller coaster in virtual reality and you felt the movement and you felt the drop or the distance -- it kind of tricks your eye, in the way that it works, into seeing this huge distance. Does that make sense? It's a hard concept to explain if you've never been under VR. >>Kaycee: Yeah, no. That does help. That does help. So on the screen right now is a chart that's tilted, for those who may not have visual access to it. So when you're looking at it, it looks like it's getting further and further away. That makes sense. If you're wearing the VR, it looks like things are going away or coming closer to you or moving, but obviously everything is on the one screen close to you. >>Donna: Correct. And this is tilted the way it is because I took a screenshot when I had the headset on. That's the other thing that is really nice about VR: on the screen of my computer, I could know exactly what range of area the participant was looking at. 
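[Editor's note: the "simulated distance" Donna describes works because the renderer draws each object so it subtends the same visual angle it would at the simulated real-world distance. A rough sketch of that projection, with illustrative field-of-view and resolution numbers for an HTC Vive-class display, not manufacturer specifications:]

```python
import math

def pixels_for_object(object_height_m, distance_m,
                      vertical_fov_deg=110.0, vertical_px=1200):
    """Approximate on-screen pixel height of an object rendered at a
    simulated distance, using a simple pinhole-projection model.
    The default FOV/resolution are illustrative, not headset specs."""
    # Visual angle the object would subtend at that real-world distance
    angle = 2 * math.atan(object_height_m / (2 * distance_m))
    # Pixels per radian for this display (linear approximation)
    px_per_rad = vertical_px / math.radians(vertical_fov_deg)
    return angle * px_per_rad

# The same 30 cm item at the study's simulated 6, 8, and 10 feet:
for feet in (6, 8, 10):
    d = feet * 0.3048
    print(f"{feet} ft -> {pixels_for_object(0.30, d):.0f} px")
```

[The point of the sketch: the physical screen never moves, but the item shrinks on screen exactly as a real item would with distance, which is why the eye reads it as 6, 8, or 10 feet away.]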
So how many times, as TVIs, have we looked at a student and had to guess where they're focusing with their eyes, either because of a divergent gaze or because we are trying to get them to focus on something that is educational and we want them to look in a certain area, or to teach them how to scan, say, a shelf in a grocery store. It's very difficult to know exactly where they're looking. Under VR, I can tell from my screen exactly what part of the image they're focusing on. So this particular image that's on the screen shows a grid 1 to 10. The 1 is one foot away from the wall; it's the blue line after the 1. And then the next one is 2 feet from the wall, 3 feet from the wall. You can see a wall that's above the Real Virtual wording there. So that's kind of why it's tilted: I have looked down within the virtual reality and snapped a screenshot so that we could tell what it is across the item. So it looks like Katherine has seen videos of VR being used to show how someone's eyes track across a reading passage. This seems like it would be an awesome assessment tool for some students. Do I have any experience with that, or do I foresee VR being used in our FVE/LMA kits someday? I would love for it to be a part of our daily use. There was a lot of use during the pandemic for eye exams, like clinical eye exams, and a lot of research in that area. So I see it possibly coming around to where we could use it in some of our FVE/LMA for getting data. But we have to be cautious in that this is a projected image. The image is only an inch or so from the eye, from our focal point, so the acuity we're getting from that is a reading at that very near distance. We are not actually getting a 6-foot distance; we are perceiving concepts that were at that 6-foot distance. Absolutely. I really hope that one day we'll have VR more accessible for our students. And let's see. Gina asks if we need specific software in order to present things in VR. 
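[Editor's note: the "I can tell exactly where they're looking" capability above is, on Vive-class hardware without eye tracking, typically derived from head orientation: cast the headset's forward ray and intersect it with the surface of interest. A minimal sketch under that assumption; eye-tracked headsets report gaze rays directly:]

```python
import math

def gaze_point_on_board(head_pos, forward, board_z):
    """Intersect the headset's forward ray with a vertical presentation
    board at depth `board_z` (metres along the viewing axis). Returns
    the (x, y) point on the board, or None if the ray points away from
    it. Head orientation is a proxy for gaze, not true eye tracking."""
    hx, hy, hz = head_pos
    fx, fy, fz = forward
    n = math.sqrt(fx * fx + fy * fy + fz * fz)
    fx, fy, fz = fx / n, fy / n, fz / n   # normalise the direction
    dz = board_z - hz
    if fz * dz <= 0:                       # looking away from the board
        return None
    t = dz / fz                            # ray parameter at the board
    return hx + t * fx, hy + t * fy

# Seated viewer at the origin, eyes 1.2 m up, board at a virtual 6 ft:
print(gaze_point_on_board([0, 1.2, 0], [0.1, 0.0, 1.0], 1.83))
```

[Logging this point each frame gives the scan path a teacher could review, the kind of data Katherine's question is about.]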
It depends on what part you're doing. If you're designing, there's a lot of skill involved in designing the virtual reality environments. There are companies out now that are designing educational VR software, such as Floreo VR, who design specifically for students with ASD. They have some wonderful street crossing programming that they offer. And there are other companies out there that are using VR. Sandra asked the cost of VR. And it truly ranges. On the lower end, there's something called cardboard VR, which is okay but not the best functioning. And then you've got your more expensive, like your Apple VR, which is really outside of my budget. I used the HTC Vive, an older model, which worked wonderfully with Steam, and there was no cost to get the Steam platform itself. And I got a used HTC Vive for $300. So it all depends on your budget, how far you can go. The prescriptive lenses start at $20. A higher prescription is going to cost a little bit more, but your base prescriptives are about $19 on top of the cost of your headset. Those are great questions. I was lucky to find this group out of Mexico, Real Virtual. I tried to design my own, and it was coding through some 3D software and through another company and through a university. And the toolbox was just too much for me to learn in the small amount of time that I had to do it. So luckily I found Real Virtual, who offered their testing room for free, and I was able to use it for my study. And Melanie has a question: did any of the participants have ocular albinism? Not in this run, we did not. The Apple Vision Pro is one of the VR headsets that we have had mixed reviews about with our students with higher acuity losses. And due to astigmatism, it was only correcting and working well for students with 20/70 or better. Anything over 20/70 was difficult. Yeah. 
The higher prescriptive lenses were difficult to get. The Apple one specifically has been giving us, in the field, some roadblocks. You're not the first person I've heard that from. So, yeah, that's definitely one we've run into some issues with. There are some other headset and glasses options that I'm looking into that are not necessarily VR; they're more in the augmented reality space. The name is not coming to me right now, but I have a pair -- you have probably seen them all over social media. You wear the glasses and they give you a 100-inch screen, and everybody is raving about them. I actually have a pair and they are really rather nice -- the newer, upgraded version that blocks out all the light. It works on the same premise as our study did with the virtual reality. And they do allow you to have a 100-inch screen. They say 100-inch; I didn't quite get it that big or perceive it as that big. Kathi, great! She's curious about how it works with color blindness. We did not, in this run of participants, have any students with color blindness. But that is another one to explore. The difference between augmented reality and virtual reality -- another great question. In virtual reality, you are completely immersed in a simulated environment, like the environment you see on the screen. Augmented reality: if you have ever played Pokemon Go, where you can walk around and project your Pokemon all over the environment that you see, that's your augmented reality. You're actually seeing your environment with things overlaid on top of it. It's not all exclusive to a simulated world. You're overlaying things on top of your real world, so you see what's around you as well as the projected image. That's a great question. And Kathi, I would love to have somebody with color blindness report back to me what, if anything, the impact of color blindness is. 
Because our virtual reality headsets have some color blindness accommodations, and depending on the program you're using, you can get into the accessibility settings and adjust for color blindness. Some of the environments have the adjustments for that. That's another possibility: our virtual reality environments actually have accessibility built into them, like captioning. There's one person that I've talked to over in Europe who is working on positional captioning under virtual reality. So say you're in a group of people and you have your avatars walking around -- and, yes, there's little avatars walking around. As you turn towards somebody that's standing in your virtual room, you would be able to get captioning of the conversation they're having. But you wouldn't be able to eavesdrop on someone else that is in the room. You would have to actually be facing them for that captioning to happen. So that's some of the things that are happening in the VR world. Like I said, there's some researchers doing studies on haptics and olfactory, so we're getting smells as well as sights and vibrations. All those studies are going on right now. Kathi says she has a student with unique color challenges. One issue is that blue on white is not visible, plus limited color vision. Yeah, that would be an interesting study to do in adjusting the color. Because that student would have an issue with this virtual setup I have right now, because it is blue projected on white. Again, we're working with technology, so we can very simply change colors and see if that helps. Oh, hi, Lee! I didn't have that in my report. Lee said that participant 2 is colorblind and they used VR successfully. Thanks for filling us in there, Lee. Do we have any other questions? It would be interesting to use this with students who have level 1 CVI; it would be a perfectly adapted environment. Yes, absolutely. 
To see where the student's gaze is lingering, where it's traveling, and to be able to sanitize their environment would be pretty cool. Yeah, Kaycee, good question. Kaycee asked if we would be able to use this in real time during a lesson. Absolutely. There are virtual reality simulations of dissections. You would have to load materials ahead of time; there would have to be something done. I adore some of our educational items like virtual dissections that would allow our students to really participate more in manipulating the dissection. Virtual science classes where maybe a microscope isn't accessible, so that they're able to see details of those experiments. Charlene commented about O&M students and prepping them for intersection crossings. Absolutely! Let me see if I can pull up some of the intersection crossings that are available through Floreo. I am a little partial to Floreo. I don't know if I can actually pull up the lessons, but I can drop this over here. You can look at, if you can see, this is their website here. This is one of their simulated virtual crosswalks. And they actually go through: when we cross, we look left. And when you look left in the virtual environment, it changes the environment to what's to your left, and the same when you're looking right. Yes, Floreovr.com. They also have -- let me see if I can pull up some of their lessons here. Conversational skills and checking out body language and facial expressions. Early social skills. This is just one example of the VR that's out there. They have yoga poses. I've talked to them a lot about their grocery store and scanning shelves in the grocery store. And this is some of their safety activities, with their crosswalks and verbal situations and doing a three-way stop. School readiness activities. Again, these were developed for students with ASD, but they can definitely apply to a lot of our students in practice. 
And, again, these are just some examples of what VR is out there. Now, these do look more on the cartoon side, but they are good exposure. Here's their scanning of shelves. So if we're teaching a student to scan shelves in a grocery store, they've pared it down to just a couple of items. And you can really tell, because -- in this case it runs on the Apple platform -- you're looking from an iPad, and you can see exactly what the student is scanning and in what direction they're scanning and what items they're focusing on. So on the vision side, it does wonders for us to be able to tell exactly what they're focusing on and what they're looking at. And we're not having to guess. And it gives them good practice. Again, wonderful questions. So there's a lot out there that is being developed right now. I agree, Kaycee. So many possibilities. And we have our kids that are nervous to do things. You know, nervous to do those street crossings. Nervous to go to the grocery store for the first time. And they have that anxiety, so why not practice in something that's immersive that gives them the confidence: okay, I've learned this skill, let's try it for real. This is not a replacement for our instruction or for community-based instruction. This is just supplemental practice to build their confidence. And maybe to even give them access to things that they didn't have before. Like the magnification on a microscope. Or a dissection experiment. Or even social studies, when we're looking at land masses and distance concepts for our social sciences. So there's a lot of ways we could apply this. Do we have any more questions? Is there anything in augmented reality that has a camera and pulls the environment into that super close screen? Thinking like scanning the real shelves. The only thing we have right now that would do that is really our wearable magnification devices. Our augmented reality really wouldn't do that. 
And VR, again, is just a simulated environment that we can practice in. But we do have some wearable magnification devices. A lot of our kids don't want to use them. And a lot of our wearables have moved on with the implementation of AI: they're doing more of an interpretation than a visual projection. So, unfortunately, we've moved away from that in the augmented reality space. Katherine asked: does it make a difference, farsighted versus nearsighted? Not that we've seen yet, because the projection is literally right here. So we haven't noticed any difference yet. Again, we're still studying to find out where our limitations are. We're just in the very early stages of finding those parameters of where we're limited. And VR does have cameras on it, but you're not projecting your real environment. Those are more for your position within the environment than for pulling images into your projected environment, if that makes sense. Everything in your virtual environment has been created and preestablished. So everything is really set up ahead of time. I wish I had thought to pull in one of my landscapes. It kind of portrays that idea.