TRANSCRIPT TSBVI Tech Tea Time: NOA, the AI vest for blind mobility by biped.ai 5/8/2025

>>Marco: Well, thank you. It's nice to be here, and thank you to all of you who are joining. I'm excited to share about biped, about what we're doing and what's new in the space of electronic travel aids, mobility aids, and AI. So let's get started.

First, let's talk a little about the problem we are solving here at biped. Actually, one step before that: let me share who biped is. biped is a Swiss company based in Lausanne. We've been developing assistive technologies since 2021. We officially launched our product in the US in 2024, though we actually launched it almost a couple of years before that in Europe. We are developing the next generation of mobility aids, or electronic travel aids.

As for me, I'm the COO and a co-founder of the company, and I've been in this space for over 10 years now. I was previously the founder of Sunu Band, in case you ever came across it, which is another mobility device in a wristband form factor. I joined biped last year in January, and since then I've been working on this incredible technology that we're about to share.

So the problem we are trying to solve is blind mobility. To be more specific: as you know, moving around blind is very difficult, and that's because there are many aspects to it. It's not just hard because you don't detect or anticipate obstacles; there are many layers.

First, you have the immediate, static obstacles. Think of holes, branches, walls, or chairs that are in your way. So first of all, you need to clear your next step, which is done well by traditional aids like the white cane or the guide dog, though only some obstacles can be detected with those aids. Overhead obstacles are one example: some guide dogs do well at detecting high obstacles, but not all of them, especially if the user is very tall.

Then we have moving threats: people around you, scooters, pets. It's very difficult to predict the trajectories of things moving around you when you're blind. Even with a tool like the guide dog, it's complicated to anticipate those movements, so that part of the problem isn't really being solved.

Then we have what we call the last-mile hurdles, the transitions you need to make to get to a destination. Let's say you want to go from your house to church or to school. You have to clear the obstacles in your path and be aware of things moving around you, but you also have to get to the bus stop and find it, and most of that information, most of the cues, are visual. You have to get on the right bus, find an open seat inside the bus, and get off at the right stop. There are some solutions out there for that, like GPS, but my point is that many transitions still rely heavily on vision: finding the door, finding the entrance, finding the crosswalk, finding the seats, and many more.

And then there's the orientation part, which is a big one: making sure you are going in the right direction. Today we use GPS, but there are many orientation problems that GPS doesn't solve. Navigating indoors, for example (GPS wouldn't really help you there), or places where GPS precision is poor, like city centers or the countryside, where it's sometimes hard to get a precise fix. All these elements make mobility very hard.
But despite being a big challenge, we've seen how other industries have solved this effectively. In autonomous driving, we've seen how cars can get to any destination without the need for a driver; we have Waymo in many cities these days, working perfectly well. And if we want to go beyond that, we even have rockets that can come back from the sky at extremely high speeds and land.

The key to this technology is the use of cameras and AI, which haven't been used in this field before for mobility purposes. They've been used for other purposes, but they weren't effective enough to be used for mobility. So that's what we're doing here; that's what we've done with NOA. We call it NOA because it's an acronym: many people pronounce it "Noah," which is fine with us, and it stands for Navigation, Obstacles, and AI, the three core features of the product. This is the first time a device integrates all the elements that make mobility challenging into a single, hands-free solution.

So what is NOA? It's a mobility assistant that uses AI, in the form of a vest. It goes around your shoulders, like a backpack, but the straps don't go behind your arms; they stop right at your chest, and then you have two sides. On your right and left sides you have the cameras, which have a 170-degree field of view, so they can detect anything coming around you, close to you, to your sides, and in front of you, from the ground up to your head. It's important to say these are depth-perception cameras. They also emit infrared light that travels through the space and bounces off objects around you, which is very good because it makes the cameras very precise at assessing the proximity of things, and that infrared light also lets them operate at night or in places where there isn't enough light.

All the feedback happens through audio, through a headset; we use Shokz bone conduction headphones. On the right side of the device you have the computer that processes all the images, plus an interface with buttons that you can use to trigger all the features the device has. These three parts (the two front parts on the right and left of your chest, and the back part resting behind your neck) are connected with straps that can be molded and bent to the shape of your body. The back part is basically the battery storage unit; that's where the batteries are. The device comes with two batteries, so you can slide them in and out.

So the full set is basically the vest, which just rests on top of your shoulders, so you don't have to strap anything (we have an additional accessory in case you want a strap to make it feel more secure, but most users don't really use it), bent to find the best fit, plus the Shokz headphones that connect to the device, and the mobile app, which is free and available for Android and iOS. You'll use the app for onboarding onto the device and customizing it to your needs (I'll explain more about that in a moment), and also for receiving any updates available from biped.
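Just to make the depth-camera idea described a moment ago a bit more concrete, here is a minimal, hypothetical sketch of how a single depth frame from a wide-field depth camera could be scanned for a nearby obstacle. This is not biped's actual pipeline: the alert range, frame format, and function names are assumptions for illustration only, and a real system would filter noise, fuse many frames per second, and track objects over time.

```python
import numpy as np

# Illustrative only: one toy pass over a single depth frame from a
# depth-perception (IR-assisted) camera.

H_FOV_DEG = 170.0    # horizontal field of view mentioned in the talk
ALERT_RANGE_M = 3.0  # assumed alert distance, not a biped specification

def nearest_obstacle(depth_m: np.ndarray):
    """Return (distance_m, bearing_deg) of the closest valid pixel, or None.

    depth_m: 2-D array of per-pixel distances in meters; 0 means "no data"
    (typical for IR depth sensors when a surface is out of range).
    bearing_deg: negative = left of center, positive = right of center.
    """
    masked = np.where(depth_m > 0, depth_m, np.inf)
    row, col = np.unravel_index(np.argmin(masked), masked.shape)
    dist = float(masked[row, col])
    if not np.isfinite(dist) or dist > ALERT_RANGE_M:
        return None
    # Map the pixel column onto an angle across the horizontal field of view.
    bearing = (col / (depth_m.shape[1] - 1) - 0.5) * H_FOV_DEG
    return dist, bearing

# Example: a fake 4x6 depth frame with one close surface slightly to the right.
frame = np.full((4, 6), 5.0)
frame[2, 4] = 1.2
print(nearest_obstacle(frame))  # -> (1.2, 51.0): about 1.2 m away, to the right
```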
So I'm going to run this short video to show you how NOA works.

[Video narration] We're now on the shoulders of a user. That's Arthur. He's walking in an open area. biped detects an obstacle: a pedestrian standing on his right. The obstacle was detected by biped before the cane even touched it. Thanks to 3D sounds, Arthur knows that the obstacle is slightly to his right, and he continues on his way by taking a step to the left.

biped's AI is called Copilot. It understands in real time what's happening around the user, with a massive field of view of 170 degrees. The AI can then identify where an obstacle is located, whether the obstacle is a pedestrian or a car, and, most importantly, whether the obstacle has a risk of collision with the path the user is taking. We worked with a research institute from Honda, after getting inspiration from self-driving car technology. So if the pedestrian who was blocking the way starts to walk faster than you, they will not be detected as an obstacle, as your trajectories have no risk of collision.

As Arthur continues on his way, he hears a GPS instruction: turn right at 2 o'clock. Arthur then turns right, and a few meters further, hears a confirmation: continue straight. There seem to be a couple of obstacles ahead. Their lower-pitched sounds indicate to Arthur that there is a risk of collision with an obstacle located below waist level. For more details, Arthur presses the AI button located on the right side of the harness. The AI responds after a few seconds and says the following: "You're facing small posts, which separate you from a parking area. A car is parked about four meters ahead on your left. The footpath appears to continue on your right and then makes a sharp left turn along the side of the building."

Arthur resumes his path, finds a sidewalk, and after following the wall, identifies a hole on his right. biped then provides the next GPS instruction: turn right at 2 o'clock. Upon arriving at his destination, Arthur hears: "Your destination is on your right." And by pressing the AI button again, he gets an even more precise description to find the building's entrance: "You're facing the Corniche 5 building. The path in front of you is clear, and it seems that the entrance to the building is located opposite, across a courtyard about 10 meters away."

And that's an example of how biped uses artificial intelligence to offer three functionalities in a single device: an obstacle detector, a GPS, and an AI scene descriptor. We have designed biped as the ideal complement to a white cane or a guide dog. After three years of research and development, more than 250 test sessions, and hundreds of professionals and testers, we can't wait for you to take biped on your next adventures. [End of video]

Okay. So that's NOA in a nutshell. Now let's go quickly over the features. We discussed three core features: navigation, obstacles, and AI. Here's what's available for each one. Let's start with the navigation features, which go with the N in NOA. Of course, we have turn-by-turn directions; you saw that in the video, the device telling you to turn right or left,
or that you've reached your destination. You can adjust the way it speaks to you, using either standard directions like "turn right" or "turn slightly to your right," or clock-face instructions like "turn to two o'clock." We are also working on implementing cardinal directions, so that will be an option if you'd rather hear "turn east" or "turn southeast" instead of "turn right," or similar. The device will also reroute you if you get too far from your destination or walk in the wrong direction for more than a few feet.

You can also set up places you go often, favorite places, from the app. That could be your home, the school, your office, any particular address you want to add, and then you can use the buttons on the device to start a navigation route to that destination. And finally, this is something we just released: a web platform where you can create custom routes, which is great for O&Ms or people who are starting to explore new routes. You can create your own routes and place landmarks along the way, and that route and its landmarks will be accessible from your device; you'll hear them as you walk.

On the obstacle side, as I explained, the device connects to bone conduction headphones, so your ears stay available to hear any relevant environmental information. In the background, without covering your ears, you will hear beeps placed in space to indicate the location of obstacles. It has a 3D effect: if there's an obstacle coming from your right, you'll hear the beep on your right; if it's coming from your left, you'll hear the beep more on your left side. If it's a high obstacle, like a tree branch, you'll hear a high-pitched beep, beep, beep. If it's a low obstacle, like a trash can, you'll hear a lower pitch. And if it's an obstacle below ground level, like a hole or a staircase going down, it's an even lower pitch; it sounds like "bo, bo." There was an example in the video, though I can't replicate it exactly here. It basically covers everything across 170 degrees, on all your sides and in front of you, from the ground up to your head.

Again, something relevant to mention is that the device is smart about how it provides feedback. It doesn't beep at everything that happens around you, only if there's a risk of collision, only if the device assesses that your trajectory and the obstacle's trajectory are going to collide somewhere in the future. That's when you get the beeps. This is great because you get a lot fewer beeps than with any other aid out there, and you also get a sense of priority. For example, you can be walking with a person walking in front of you, but the person is walking faster than you, so it won't flag that person as a threat. But if there's a bicycle coming toward you very fast from your left, even if it's farther away, it will prioritize the bike and tell you, through the beeps, that there's an obstacle coming from your left, so you can stop, veer, or make any decision in real time.
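Since biped hasn't published its algorithm, what follows is only a rough sketch of what collision-risk-gated feedback of that kind could look like: an obstacle track produces a beep only if the projected trajectories pass close to the user within a short horizon, the beep is panned toward the obstacle's side, and the pitch drops for lower obstacles. All of the constants, field names, and thresholds below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Track:
    x: float; y: float    # obstacle position relative to the user, meters (x: right, y: forward)
    vx: float; vy: float  # obstacle velocity relative to the user, m/s
    level: str            # "head", "waist", "ground", or "below_ground"

# Assumed pitch coding from the talk: higher obstacles beep at a higher pitch.
PITCH_HZ = {"head": 880.0, "waist": 440.0, "ground": 330.0, "below_ground": 220.0}

def time_to_collision(t: Track, horizon_s=4.0, radius_m=0.8):
    """Time of closest approach if the paths come within radius_m, else None."""
    v2 = t.vx ** 2 + t.vy ** 2
    # If the obstacle is (nearly) static relative to the user, check "now".
    tca = 0.0 if v2 < 1e-6 else max(0.0, -(t.x * t.vx + t.y * t.vy) / v2)
    if tca > horizon_s:
        return None
    cx, cy = t.x + t.vx * tca, t.y + t.vy * tca   # closest-approach point
    return tca if (cx * cx + cy * cy) ** 0.5 < radius_m else None

def beep_for(t: Track):
    """Return (pan, pitch_hz, urgency) for a risky track, or None for no beep."""
    tca = time_to_collision(t)
    if tca is None:
        return None                        # trajectories never meet: stay quiet
    pan = max(-1.0, min(1.0, t.x / 3.0))   # -1 = fully left, +1 = fully right
    return pan, PITCH_HZ[t.level], 1.0 / (1.0 + tca)  # sooner -> more urgent

# A cyclist cutting in fast from the left beeps; a pedestrian ahead who is
# pulling away faster than you does not.
print(beep_for(Track(x=-4.0, y=2.0, vx=2.0, vy=-1.0, level="waist")))  # beeps, panned left
print(beep_for(Track(x=0.3, y=2.0, vx=0.0, vy=1.5, level="waist")))    # None
```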
Something very relevant, mentioned near the end of the video, is that by design we made NOA the best complement for canes and guide dogs. In fact, when you set it up for the first time in the app, it asks which primary aid you use for mobility, and if that's a cane or a guide dog, the device will behave differently. It also asks for things like the vision condition you have and your height, so it calibrates properly to your needs, and it works differently depending on your primary aid.

In a nutshell, cane users will get more auditory feedback than guide dog users, because guide dog users don't tend to have much trouble going around obstacles (the guide dog is making some of those decisions for the user), while cane users do need a little more information to make those decisions. What it will do for a guide dog user is help them know that an obstacle is coming and that the dog is about to make a decision. It anticipates the decision-making of the dog, so the user can be aware that the dog may change direction because of an obstacle. That's great, especially for people who are new to guide dogs and are still building trust in the dog; this can be something that reinforces that trust. There are other aspects that differ between a guide dog user and a cane user on the AI side, but I'll talk more about that on the next slide.

I also mentioned that there's full-body protection. This is new: if you look at past electronic travel aids, most of them are directional, with a very narrow field of view. Think about the handheld sonar devices, like the Miniguide or the BuzzClip; you have to aim them in the right direction to detect obstacles, and they will only tell you about obstacles you're aiming at. With this device, you don't have to worry about aiming, because it's already processing all the information around you and in front of you.

Another aspect of the obstacle detection element is that it never guides you continuously. It raises awareness of where things are, especially when they are a threat to your path, but it won't guide you the way a sighted guide would, and that's intentional, to encourage skill development. It's up to you what decision to take. The device only provides relevant information in real time, so you can decide whether you want to stop, veer, or continue, and of course be safe along the way and make sure you're going in the right direction.

Now, on the AI side. We have the depth-perception cameras, which are good at detecting obstacles in proximity, but they're also good for observing and understanding context, using AI. So you can do things like a full scene description: it will basically take a snapshot of your entire surroundings and describe everything from your left to your right, highlighting points of interest. It's very important to say this, because there are many other AI devices out there, but none of them are focused on mobility. They will give you general descriptions of things around you. What's great about NOA is that it highlights points of interest and skips the descriptions that are not necessary for mobility. It won't tell you, for example, what kind of clothes the person in front of you is wearing; it will just tell you, okay, it's a person looking at their phone, which is relevant because you can anticipate that they may not be paying attention to you. It will highlight the layout of the space you are in.
It will tell you whether you are in an indoor or outdoor space, if it sees a door, or any point of interest for mobility, like an elevator or a staircase, and where it is, using approximate distances as well. So it will tell you, okay, this is slightly to your left, eight feet away, and the exit is to your sharp right, about 10 feet away, for example. All of that, again, is relevant for mobility.

You can also use a short scene description. One thing I have to say about the AI: because we are using AI in the cloud, the device must be connected to the internet for these features to work. We usually use the user's phone hotspot for that. A full scene description can take maybe three to six seconds to come back. A short scene description is faster, maybe three to four seconds, and it's focused only on what's in front of you. You'll use this feature more when you are walking and want to know what's ahead: whether your path ahead is clear, or what kind of scene is coming up. It's shorter in the amount of information it provides, and more actionable. It will tell you, okay, there's a person in front, or there's a chair in front, you have to go around it on your left side, and after that you'll get to a crosswalk at the end of the street.

Another way to use the AI is by looking for objects. Let's say you don't want a description, but you're actually looking for something in particular: a door, a crosswalk, text, an open seat, or, if you're using a guide dog, a grassy area. You use the buttons on the device to navigate through the items you can search for, the different classes. So you click the buttons and you'll hear: doors, crosswalks, text, and then grassy area for dogs. Select one and it will scan the space and tell you, okay, there's a grassy area about 12 feet away, slightly to your right, next to a tall tree, or past a parking lot. So that's another, more intentional way to use it, for when you're really looking for something specific.

And something that's coming, which we're super excited about because I think it's going to be a game changer too, is live AI descriptions. We are beta testing it now and it will come this month, basically before the end of the month. Traditionally, what you do with NOA is click a button to trigger the AI, whether you want a full description, a short description, or to search for an object. With live AI descriptions, you can enable or disable it and it will constantly stream information to you as you walk. The information it provides is not as detailed as what you get in a full scene description, because it has to stay brief enough to be useful in real time. It basically tells you what's changing around you: okay, you're reaching the end of a street, you're in front of a crosswalk, you're getting close to a building, there's a bench to your left. As you walk, it just highlights things that are changing in your environment, so you can be more aware of how the scene is changing without having to scan for anything. At least the beta is going excellently, so we expect users will love this part as well.
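To summarize the four AI modes just described (full scene, short scene, find an object, and live descriptions), here is a small hypothetical sketch of how a client might phrase a request for each. None of these field names or values come from biped; the rough latencies and the mobility-first focus are simply restated from the talk as assumptions.

```python
from enum import Enum
from typing import Optional

class DescribeMode(Enum):
    FULL_SCENE = "full"    # whole surroundings, left to right (~3-6 s per the talk)
    SHORT_AHEAD = "short"  # only what's directly ahead (~3-4 s)
    FIND_OBJECT = "find"   # scan for one target: door, crosswalk, text, grassy area...
    LIVE = "live"          # continuous brief updates about what is changing

def build_request(mode: DescribeMode, target: Optional[str] = None) -> dict:
    """Hypothetical request builder -- not biped's API, just the modes' intent."""
    req = {"mode": mode.value, "focus": "mobility"}  # skip clothing, colors, etc.
    if mode is DescribeMode.FULL_SCENE:
        req.update(coverage_deg=170, detail="points_of_interest_with_distances")
    elif mode is DescribeMode.SHORT_AHEAD:
        req.update(coverage="path_ahead_only", detail="actionable_summary")
    elif mode is DescribeMode.FIND_OBJECT:
        if target is None:
            raise ValueError("FIND_OBJECT needs a target, e.g. 'grassy area'")
        req.update(target=target, detail="location_and_distance")
    else:  # LIVE
        req.update(detail="changes_only", max_words=12)
    return req

print(build_request(DescribeMode.FIND_OBJECT, target="grassy area"))
```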
Aside from all these features, we also support many special cases. The device can be used by people in wheelchairs; this is also a feature in beta, with a few users trying it. Basically, if you're using it with a wheelchair or a stroller, the device will filter out the frame of the wheelchair or stroller so it isn't detected as an obstacle, which is essentially the same thing we do with the guide dog and the cane. It will calibrate to the height of a wheelchair user and it will maximize the amount of feedback you get at ground level. Because, as I mentioned, this device is complementary, intended to be used with a primary aid, but people in wheelchairs tend to have a harder time using a white cane along with the chair, so we try to provide more information at ground level. That said, there's a limit to it: even though the device can detect things at ground level, the resolution is not as precise as what the cane will give you. It can detect things that are, let's say, a foot off the ground, but drop-offs below ground level and some curbs might not be detected, and very small things, like a small rock, will be hard to catch. Still, it's useful for detecting things like trash cans, staircases, and many other things at floor level.

We also have a hard-of-hearing mode. For people who have difficulty hearing the 3D audio, we can play different tones or sounds depending on whether the obstacle is to your right or your left, so the spatial cue isn't lost. The device can also connect to Bluetooth hearing devices; cochlear implants often use Bluetooth, so that's another possibility.

So, why NOA? What makes us different? We know there have been many other devices out there, some around for many years, and a few more being announced that will come in the next years. So what makes NOA stand out? First of all, I think what's quite groundbreaking is that never before has a mobility aid used cameras, specifically depth-perception cameras, to detect obstacles. That's a new technology that hasn't been seen here. If you look at previous technology, all of it uses sensors: sonar sensors, infrared sensors, laser sensors. Those are great, very precise at detecting obstacles and very efficient, but the problem is that they lack environmental understanding. There's only a limited amount of information you can get from a sensor, because sensors are essentially blind as well; they're just very good at detecting proximity to things. Cameras have the advantage that, with depth perception, they're good at detecting proximity to things, but they're also great at capturing more information about the environment that can be useful to the user.

Another key aspect of NOA is that it's designed to be the best complement to a primary aid. Going back to the other devices out there: many of them either claim to replace primary aids, replace the cane or the guide dog, or they don't claim to replace them but aren't designed to complement them well. We took the approach of making sure this is the best complement for a cane user, the best complement for a guide dog user, and the best complement for a wheelchair user. We are adapting our technology to different use cases.
And that's not only on the technology front, but also on the instructional front. We have different guides, tutorials, and training materials for O&Ms who are onboarding users who use guide dogs or canes. I think that's very relevant, and it's also quite innovative; no one has really paid much attention to this.

Another aspect that is quite unique is that we designed NOA to improve the O&M skills of the user. Like I said before, the device never guides you continuously. It raises awareness of where things are; it gives you the information a sighted guide would probably give you, but it's not moving you in any direction. It tells you, okay, there's this door, there's this bench, and you hear the beeps if there's an obstacle coming your way, but it's up to you to make the decision. It's up to you to use your cane and get to that door, or get to your final destination, or get to the open seat on a bus or in a restaurant. That's something I think many people appreciate.

And lastly, we assess risk in a smart way. Pretty much all existing tech basically triggers an alert, whether that's a beep or a haptic vibration, as soon as it detects an obstacle. We don't. We track multiple obstacles all the time, assessing their trajectories as they move around you, and assessing your trajectory, and only if there's a risk of collision between your trajectory and the trajectory of an obstacle around you do we prioritize it and trigger a beep, so you can be aware of it. In practice, what this means is that you get a lot fewer beeps than you would with other devices. In fact, we did a study comparing our device to traditional sonar devices and found that our device triggers 77% fewer alerts than existing aids. So that's why NOA is quite groundbreaking and is disrupting this market.

But let's see what our users say. Like I mentioned, before launching we beta tested the device with over 250 people. In Europe, the device has now been available for over three years, our users have walked over 2,000 miles using NOA, and we have users in nearly 40 countries right now. We get comments like this one, from Seascape: "I honestly don't think I ever purchased anything that has so much potential to be life-changing." I think it touches on a great point, especially because he has been with us since the beginning and has seen the evolution of the product, from just detecting obstacles to now using GPS navigation and AI descriptions. That's one of the aspects that is very powerful: this product is evolving constantly, and all the updates are available for free. Or from CS Line, a blind organization in Switzerland: "In my opinion, this is the most exciting technology we've seen in the last years." And we have a lot more testimonials like this, including video testimonials; if you want to hear from other users, you can just go to YouTube and search for NOA testimonials or biped testimonials, and we have a full list of all the videos that have been shared with us.

Like I said, we also ran studies, two of them. One was an impact study, where we asked our users how they're doing with their mobility after a period of time wearing NOA, and the second was the study comparing our device to others. In the impact study, we were evaluating O&M competence, and according to the results, users agree the device reduces the mental effort required to navigate.
That item scored 4.25 out of 5. Among the other significant results: besides reducing the mental effort needed to navigate, the device also reduces stress, it increased users' activity (they go out more often than before they had the device), and it improved their overall competence.

And like I said, we also have the other study. This one was published on medRxiv, which is a website for medical research publications, where we compared our device to another, in this case the BuzzClip, the one I have here on the screen, and we found that our device triggers 77% fewer alerts than this other device.

And that's not all. Another element that is key to our design is that we have taken into account the requests of orientation and mobility specialists. We consider O&Ms our users as well, alongside blind users, of course, with different features and different elements for them. So here's what we have for O&Ms.

First of all, we have the companion app. When you're giving training to someone using NOA, that person can hear what NOA is telling them in real time: the beeps, the descriptions, the turn-by-turn directions as they walk. But that information is not accessible to the O&M. So how will the O&M know what's happening? That's what this app is for. The app has three screens: a screen that shows a top-down view of the user and the obstacles around them (all obstacles appear as gray dots, but if an obstacle is at risk of collision with the user, the dot turns red, so the O&M knows that that red dot is being signaled to the user through beeps); a map of the current route with the last instruction given; and another view with the last description spoken to the user, so instead of hearing the voice, the O&M can read the description that was provided. That's the app.

Besides the app, the O&M also has the ability to adjust the device remotely. For example, if the O&M sees that the calibration can be optimized, by narrowing the field of view or the lateral detection of obstacles, or by extending the range, all those elements can be adjusted from the app they're using.

Besides this companion app for O&Ms, we also have onboarding in the app. This onboarding has 11 lessons, probably going to be 13 soon, which guide the user through all the features and simulate different scenarios, so users can experience the different beeps and descriptions before going out and using NOA. In fact, the user must clear all 11 lessons to unlock NOA so it can be used for the first time, which is quite interesting. The app will also guide you through building your profile, selecting your primary aid and so on; all of that is guided through the app and it's quite easy to do.

And then, not quite last (there are actually two more things), but this is a very important one: the web platform, where you can design routes for a user, and the user can download that route to the device. It's not only a custom route: you can also mark zones (okay, this is a danger zone, this is a safe zone) and place landmarks that are key to the route. All of that is done through this platform.
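To give a feel for what a custom route with landmarks and zones might look like as data, here is a toy sketch. The field names, coordinates, and the lookup helper are all made up for illustration; they are not the web platform's actual export format.

```python
# Illustrative only: one possible way to represent a custom route built on a
# web platform, with landmarks and danger/safe zones. Not biped's real format.
custom_route = {
    "name": "Home to bus stop",
    "waypoints": [                       # ordered GPS points the route follows
        {"lat": 46.5191, "lon": 6.5668},
        {"lat": 46.5197, "lon": 6.5674},
    ],
    "landmarks": [                       # announced as the user passes nearby
        {"lat": 46.5193, "lon": 6.5670, "label": "Mailbox on the right"},
    ],
    "zones": [                           # areas flagged during route design
        {"kind": "danger", "label": "Construction area",
         "center": {"lat": 46.5195, "lon": 6.5672}, "radius_m": 15},
        {"kind": "safe", "label": "Covered waiting area",
         "center": {"lat": 46.5198, "lon": 6.5675}, "radius_m": 10},
    ],
}

def nearby_announcements(route: dict, lat: float, lon: float, within_m: float = 20.0):
    """Toy lookup: landmark labels near the user, plus any zone the user is inside."""
    def dist_m(p):  # crude flat-earth distance, adequate at this scale (~46 N)
        return (((p["lat"] - lat) * 111_320) ** 2 +
                ((p["lon"] - lon) * 76_600) ** 2) ** 0.5
    hits = [lm["label"] for lm in route["landmarks"] if dist_m(lm) <= within_m]
    hits += [f'{z["kind"]} zone: {z["label"]}'
             for z in route["zones"] if dist_m(z["center"]) <= z["radius_m"]]
    return hits

print(nearby_announcements(custom_route, 46.5193, 6.5670))  # ['Mailbox on the right']
```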
So again, that's another great way to make sure you get the best use out of NOA.

And finally, the training assets. We have an O&M book, a full book that teaches you how to train someone on this device, whether they're using a cane or a guide dog, or whether they're transitioning from one tool to the other. All of that is in there. We also have training cheat sheets: the book gives you all the information, but if you're heading out and don't have time to bring the book with you, you can just carry the cheat sheet, which is a single pager with quick instructions, including the different exercises for onboarding someone to the device. And there's an O&M assessment at the end, where you can check off whether you trained a skill or showed a feature to the user, plus any relevant comments from your training session.

Something we are very close to launching soon is our webinars for O&Ms. These webinars are great; it's going to be a series of three. The first one is focused on the foundations of electronic travel aids and assistive technology for orientation and mobility. It's great, and it's not even about NOA; it's about all the technology that has been out there over the last 60 years, how it has evolved, what the new emerging technologies are, and how they can change the future of mobility. So I recommend you take at least that very first one. They will also be certified when we launch them. The next two webinars are more focused on the device itself: how to use it and how to train someone with it. We are also thinking about implementing a loaner program for O&Ms. You have to clear one webinar to unlock the next, and once you've cleared the second one, you'll be able to request a loaner, if one is available, so you can train someone with the device now that you've been certified.

So, to wrap it up: the price and the plans. When you purchase NOA, it comes with the vest, a headset, and two batteries. Each battery will give you about three hours of maximum use, and by maximum use I mean all features running at the same time, continuously. If you're only using one feature or another (say you're not using the obstacle detector, or not using GPS navigation), it can last longer, maybe four or five hours depending on how heavily you're using it, and you always have the second battery to slide in when the first one runs out. You get the device and two years of warranty for nearly $5,000, so $4,999, or you can pay almost half, $2,899, plus a $49 subscription for four years. And for organizations that want a demo unit, one you use to show the device or train people rather than for personal use, we also offer a discounted price.

So that's pretty much NOA. In a nutshell, it's fair to say it's like carrying three devices in one. Out there we have GPS systems, we have obstacle detectors, whether handheld or wearable, and we also have AI glasses that use cameras and AI. Well, we have all those systems together in NOA, focused on mobility.
So that means it's even better when it comes to mobility, because it has longer range, a broader field of view, more precise descriptions, and more information for GPS navigation. So that's basically it. I hope you appreciate NOA. I'm open for questions, so feel free to ask if you have any.

>>Donna: And while Marco is answering questions, there are a couple in the chat. Marco, if you wouldn't mind stopping your share, we're going to get switched over, and that'll give you time to answer those couple that are in the chat.

>>Marco: Oh, yeah. Okay, okay, let's see. Ooh, I didn't see these questions. Let me see. "Is it weather resistant?" Okay, I found the first one. Yes, to some extent: it's water resistant, not waterproof. Light rain outside is fine, but if it's raining too hard it may trigger false positives more often, because of the humidity and all the rain. And remember, we are using cameras, so if the lens gets wet or there are drops falling, the depth perception can be distorted a little. We don't recommend going out in heavy rain, or you can use an umbrella and then you should be fine, but light rain is no problem. Also, hot weather, cold weather: the device comes from a Swiss company, so it can handle very low temperatures over there and it's been fine, and likewise, I've been in Florida during summer and it worked fine.

"Can you hear the environment?" Yes, you can perfectly hear your environment while you're walking, and that's because we are using the Shokz bone conduction headphones, which allow you to hear your surroundings clearly. If the device is in the middle of a description and at that moment a person comes up to ask you something, or you want to pay attention to a specific sound, you can just click a button that will pause the device and put it in standby so you can pay attention to your surroundings, and then click the same button to unpause it. So it is very easy to shut it off when you need to pay attention to your surroundings. And not only that: you can also use a skip button to skip the current description, or use the rewind button. So if you missed a description because you were paying attention to an announcement out there, you can click the rewind button and it will replay the last description that was played. Okay.