Its debut device will feature flexible vibrating bases that wrap around each foot and slip into each shoe, plus a small pack, resembling an AirPods case, that can be detached for charging. Once fitted, the vibrating components sit along the sensitive nerves on the foot and send coded walking instructions to the user.

In its present form, the device is intended to help people with low vision follow a route generated by a smartphone app. By taking over the attention normally spent listening to voice directions or rechecking a phone, it helps eliminate distractions so that visually impaired users can concentrate on the safety of the environments in which they are walking, explains Wataru Chino, representative director and CEO of Ashirase. That leaves their hearing free for traffic sounds and signals, like warning beeps at crossings or sidewalks, and their hands free to carry walking canes or other belongings. The device is not meant to warn against upcoming real-time obstacles, only to provide simple, general navigation directions.

Inside the device are motion sensors (accelerometers and gyro sensors) and a built-in compass for orientation. It pairs over Bluetooth with an app; the smartphone running it needs a cellular connection, and the app does not yet support offline maps. The app uses information from Google Maps or similar vendors to draw up a walking route to the destination.

Based on that route and how the user is walking, the app sends signals to the device that trigger vibrations on the sides and front of the foot. A regular pulse at the front of the foot tells users they are on the right track and should keep going straight. The vibrations speed up as the user approaches a turn and needs to stop, and they occur in either the left or right shoe to signal which direction the user needs to turn.
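The cue scheme described above can be summarized as a simple mapping from navigation events to vibration patterns. The sketch below is illustrative only: the zone names and pulse rates are hypothetical placeholders, since the article does not specify Ashirase's actual frequencies or firmware interface.

```python
from enum import Enum


class Cue(Enum):
    """Navigation events the app can signal, per the article's description."""
    STRAIGHT = "straight"
    APPROACHING_TURN = "approaching_turn"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"


def vibration_pattern(cue: Cue) -> dict:
    """Return which shoe vibrates, where on the foot, and at what relative rate.

    Rates (rate_hz) are made-up values chosen only to show the ordering the
    article implies: a regular pulse for "straight", a faster one near turns.
    """
    if cue is Cue.STRAIGHT:
        # Regular pulse at the front of the foot: on track, keep going.
        return {"shoe": "both", "zone": "front", "rate_hz": 1.0}
    if cue is Cue.APPROACHING_TURN:
        # Sped-up pulses: a turn is coming and the user needs to stop.
        return {"shoe": "both", "zone": "front", "rate_hz": 3.0}
    if cue is Cue.TURN_LEFT:
        # Vibration in the left shoe only: turn left.
        return {"shoe": "left", "zone": "side", "rate_hz": 2.0}
    # Vibration in the right shoe only: turn right.
    return {"shoe": "right", "zone": "side", "rate_hz": 2.0}
```

The key design point, as the article describes it, is that each cue is distinguishable without sight or sound: location (front vs. side, left vs. right shoe) encodes direction, and pulse rate encodes urgency.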
Ashirase’s new tech is waterproof and washable. A single charge at the end of the week powers it through seven days of walking, assuming three active hours each day.
An upcoming beta trial and lingering questions around user experience
The system is focused on outdoor navigation for now: it will not work indoors in this first version, though the team is exploring Wi-Fi or Bluetooth beacons to position users in those scenarios. It can be fitted in two types of shoes: sneakers and leather lace-ups.

Ashirase is targeting its product at the 12 million people in developed countries who live with severe vision problems but are not completely blind. A pilot of the beta version, planned for October or November this year, will let visually impaired users try out the device for a week and generate feedback data. A wide release is expected in October 2022. Ashirase will initially market its tech to the 1.45 million people with low vision in Japan. It will be available on a subscription basis for 2,000 to 3,000 yen per month ($18 to $27), on top of the one-time cost of the device itself, which has not yet been disclosed. The company pitches its system as less costly than taxis or guiding helpers, more readily available than guide dogs, and more convenient than assistance from a family member.

Erich Manser, a digital accessibility consultant at Harvard University, says the new tech sounds “cool and promising,” though he has some questions about its user experience. He wonders how distinguishable, in practice, the vibrating prompts will be from the natural bumps and vibrations a person feels from the act of walking itself. “Is it easily something that you can tell is definitely different than just walking?” he asks. He is also curious how complete a solution this standalone tech is for helping those with low vision go about their day independently.
Manser, who has gradually lost vision over the years due to a disease called retinitis pigmentosa, has previously tested a system called Aira, which streams what he sees in real time through Google Glass to a live agent who can then direct him through a Bluetooth headset, to help him run the Boston Marathon in 2017.

Manser is interested in seeing where, how, and how often people will actually use this tech. He admits that he himself has not been the most consistent user of most of these technologies. “They’re exciting, they’re innovative. When they’re new, it’s a cool factor… But when I was around my house, I didn’t really use [Aira] much,” he says. “I would use it frequently if I had to travel into Boston, take the train in, [if] I was going to a business meeting somewhere at an unfamiliar location, that’s where I was a frequent user.”

He explains that this is because Aira, like a lot of other new accessibility assistance technologies, including Seeing AI and the Be My Eyes app, is still clunky and can’t fully provide a “seamless” experience. “It has a very separate feel. If I’m using Aira and I’m walking down a city sidewalk, it’s hard for me to engage with other people around me because I’m involved in that activity and it’s all-consuming,” he says. “I’m hopeful for a time where it just becomes a way that naturally integrates into our lives rather than something that is an additional thing. I think that would be a good overarching goal to aspire to.”