Life Assisting Tools

We integrate state-of-the-art sensing and sensory stimulation with Light detection and ranging (LiDAR), Machine learning, Advanced haptic naviGatIon system, and wEarable sensors (iMagine) to create a real-time haptic 3D map of the environment, combined with auditory feedback.
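As an illustration of this idea, the sketch below maps LiDAR distance readings to haptic cue intensities: each sector of the scan reacts to its nearest obstacle, with closer obstacles producing stronger vibration. The function names, the 4 m range, and the linear mapping are illustrative assumptions, not the project's actual calibration.

```python
def haptic_intensity(distance_m, max_range_m=4.0):
    """Map a LiDAR distance reading to a vibration intensity in [0, 1].

    Closer obstacles yield stronger vibration; readings at or beyond
    max_range_m yield none. (Linear mapping chosen for illustration.)
    """
    if distance_m >= max_range_m:
        return 0.0
    return round(1.0 - distance_m / max_range_m, 3)


def sector_feedback(sector_distances):
    """Reduce one scan sector to a single haptic cue by reacting
    to the nearest obstacle in that sector."""
    return haptic_intensity(min(sector_distances))


# Example: a 3-sector scan (left, center, right), distances in meters.
scan_sectors = [[3.8, 4.2], [0.9, 1.1], [2.0, 2.5]]
cues = [sector_feedback(s) for s in scan_sectors]
# → [0.05, 0.775, 0.5]: a strong cue for the close obstacle ahead,
#   a moderate one on the right, almost none on the left.
```

In a real system each cue would drive a vibration motor on the wearable, while an audio channel could encode the same distances as pitch or spoken alerts.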


We convey details of the surroundings to visually impaired users through wearable devices equipped with haptic, audio, and other types of sensors.


Based on users' health-monitoring information (e.g., weight, bone density), we observe their gait and location in real time and apply machine learning techniques to provide feedback that helps prevent falls.
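A minimal sketch of that feedback loop, assuming a toy logistic model over hypothetical gait features (the feature names, weights, and threshold are invented for illustration and would in practice be learned from the users' monitoring data):

```python
import math

# Illustrative model weights over gait features; in the real system
# these would be fitted per user from health-monitoring data.
WEIGHTS = {"stride_var": 4.0, "sway_cm": 0.6, "cadence_drop": 2.5}
BIAS = -3.0


def fall_risk(features):
    """Return an estimated fall-risk probability in (0, 1)
    from a dict of gait features (logistic model)."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


def feedback(features, threshold=0.5):
    """Trigger a warning cue when estimated risk exceeds the threshold."""
    return "warn" if fall_risk(features) > threshold else "ok"


steady = {"stride_var": 0.1, "sway_cm": 1.0, "cadence_drop": 0.2}
unsteady = {"stride_var": 0.8, "sway_cm": 3.5, "cadence_drop": 0.9}
# feedback(steady) → "ok"; feedback(unsteady) → "warn"
```

The warning would be delivered through the same haptic and audio channels as the navigation cues, so the user receives fall-prevention feedback without looking at a screen.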