Abstract
One of the main challenges for navigation systems is the difficulty of orientation and the insufficient localization accuracy in indoor spaces. There are situations where navigation is required to function indoors with high accuracy; one such example is the task of safely guiding visually impaired people from one place to another indoors. In this study, to increase indoor localization performance, a novel method is proposed that estimates the step length of a visually impaired person using machine learning models. Once the initial position of the person is known, it is then possible to predict their new position from the estimated length of their steps. The step length estimation system was trained using data from three separate devices: capacitive bend sensors, a smartphone, and WeWALK, a smart cane developed to assist visually impaired people. Among the machine learning models evaluated, the best result was obtained with the K-Nearest Neighbors model, with an R² score of 0.945. These results support the feasibility of indoor navigation through step length estimation.
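As a rough illustration of the approach described in the abstract, the sketch below trains a K-Nearest Neighbors regressor to predict step length and then uses the prediction in a simple dead-reckoning position update. This is a minimal sketch under stated assumptions, not the authors' implementation: the feature names, the synthetic data, and the heading input are all illustrative stand-ins for the real signals from the bend sensors, smartphone, and WeWALK cane.

```python
# Minimal sketch (not the authors' code): KNN regression for step length
# estimation plus a simple dead-reckoning position update.
# Feature layout, data, and heading source are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical feature matrix: one row per detected step, e.g.
# [bend_sensor_peak, accel_variance, step_duration_s, cane_swing_rate]
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
# Hypothetical target: step length in metres
y = 0.7 + 0.05 * X[:, 0] - 0.03 * X[:, 2] + rng.normal(scale=0.02, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = KNeighborsRegressor(n_neighbors=5).fit(X_train, y_train)
print("R^2 on held-out steps:", r2_score(y_test, model.predict(X_test)))

def update_position(position, heading_rad, step_features):
    """Advance an estimated 2-D position by one predicted step length."""
    step_len = model.predict(step_features.reshape(1, -1))[0]
    return position + step_len * np.array([np.cos(heading_rad), np.sin(heading_rad)])

pos = np.array([0.0, 0.0])                        # known initial position
pos = update_position(pos, np.pi / 2, X_test[0])  # one step heading "north"
print("Estimated position after one step:", pos)
```

In practice the heading would come from a compass or gyroscope and the features from the real sensor streams; the point of the sketch is only to show how a per-step length estimate can be chained into a position estimate once the starting point is known.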
Original language | English |
---|---|
Title of host publication | FLEPS 2020 - IEEE International Conference on Flexible and Printable Sensors and Systems |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9781728152783 |
DOIs | |
Publication status | Published - 16 Aug 2020 |
Event | 2020 IEEE International Conference on Flexible and Printable Sensors and Systems, FLEPS 2020 - Virtual, Manchester, United Kingdom. Duration: 16 Aug 2020 → 19 Aug 2020 |
Publication series
Name | FLEPS 2020 - IEEE International Conference on Flexible and Printable Sensors and Systems |
---|---|
Conference
Conference | 2020 IEEE International Conference on Flexible and Printable Sensors and Systems, FLEPS 2020 |
---|---|
Country/Territory | United Kingdom |
City | Virtual, Manchester |
Period | 16/08/20 → 19/08/20 |
Bibliographical note
Publisher Copyright: © 2020 IEEE.