Abstract
One of the main challenges for navigation systems is the lack of orientation capability and insufficient localization accuracy in indoor spaces. There are situations where navigation must function indoors with high accuracy; one such example is the task of safely guiding visually impaired people from one place to another inside a building. In this study, to increase indoor localization performance, a novel method was proposed that estimates the step length of a visually impaired person using machine learning models. Thereby, once the person's initial position is known, their new position can be predicted by measuring the length of their steps. The step length estimation system was trained using data from three separate devices: capacitive bend sensors, a smartphone, and WeWALK, a smart cane developed to assist visually impaired people. Among the various machine learning models evaluated, the best result was obtained with the K-Nearest Neighbors model, with an R² score of 0.945. These results support the claim that indoor navigation will be possible through step length estimation.
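As a concrete illustration of the kind of model named in the abstract, the sketch below implements a minimal K-Nearest Neighbors regressor for step length from scratch. The features (acceleration peak, step frequency) and all numeric values are hypothetical placeholders for illustration only; they are not data or parameters from the paper.

```python
import math

def knn_predict(train_X, train_y, query, k=3):
    """Predict step length as the mean label of the k nearest training samples
    (Euclidean distance in feature space)."""
    ranked = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    return sum(y for _, y in ranked[:k]) / k

# Hypothetical features: (acceleration peak in m/s^2, step frequency in Hz)
train_X = [(1.2, 1.6), (1.5, 1.8), (1.1, 1.5), (1.8, 2.0), (1.4, 1.7)]
train_y = [0.60, 0.68, 0.58, 0.75, 0.65]  # step length in metres (made up)

estimate = knn_predict(train_X, train_y, (1.3, 1.65), k=3)
print(f"estimated step length: {estimate:.2f} m")
```

In practice the authors report training on sensor streams from bend sensors, a smartphone, and the WeWALK smart cane; a library implementation such as scikit-learn's `KNeighborsRegressor` would replace this hand-rolled version.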
Original language | English |
---|---|
Host publication title | FLEPS 2020 - IEEE International Conference on Flexible and Printable Sensors and Systems |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9781728152783 |
DOIs | |
Publication status | Published - 16 Aug 2020 |
Event | 2020 IEEE International Conference on Flexible and Printable Sensors and Systems, FLEPS 2020 - Virtual, Manchester, United Kingdom. Duration: 16 Aug 2020 → 19 Aug 2020 |
Publication series
Name | FLEPS 2020 - IEEE International Conference on Flexible and Printable Sensors and Systems |
---|
Conference
Conference | 2020 IEEE International Conference on Flexible and Printable Sensors and Systems, FLEPS 2020 |
---|---|
Country/Territory | United Kingdom |
City | Virtual, Manchester |
Period | 16/08/20 → 19/08/20 |
Bibliographic note
Publisher Copyright: © 2020 IEEE.
Funding
Acknowledgment: This research was funded by a grant from the Scientific Research Project Unit of Istanbul Technical University, Grant No: MGA-2018-41481.
Funders | Funder number |
---|---|
Istanbul Technical University | MGA-2018-41481 |