Abstract
Appearance-based outdoor mapping and localization is especially challenging because the mapping and localization steps are usually performed separately, at different times of day. The problem is harder outdoors, where continuous change in the sun angle can drastically alter the appearance of a scene. In this work, we propose a method for instantaneous visual direction determination for autonomous mobile platforms, assuming the platform travels along a routine route. We propose a deep convolutional neural network based algorithm that classifies instantaneous images of the path to be followed. The model is tested on the SeqSLAM dataset and achieves a classification accuracy of 78.5%. The hidden-layer weights are analyzed to verify that learning has actually taken place. Experimental results suggest that deep neural networks yield high recognition rates on images used for autonomous movement. As future work, the approach will be tested on a novel dataset and its performance will be evaluated in real time.
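For illustration only, the sketch below shows the kind of convolutional classifier the abstract describes: a small CNN that maps a single path image to a direction class. The paper does not publish its architecture or code, so the framework (Keras), the three class labels, the input resolution, and all layer sizes here are assumptions, not the authors' model.

```python
# Minimal sketch (not the authors' code): a small CNN that classifies
# single path images into direction classes. Class names, input size,
# and layer sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_DIRECTIONS = 3  # assumed classes: left, straight, right


def build_direction_cnn(input_shape=(64, 64, 3), num_classes=NUM_DIRECTIONS):
    """Build a small convolutional classifier for path images."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_direction_cnn()
    model.summary()
    # Training would use (image, direction_label) pairs collected from
    # repeated traversals of the routine route, e.g.:
    # model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```

In this setup, the learned convolutional filters (the hidden-layer weights mentioned in the abstract) can be inspected after training, e.g. by visualizing `model.layers[0].get_weights()`, to check that meaningful features have been learned.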
| Translated title of the contribution | Deep learning based autonomous direction estimation |
| --- | --- |
| Original language | Turkish |
| Title of host publication | 2016 24th Signal Processing and Communication Application Conference, SIU 2016 - Proceedings |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1645-1648 |
| Number of pages | 4 |
| ISBN (Electronic) | 9781509016792 |
| DOIs | |
| Publication status | Published - 20 Jun 2016 |
| Event | 24th Signal Processing and Communication Application Conference, SIU 2016 - Zonguldak, Turkey. Duration: 16 May 2016 → 19 May 2016 |
Publication series

| Name | 2016 24th Signal Processing and Communication Application Conference, SIU 2016 - Proceedings |
| --- | --- |
Conference

| Conference | 24th Signal Processing and Communication Application Conference, SIU 2016 |
| --- | --- |
| Country/Territory | Turkey |
| City | Zonguldak |
| Period | 16/05/16 → 19/05/16 |
Bibliographical note

Publisher Copyright: © 2016 IEEE.