Adaptive visual obstacle detection for mobile robots using monocular camera and ultrasonic sensor

Ibrahim K. Iyidir*, F. Boray Tek, Doğan Kircali

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

This paper presents a novel vision-based obstacle detection algorithm adapted from a powerful background subtraction algorithm: ViBe (VIsual Background Extractor). We describe an adaptive obstacle detection method that uses monocular color vision and an ultrasonic distance sensor. Our approach assumes an obstacle-free region in front of the robot in the initial frame; however, the method dynamically adapts to its environment in the succeeding frames. The adaptation is performed using a model update rule based on the ultrasonic distance sensor reading. Our detailed experiments validate the proposed concept and the ultrasonic-sensor-based model update.
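The abstract's idea of gating a ViBe-style background model update with an ultrasonic reading can be illustrated with a minimal sketch. This is not the authors' implementation; all names, thresholds, and the grayscale pixel representation are hypothetical assumptions. Each pixel keeps N past samples; a pixel matching enough samples is treated as ground, and the model is conservatively updated only when the ultrasonic sensor reports a clear path.

```python
import random

# Hypothetical ViBe-style parameters (not from the paper):
N, RADIUS, MIN_MATCHES, SUBSAMPLE = 20, 20, 2, 16

def init_model(frame):
    # Seed every pixel's sample set from the first frame,
    # which the method assumes is obstacle-free.
    return [[[px] * N for px in row] for row in frame]

def classify_and_update(model, frame, sonar_dist, clear_dist=50):
    """Return an obstacle mask for a grayscale frame; update the model
    only when the ultrasonic reading indicates a clear path ahead
    (a sketch of the sensor-gated update rule)."""
    path_clear = sonar_dist > clear_dist
    mask = [[0] * len(frame[0]) for _ in frame]
    for i, row in enumerate(frame):
        for j, px in enumerate(row):
            matches = sum(abs(px - s) <= RADIUS for s in model[i][j])
            if matches >= MIN_MATCHES:
                # Pixel fits the ground model; occasionally absorb it,
                # but only when the sonar confirms the path is clear.
                if path_clear and random.randrange(SUBSAMPLE) == 0:
                    model[i][j][random.randrange(N)] = px
            else:
                mask[i][j] = 1  # flag as a potential obstacle
    return mask
```

For example, initializing the model on a uniform ground frame and then presenting a frame with one deviating pixel marks only that pixel as an obstacle, while the sonar gate decides whether the model absorbs new appearance.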

Original language: English
Title of host publication: Computer Vision, ECCV 2012 - Workshops and Demonstrations, Proceedings
Publisher: Springer Verlag
Pages: 526-535
Number of pages: 10
Edition: PART 2
ISBN (Print): 9783642338670
DOIs
Publication status: Published - 2012
Externally published: Yes
Event: Computer Vision, ECCV 2012 - Workshops and Demonstrations, Proceedings - Florence, Italy
Duration: 7 Oct 2012 - 13 Oct 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 7584 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Computer Vision, ECCV 2012 - Workshops and Demonstrations, Proceedings
Country/Territory: Italy
City: Florence
Period: 7/10/12 - 13/10/12

Keywords

  • Obstacle detection
  • ViBe
  • Mobile robot
  • Ultrasonic sensor
