Fusion of perceptions for perceptual robotics

Ö Ciftcioglu*, M. S. Bittermann, I. S. Sariyildiz

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Fusion of perception information for perceptual robotics is described. Visual perception is modelled mathematically as a probabilistic process that obtains and interprets visual data from an environment. The visual data are processed in multiresolutional form via the wavelet transform, optimally estimated by extended Kalman filtering at each resolution level, and the outcomes are fused for each data block. The measurement involves visual perception in virtual reality, which has direct implications for both design and perceptual robotics, including navigation of actual autonomous robots. For interaction with the environment and visual data acquisition, the laser-beam approach from robotics is considered and implemented by means of an agent in virtual reality that plays the role of a robot in reality.
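The pipeline the abstract describes — decompose the visual signal into resolution levels, run a Kalman filter at each level, then fuse the filtered outcomes — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses a one-level Haar wavelet in place of a general wavelet transform, a scalar linear Kalman filter in place of the extended Kalman filter, and all parameter values (`q`, `r`, the sample signal) are assumptions.

```python
# Hedged sketch of multiresolution fusion: split a 1-D "perception"
# signal into approximation and detail levels with a Haar transform,
# smooth each level with a scalar Kalman filter, then invert the
# transform to obtain the fused estimate.

def haar_decompose(x):
    """One-level Haar analysis: returns (approximation, detail) coefficients."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse of haar_decompose: interleaves a+d and a-d back into the signal."""
    x = []
    for a, d in zip(approx, detail):
        x.extend([a + d, a - d])
    return x

def kalman_1d(z, q=1e-3, r=1e-2):
    """Scalar random-walk Kalman filter over a measurement sequence.
    q: assumed process-noise variance, r: assumed measurement-noise variance."""
    x_est, p = z[0], 1.0
    out = [x_est]
    for meas in z[1:]:
        p += q                       # predict: variance grows by process noise
        k = p / (p + r)              # Kalman gain
        x_est += k * (meas - x_est)  # update with the new measurement
        p *= (1 - k)
        out.append(x_est)
    return out

def fuse(signal):
    """Filter each resolution level, then recombine (the fusion step)."""
    approx, detail = haar_decompose(signal)
    return haar_reconstruct(kalman_1d(approx), kalman_1d(detail))

# Illustrative noisy input (even length required by the one-level Haar split).
noisy = [1.0, 1.2, 0.9, 1.1, 2.0, 2.1, 1.9, 2.05]
fused = fuse(noisy)
```

In the paper's setting the per-level estimator is an *extended* Kalman filter, since the perception measurement model is nonlinear; the scalar filter above only conveys the structure of filtering each resolution level independently before fusing.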

Original language: English
Title of host publication: Annual Conference of the North American Fuzzy Information Processing Society - NAFIPS
Pages: 511-518
Number of pages: 8
DOIs
Publication status: Published - 2006
Externally published: Yes
Event: NAFIPS 2006 - 2006 Annual Meeting of the North American Fuzzy Information Processing Society - Montreal, QC, Canada
Duration: 3 Jun 2006 - 6 Jun 2006

Publication series

Name: Annual Conference of the North American Fuzzy Information Processing Society - NAFIPS

Conference

Conference: NAFIPS 2006 - 2006 Annual Meeting of the North American Fuzzy Information Processing Society
Country/Territory: Canada
City: Montreal, QC
Period: 3/06/06 - 6/06/06
