Şekil tabanlı hızlı yürüme yöntemi ile nesne bölütleme ve tanıma

Translated title of the contribution: Segmentation and recognition system with shape-driven fast marching methods

Abdulkerim Çapar*, Muhittin Gökmen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

We present a variational framework that integrates statistical boundary shape models into a Level Set system capable of both segmenting and recognizing objects. Since we aim to recognize objects, we trace the active contour and stop it near the real object boundaries by inspecting the shape of the contour, rather than forcing the contour toward an a priori shape. The system outputs the locations of character boundaries together with the character labels. We developed a promising local front-stopping scheme based on both image and shape information. We also propose a new object boundary shape signature model based on directional Gaussian gradient filter responses. The character recognition system employing the new boundary shape descriptor outperformed well-known boundary signatures such as centroid distance and curvature.
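The abstract names two boundary shape signatures, the proposed descriptor built from directional Gaussian gradient filter responses and the classic centroid-distance baseline, but gives no implementation details. The sketch below is a minimal illustration of what such signatures can look like, assuming a binary object mask and a grey-level image; the function names, the boundary-sampling strategy, the filter scale sigma, and the number of orientations are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: NOT the authors' implementation.
# Computes two boundary shape signatures for an object in an image:
#   1) a centroid-distance signature (a baseline named in the abstract)
#   2) a hypothetical signature built from directional Gaussian-gradient
#      responses sampled along the object boundary.
import numpy as np
from scipy.ndimage import gaussian_filter

def boundary_points(mask, n_samples=128):
    """Return n_samples (row, col) points sampled along the object boundary."""
    m = mask.astype(bool)
    # Boundary = object pixels with at least one background 4-neighbour.
    interior = (m & np.roll(m, 1, 0) & np.roll(m, -1, 0)
                  & np.roll(m, 1, 1) & np.roll(m, -1, 1))
    rows, cols = np.nonzero(m & ~interior)
    # Order points by angle around the centroid (adequate for simple shapes).
    cy, cx = rows.mean(), cols.mean()
    order = np.argsort(np.arctan2(rows - cy, cols - cx))
    rows, cols = rows[order], cols[order]
    idx = np.linspace(0, len(rows) - 1, n_samples).astype(int)
    return rows[idx], cols[idx], (cy, cx)

def centroid_distance_signature(mask, n_samples=128):
    """Distance from the centroid to each sampled boundary point, scale-normalised."""
    r, c, (cy, cx) = boundary_points(mask, n_samples)
    d = np.hypot(r - cy, c - cx)
    return d / d.max()

def directional_gradient_signature(image, mask, sigma=2.0, n_samples=128, n_dirs=4):
    """Sample steered Gaussian-gradient magnitudes along the boundary.

    The x/y first derivatives of a Gaussian are combined into directional
    responses for n_dirs orientations; the signature stacks the responses
    observed at each boundary sample (shape: n_samples x n_dirs).
    """
    img = np.asarray(image, dtype=float)
    gy = gaussian_filter(img, sigma, order=(1, 0))   # d/dy response
    gx = gaussian_filter(img, sigma, order=(0, 1))   # d/dx response
    r, c, _ = boundary_points(mask, n_samples)
    thetas = np.linspace(0.0, np.pi, n_dirs, endpoint=False)
    sig = [np.abs(np.cos(t) * gx[r, c] + np.sin(t) * gy[r, c]) for t in thetas]
    return np.stack(sig, axis=1)
```

A recognizer along these lines would compare the signature of a traced contour against stored character models, for example by nearest-neighbour matching after normalisation; the paper's actual matching and front-stopping rules are not reproduced here.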

Original language: Turkish
Title of host publication: 2006 IEEE 14th Signal Processing and Communications Applications Conference
DOIs
Publication status: Published - 2006
Event: 2006 IEEE 14th Signal Processing and Communications Applications - Antalya, Turkey
Duration: 17 Apr 2006 - 19 Apr 2006

Publication series

Name: 2006 IEEE 14th Signal Processing and Communications Applications Conference
Volume: 2006

Conference

Conference: 2006 IEEE 14th Signal Processing and Communications Applications
Country/Territory: Turkey
City: Antalya
Period: 17/04/06 - 19/04/06
