Derin Öğrenme Modelleri ile Eskiz Sınıflandırma

Translated title of the contribution: Sketch classification with deep learning models

Fevziye Irem Eyiokur, Dogucan Yaman, Hazim Kemal Ekenel

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution > peer-review

2 Citations (Scopus)

Abstract

The sketch classification problem is challenging for several reasons, such as the absence of color and texture information, the lack of detailed object information, and a quality that depends on the drawing ability of the person. In this study, the sketch classification problem is addressed using deep convolutional neural network models. Specifically, the effect of domain adaptation is examined when fine-tuning convolutional neural networks for sketch classification. By employing domain adaptation, classification accuracy is increased by around 3%. The proposed system, which utilizes the VGG-16 network model and performs two-stage fine-tuning, outperforms previous state-of-the-art approaches on the TU Berlin sketch dataset, reaching 79.72% accuracy.
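
The abstract describes a two-stage fine-tuning of an ImageNet-pretrained VGG-16 for sketch classification. The following is a minimal illustrative sketch of that idea, assuming PyTorch and torchvision; the dataset paths, the choice of intermediate domain-adaptation dataset, and the training hyperparameters are placeholders and are not taken from the paper.

```python
# Hypothetical two-stage fine-tuning sketch (not the authors' implementation):
# stage 1 adapts ImageNet features to a sketch-like intermediate domain,
# stage 2 fine-tunes on the 250-class TU Berlin sketch dataset.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

def build_vgg16(num_classes: int) -> nn.Module:
    """Load ImageNet-pretrained VGG-16 and replace its final classifier layer."""
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(4096, num_classes)
    return model

def fine_tune(model: nn.Module, loader: DataLoader, epochs: int, lr: float,
              device: str = "cuda") -> nn.Module:
    """One fine-tuning stage: plain cross-entropy training of the whole network."""
    model = model.to(device).train()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model

# Grayscale sketches are replicated to 3 channels to match VGG-16's input.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Stage 1: domain adaptation on an intermediate sketch-like dataset
# ("/data/intermediate_sketches" is a placeholder path).
intermediate = datasets.ImageFolder("/data/intermediate_sketches", transform=preprocess)
model = build_vgg16(num_classes=len(intermediate.classes))
model = fine_tune(model, DataLoader(intermediate, batch_size=32, shuffle=True),
                  epochs=5, lr=1e-3)

# Stage 2: re-head the classifier and fine-tune on TU Berlin
# ("/data/tu_berlin" is a placeholder path).
tu_berlin = datasets.ImageFolder("/data/tu_berlin", transform=preprocess)
model.classifier[6] = nn.Linear(4096, len(tu_berlin.classes))
model = fine_tune(model, DataLoader(tu_berlin, batch_size=32, shuffle=True),
                  epochs=10, lr=1e-4)
```

The design point this sketch illustrates is simply that the network sees an intermediate, sketch-like domain before the target dataset, which is the "domain adaptation" step the abstract credits with the roughly 3% accuracy gain.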

Translated title of the contribution: Sketch classification with deep learning models
Original language: Turkish
Title of host publication: 26th IEEE Signal Processing and Communications Applications Conference, SIU 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-4
Number of pages: 4
ISBN (Electronic): 9781538615010
DOIs
Publication status: Published - 5 Jul 2018
Event: 26th IEEE Signal Processing and Communications Applications Conference, SIU 2018 - Izmir, Turkey
Duration: 2 May 2018 – 5 May 2018

Publication series

Name: 26th IEEE Signal Processing and Communications Applications Conference, SIU 2018

Conference

Conference: 26th IEEE Signal Processing and Communications Applications Conference, SIU 2018
Country/Territory: Turkey
City: Izmir
Period: 2/05/18 – 5/05/18

Bibliographical note

Publisher Copyright:
© 2018 IEEE.
