Visual perception with color for architectural aesthetics

Michael S. Bittermann, Ozer Ciftcioglu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

Studies on computer-based visual perception and aesthetic judgment for architectural design are presented. The model jointly takes into account both the color and the geometric aspects of human vision, quantifying the perception of an individual object as well as of a scene consisting of several objects. This is accomplished by fuzzy neural tree processing. Based on the perception model, aesthetic color compositions for a scene are identified using a multi-objective evolutionary algorithm. The methodology is described together with associated computer experiments verifying the theoretical considerations. Modeling aesthetic judgment is a significant step for applications where human-like visual perception and cognition are of concern, such as architectural design, product design, and urbanism.
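The abstract's multi-objective step can be illustrated with a toy sketch. This is not the authors' code: the candidate compositions and the two objective scores (stand-ins for perception-derived criteria such as harmony and contrast) are invented for illustration, and only the Pareto-dominance selection that underlies the "Pareto front" keyword is shown.

```python
def pareto_front(candidates):
    """Return the names of candidates not dominated by any other candidate.

    A candidate dominates another if it is at least as good in every
    objective and strictly better in at least one (maximization).
    """
    front = []
    for i, (name, objs) in enumerate(candidates):
        dominated = any(
            all(o2 >= o1 for o1, o2 in zip(objs, other))
            and any(o2 > o1 for o1, o2 in zip(objs, other))
            for j, (_, other) in enumerate(candidates)
            if j != i
        )
        if not dominated:
            front.append(name)
    return front

# Hypothetical color compositions with (harmony, contrast) scores.
candidates = [
    ("warm",  (0.9, 0.3)),
    ("cool",  (0.4, 0.8)),
    ("muted", (0.3, 0.2)),  # dominated by "warm" in both objectives
    ("mixed", (0.7, 0.6)),
]
print(pareto_front(candidates))  # → ['warm', 'cool', 'mixed']
```

In the paper's setting, an evolutionary algorithm would generate and refine such candidates over many generations; the sketch only shows the dominance test that defines the resulting front.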

Original language: English
Title of host publication: 2016 IEEE Congress on Evolutionary Computation, CEC 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3909-3916
Number of pages: 8
ISBN (Electronic): 9781509006229
DOIs
Publication status: Published - 14 Nov 2016
Externally published: Yes
Event: 2016 IEEE Congress on Evolutionary Computation, CEC 2016 - Vancouver, Canada
Duration: 24 Jul 2016 - 29 Jul 2016

Publication series

Name: 2016 IEEE Congress on Evolutionary Computation, CEC 2016

Conference

Conference: 2016 IEEE Congress on Evolutionary Computation, CEC 2016
Country/Territory: Canada
City: Vancouver
Period: 24/07/16 - 29/07/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Architectural design
  • Color difference
  • Fuzzy neural tree
  • Genetic algorithm
  • Pareto front
  • Visual perception
