Abstract

Understanding the emotional appeal of paintings is a significant research problem related to affective image classification. The problem is challenging in part because of the scarcity of manually labeled paintings. Our work proposes to apply statistical models trained on photographs to infer the emotional appeal of paintings. Directly applying models learned from photographs to paintings does not yield accurate classifications, because visual features extracted from paintings and from natural photographs have different characteristics. This work presents an adaptive learning algorithm that leverages labeled photographs and unlabeled paintings to infer the emotional appeal of paintings. In particular, we iteratively adapt the feature distribution of photographs to fit that of paintings while maximizing the joint likelihood of the labeled and unlabeled data. We evaluate our approach on two emotion classification tasks: distinguishing positive from negative emotions, and differentiating reactive emotions from non-reactive ones. Experimental results show the potential of our approach.
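
The adaptation loop sketched in the abstract can be pictured as an EM-style self-training procedure: a classifier fit on labeled photograph features is repeatedly re-estimated using pseudo-labels on unlabeled painting features so that the joint likelihood of both sets increases. The code below is a minimal sketch under that reading; the function name, the Gaussian naive-Bayes model, and the down-weighting of painting samples are illustrative assumptions, not the authors' actual formulation.

    # Hypothetical EM-style adaptation from labeled photographs to unlabeled
    # paintings (illustrative sketch only, not the paper's exact model).
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    def adapt_photos_to_paintings(X_photo, y_photo, X_paint,
                                  n_iters=10, paint_weight=0.5):
        clf = GaussianNB()
        clf.fit(X_photo, y_photo)              # initialize on the photograph domain
        for _ in range(n_iters):
            pseudo = clf.predict(X_paint)      # E-step: pseudo-label the paintings
            X = np.vstack([X_photo, X_paint])
            y = np.concatenate([y_photo, pseudo])
            # M-step: re-fit on both domains, down-weighting pseudo-labeled samples
            w = np.concatenate([np.ones(len(y_photo)),
                                np.full(len(pseudo), paint_weight)])
            clf.fit(X, y, sample_weight=w)
        return clf

In this reading, each iteration alternates between labeling the paintings with the current model and re-fitting the model on photographs plus pseudo-labeled paintings, which is one standard way to maximize a joint likelihood over labeled and unlabeled data.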

Original language: English (US)
Title of host publication: Computer Vision - ECCV 2016 Workshops, Proceedings
Editors: Gang Hua, Hervé Jégou
Publisher: Springer Verlag
Pages: 48-63
Number of pages: 16
ISBN (Print): 9783319466033
State: Published - 2016
Event: Computer Vision - ECCV 2016 Workshops - Amsterdam, Netherlands
Duration: Oct 8, 2016 - Oct 16, 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9913 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Computer Vision - ECCV 2016 Workshops
Country/Territory: Netherlands
City: Amsterdam
Period: 10/8/16 - 10/16/16

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
