

Poster

EmoEdit: Evoking Emotions through Image Manipulation

Jingyuan Yang · Jiawei Feng · Weibin Luo · Dani Lischinski · Daniel Cohen-Or · Hui Huang


Abstract:

Affective Image Manipulation (AIM) seeks to modify user-provided images to evoke specific emotional responses. This task is inherently complex due to its twofold objective: strongly evoking the intended emotion while preserving the original image composition. Existing AIM methods primarily adjust color and style, and often fail to elicit precise and profound emotional shifts. Drawing on psychological insights, we introduce EmoEdit, which extends AIM by incorporating content modifications to enhance emotional impact. Specifically, we first construct EmoEditSet, a large-scale AIM dataset comprising 40,120 data pairs built through emotion attribution and data construction. To make existing generative models emotion-aware, we design the Emotion adapter and train it on EmoEditSet. We further propose an instruction loss to capture the semantic variations in the data pairs. Our method is evaluated both qualitatively and quantitatively, demonstrating superior performance compared to existing state-of-the-art techniques. Additionally, we showcase the portability of our Emotion adapter to other diffusion-based models, enhancing their emotion knowledge with diverse semantics.
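To make the adapter-plus-instruction-loss idea concrete, below is a minimal, hypothetical PyTorch sketch of how an emotion adapter might inject emotion-conditioned tokens into a frozen diffusion backbone, trained with a denoising loss plus an instruction-style term. All names (EmotionAdapter, training_step, lambda_instr), the token dimensions, and the exact form of the instruction loss are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch: an "Emotion adapter" for a frozen diffusion backbone.
# Everything here is an assumption made for illustration; the real EmoEdit
# architecture and loss may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmotionAdapter(nn.Module):
    """Maps a discrete emotion label to extra conditioning tokens that the
    frozen diffusion model can attend to alongside its text tokens."""
    def __init__(self, num_emotions=8, token_dim=768, num_tokens=4):
        super().__init__()
        self.emotion_embed = nn.Embedding(num_emotions, token_dim)
        self.to_tokens = nn.Sequential(
            nn.Linear(token_dim, token_dim * num_tokens),
            nn.GELU(),
        )
        self.num_tokens = num_tokens
        self.token_dim = token_dim

    def forward(self, emotion_ids):
        # emotion_ids: (B,) integer labels, e.g. 0 = "amusement"
        e = self.emotion_embed(emotion_ids)                       # (B, D)
        tokens = self.to_tokens(e)                                # (B, D * T)
        return tokens.view(-1, self.num_tokens, self.token_dim)  # (B, T, D)

def training_step(unet, adapter, latents, text_tokens, emotion_ids,
                  src_emb, tgt_emb, lambda_instr=0.1):
    """One hypothetical training step: a standard denoising loss, plus an
    'instruction' term that nudges the emotion embedding toward the semantic
    shift between source and target captions (src_emb -> tgt_emb)."""
    noise = torch.randn_like(latents)
    t = torch.randint(0, 1000, (latents.size(0),), device=latents.device)
    noisy = latents + noise  # stand-in for the real forward-diffusion q_sample

    # Concatenate adapter tokens with the text conditioning.
    cond = torch.cat([text_tokens, adapter(emotion_ids)], dim=1)
    pred = unet(noisy, t, cond)                         # predicted noise

    denoise_loss = F.mse_loss(pred, noise)
    # Instruction term: align the emotion embedding with the semantic
    # change implied by the paired data.
    instr_loss = 1.0 - F.cosine_similarity(
        tgt_emb - src_emb,
        adapter.emotion_embed(emotion_ids),
        dim=-1,
    ).mean()
    return denoise_loss + lambda_instr * instr_loss

if __name__ == "__main__":
    B, D = 2, 768
    adapter = EmotionAdapter(token_dim=D)
    dummy_unet = lambda x, t, cond: torch.zeros_like(x)  # placeholder backbone
    loss = training_step(
        dummy_unet, adapter,
        latents=torch.randn(B, D),
        text_tokens=torch.randn(B, 16, D),
        emotion_ids=torch.tensor([0, 1]),
        src_emb=torch.randn(B, D),
        tgt_emb=torch.randn(B, D),
    )
    loss.backward()
```

In this reading, only the adapter is trained while the backbone stays frozen, which would also explain the claimed portability: the learned emotion tokens can be concatenated into the conditioning stream of other diffusion-based models without retraining them.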
