Human Intuition and Machine Generation: Exploring Emotional Authorship in AI-Assisted Design

Faculty Mentor

Travis Masingale

Presentation Type

Poster

Start Date

4-14-2026 11:30 AM

End Date

4-14-2026 1:30 PM

Location

PUB NCR

Primary Discipline of Presentation

Design

Abstract

This project explores how artificial intelligence influences emotional expression and visual decision-making in design, and where the boundaries between human intuition and machine generation begin to emerge. Through a series of visual case studies and a music perception experiment, the research investigates how emotional intention shifts when humans collaborate with AI tools. The visual experiments compare human-designed compositions with AI reinterpretations to observe how AI alters, reshapes, or amplifies emotional tone through aesthetic elements such as color, gradients, lighting, and visual atmosphere. By prompting AI systems to reinterpret human-created visuals, the project examines how machine-generated variations influence design decisions and emotional perception. In addition to the visual exploration, a music listening test asks participants to distinguish AI-generated songs from human-created ones, probing how audiences interpret emotional authenticity and whether listeners perceive differences in emotional depth between human and AI-generated media. Together, these experiments examine emotional authorship across three perspectives: human intention, machine interpretation, and audience perception. The findings suggest that meaning in AI-assisted design does not belong solely to the designer or the machine, but instead emerges through an iterative collaboration among human creativity, algorithmic generation, and the individuals experiencing the work.
