AI X Visual Culture — Automating the Gaze

Faculty Mentor

Travis Masingale

Presentation Type

Poster

Start Date

4-14-2026 11:30 AM

End Date

4-14-2026 1:30 PM

Location

PUB NCR

Primary Discipline of Presentation

Design

Abstract

Artificial intelligence has transformed images from representations into infrastructure that sorts, ranks, and circulates bodies at scale. While generative systems appear technologically novel, they are trained on vast archives of existing visual culture, absorbing historical patterns concerning gender, race, desirability, and authority. This project investigates how those cultural patterns become embedded within AI systems and how automated image production reshapes the way identity, credibility, and consent are visually constructed. Over a ten-week research process, the project analyzed interdisciplinary scholarship from media theory, feminist critique, and artificial intelligence ethics to examine how visual hierarchies migrate from cultural imagery into computational systems. This research traces how concepts such as the male gaze, machine vision, and operative images help explain the transformation of representation into data that can be measured, categorized, and reproduced at scale. The project further examines how generative systems amplify existing patterns of desirability through algorithmic repetition, contributing to forms of epistemic harm in which credibility and authority become visually coded. Particular attention is given to the emergence of non-consensual synthetic imagery, where programmable likeness collapses the distinction between image and body, raising questions about consent, agency, and responsibility in AI-mediated visual systems. The outcome of this research is a digital magazine-style editorial essay designed to make these ideas accessible to designers and creative practitioners. By examining how visual culture becomes computational logic, this project argues that responsibility cannot be located only in policy or regulation. Instead, it highlights how design decisions surrounding datasets, system defaults, and generative frameworks shape the cultural patterns that artificial intelligence ultimately scales.
