AI X Visual Culture — Automating the Gaze
Faculty Mentor
Travis Masingale
Presentation Type
Poster
Start Date
4-14-2026 11:30 AM
End Date
4-14-2026 1:30 PM
Location
PUB NCR
Primary Discipline of Presentation
Design
Abstract
Artificial intelligence has transformed images from representations into infrastructure that sorts, ranks, and circulates bodies at scale. While generative systems appear technologically novel, they are trained on vast archives of existing visual culture, absorbing historical patterns concerning gender, race, desirability, and authority. This project investigates how those cultural patterns become embedded within AI systems and how automated image production reshapes the way identity, credibility, and consent are visually constructed. Over the course of a ten-week research process, interdisciplinary scholarship from media theory, feminist critique, and artificial intelligence ethics was analyzed to examine how visual hierarchies migrate from cultural imagery into computational systems. This research traces how concepts such as the male gaze, machine vision, and operative images help explain the transformation of representation into data that can be measured, categorized, and reproduced at scale. The project further examines how generative systems amplify existing patterns of desirability through algorithmic repetition, contributing to forms of epistemic harm in which credibility and authority become visually coded. Particular attention is given to the emergence of non-consensual synthetic imagery, where programmable likeness collapses the distinction between image and body, raising questions about consent, agency, and responsibility in AI-mediated visual systems. The outcome of this research is a digital magazine-style editorial essay designed to make these ideas accessible to designers and creative practitioners. By examining how visual culture becomes computational logic, this project argues that responsibility cannot be located only in policy or regulation. Instead, it highlights how design decisions surrounding datasets, system defaults, and generative frameworks shape the cultural patterns that artificial intelligence ultimately scales.
Recommended Citation
Loree, Alexis, "AI X Visual Culture — Automating the Gaze" (2026). 2026 Symposium. 37.
https://dc.ewu.edu/srcw_2026/ps_2026/p2_2026/37
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.