What has the nipple done to become such a threat? It clearly unsettles social networks, spaces that remain prudish even as they claim to evolve. Naked bodies are a central ingredient in the history of art, yet today's vast online galleries have turned them into a taboo, at least some of the time. The stigma falls hardest on visual artists, whose works are censored by the algorithm's rulebook. Nuria Oliver, a respected AI expert and MIT-trained researcher, calls the phenomenon algorithmic censorship of art. It harms creators for a simple practical reason: if a work is absent from a given social network, it effectively does not exist for most people, since those networks are now the primary gateways to culture and information.
“All platforms use algorithms to moderate content, and only in rare cases do humans oversee the process,” explains Oliver. As a researcher with the European Laboratory for Learning and Intelligent Systems (ELLIS), she and her colleagues work to improve these systems. “They focus on the product. Ask any artist what art is, and they will tell you it is a message, the intention, the context, and the process that gives rise to something deeper—not merely the artwork as pixels. Algorithms do not grasp all of that; they only see the pixels,” Oliver notes. She also points out that there are whitelists that make platforms far more permissive toward accounts tied to certain economic interests: Kim Kardashian, for instance, can post things that others cannot.
Shadow banning
In other words, AI judges human art, and the filter through which millions of images pass on each platform boils down to a simple question: Is this art or pornography? The answer, according to the researcher, depends on the arbitrariness of the algorithms. Nude figures have existed in art since the Venus of Willendorf and are essential to its history, yet all too often the nude, especially the female form, is censored. This has real consequences for creators, whose portraits may reach vast audiences or be suppressed entirely. A particularly troubling mechanism is shadow banning, where content remains visible but is hardly shown to anyone, leaving the artist believing the material simply isn’t popular.
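The mechanic described above can be sketched in a few lines. This is a hypothetical illustration, not any platform's actual logic (real ranking systems are opaque, and the demotion factor here is invented): a shadow-banned post is never removed, so the artist still sees it live on their own profile, but its feed-ranking score is multiplied down to near zero, so recommendation surfaces almost never distribute it.

```python
def feed_score(engagement: float, shadow_banned: bool) -> float:
    """Rank a post for the recommendation feed (illustrative only)."""
    # The 0.01 demotion factor is an assumption for illustration;
    # platforms do not disclose how (or whether) they demote content.
    demotion = 0.01 if shadow_banned else 1.0
    return engagement * demotion

def visible_on_profile(shadow_banned: bool) -> bool:
    # Crucially, the post is never deleted, so from the author's side
    # everything looks normal: it simply seems unpopular.
    return True

normal = feed_score(engagement=500.0, shadow_banned=False)
banned = feed_score(engagement=500.0, shadow_banned=True)
print(normal, banned)            # the banned post ranks ~100x lower
print(visible_on_profile(True))  # yet it still appears on the profile
```

The asymmetry is the point: because nothing visibly changes for the author, the suppression is nearly impossible for an individual artist to detect or prove.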
Oliver has presented these ideas at a recent conference on intellectual property and the cultural industries in the era of generative AI, hosted by a major authors’ society. Her work aims to make the algorithms more responsive to artistic content so they do not mislabel it as pornographic. How does this work in practice? By expanding the evaluation beyond the image to include contextual textual information and other cues. Early results show meaningful improvements in sensitivity to artistic context.
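The approach described above, judging the image together with its textual context rather than the pixels alone, can be sketched roughly as follows. Everything here is an assumption for illustration (the keyword list, the discount weights, and the threshold are invented, and Oliver's actual system is not public): an image-only "nudity" score is discounted when the caption signals an artistic context, making an artistic nude less likely to be mislabeled as pornography.

```python
# Hypothetical vocabulary of art-context terms; a real system would use
# learned text representations, not a keyword list.
ART_CONTEXT_TERMS = {"museum", "oil painting", "sculpture", "life drawing",
                     "venus", "exhibition", "gallery"}

def context_score(caption: str) -> float:
    """Signal in [0, 1]: how strongly the text suggests an art context."""
    caption = caption.lower()
    hits = sum(term in caption for term in ART_CONTEXT_TERMS)
    return min(1.0, hits / 2)

def moderate(image_nudity_score: float, caption: str,
             threshold: float = 0.8) -> str:
    """Discount the image-only score by artistic context, then threshold."""
    adjusted = image_nudity_score * (1.0 - 0.5 * context_score(caption))
    return "blocked" if adjusted >= threshold else "allowed"

# The same image score yields different decisions in different contexts:
print(moderate(0.9, "check this out"))                      # blocked
print(moderate(0.9, "Venus study, oil painting, gallery"))  # allowed
```

The design choice this toy captures is the one Oliver describes: the decision is no longer a function of the pixels alone, so two identical images can be treated differently depending on the context in which they are posted.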
Her qualitative research involved interviewing artists of several nationalities who had faced censorship, to understand how the issue affects their careers and lives. The goal was to identify concrete steps to mitigate the shadowy censorship that shapes culture and creative expression. A qualitative approach was essential because measuring the scale and impact of algorithmic censorship is difficult when platforms remain opaque.
Oliver describes a dual impact on artists: an individual, economic toll from lost opportunities, and a broader social impact on how new generations perceive nudity, gender, and artistic expression. This is not merely a private grievance; it risks shaping cultural norms and educational narratives for years to come.
The discussion underscores a wider truth: platforms moderate content with algorithms that do not always reflect the nuanced nature of art. The work calls for more transparent and context-aware systems, alongside inclusive policies that respect artistic intent and the historical significance of the nude in art. In the end, the goal is to ensure that digital spaces support authentic self-expression without suppressing meaningful cultural dialogue.