
Recently, the clay effect filter went viral across social media platforms; it transforms a real photo into a clay-figure character using the AI features embedded in photo-editing apps.
So I tried putting one of my favourite female characters from the movie ‘YOLO’ into the app, and the transformation was striking…
I expected the result to look funny, but I never expected it to render her as a completely male figure. I searched online to see whether others had encountered the same problem; maybe it was a technical issue within the app?

> Other users on Xiaohongshu posting about being rendered as male figures.
As I saw more users posting and joking about this problem, something struck me: could this be a hidden bias within the AI system, and have nothing to do with technical issues?
My Thoughts and Analysis
The photo I uploaded to the system shows a strong, determined woman with a muscular build and an intense demeanour, an image that challenges the stereotypical picture of a woman. The AI evidently connected these features to a male character.
AI algorithms are designed and constructed by humans, so this wasn't a technical error; it was a bias of societal values encoded into the technology.
AI Doesn’t Make Decisions In A Vacuum
“Search results reflect the values and norms of the search company’s commercial partners and advertisers and often reflect our lowest and most demeaning beliefs”
Safiya Noble
As mentioned in Algorithms of Oppression, the output of AI is shaped by its training data, its algorithms, and the potential biases of the engineers who develop it. Therefore, no technology is completely objective or neutral; its results can mirror societal norms and inequalities.
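To make this concrete, here is a deliberately tiny, hypothetical sketch of how bias travels from training data into predictions. The feature values, labels, and the "classifier" itself are all invented for illustration; the point is that the code contains no rule about gender at all, yet it still mislabels a muscular, short-haired woman, because every similar example in its (biased) training set happened to be labelled male.

```python
# Toy illustration (all data hypothetical): a 1-nearest-neighbour model
# trained on biased examples reproduces that bias at prediction time.
# Feature vector: (hair_length_cm, muscularity_score).

training_data = [
    ((30, 2), "female"),   # long hair, low muscularity
    ((35, 1), "female"),
    ((5, 8), "male"),      # short hair, high muscularity
    ((3, 9), "male"),
    ((4, 7), "male"),
]

def predict(features):
    """Copy the label of the closest training example (1-nearest-neighbour)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training_data, key=lambda ex: dist(ex[0], features))[1]

# A muscular, short-haired woman sits closest to the "male" examples,
# so the model confidently mislabels her. There is no bug in the code;
# the bias lives entirely in the training data.
print(predict((6, 8)))   # -> "male"
```

Real filters are vastly more complex, but the mechanism is the same: if "strong and short-haired" rarely co-occurs with the label "woman" in the training data, the model learns that absence as a rule.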
The Bigger Picture: Misrepresentation and Erasure
As I researched this topic further, I realised that multiple AI photo-generator tools show the same behaviour: turning a short-haired woman into a man, recognising a back-view photo of a person as male by default, rendering a female character who wears no makeup with makeup and nail polish…

> Clay filter by Remini

> Clay filter by Meitu
I think this reflects a much broader issue: AI systems often misrepresent or erase identities that do not align with mainstream norms.
Noble's critique of search engine algorithms highlights that these systems often reinforce historical and cultural biases, perpetuating the marginalisation of underrepresented groups. In the case of the clay effect filter, the transformation from female to male shows how these systems fail to recognise the wider diversity of femininity.
Why Does It Even Matter?
This is where I want to bring in the theory I mentioned in my previous blog. AI systems are embedded throughout our everyday tools and applications, and tools like these photo-editing apps, along with trends like this clay filter on social media, operate within the 'economy of visibility' (Banet-Weiser, 2018). They are built to cater to popular cultural norms in order to attract wider audiences, and in doing so they perpetuate the invisibility of diverse appearances and expressions of femininity.
What Can We Do Then?
The next time an AI tool gives you a result that feels wrong or uncomfortable, stop and question it, or even report it to the developer. Every conversation about these small glitches helps us build technology that truly works for humans and represents humans.
