Review Used By UK to Limit Gender-Affirming Care Uses Images of AI-Generated Kids

One image depicts a nonbinary child with short pink hair, consistent with how generative AI tends to represent queer people. In another image, a child’s fingers blend together.
A screenshot of the AI-generated nonbinary child in the Cass Review.

A major review condemning the U.K. National Health Service’s gender-affirming treatment for minors uses AI-generated images of kids. The review, released Tuesday, cited the “weak evidence” and “poor quality” of studies on hormone treatments for children, findings that led NHS England to pause all gender-affirming care for people under 18. 

The so-called “Cass Review” is peppered with pictures of schoolchildren and students, roughly half of which appear to be AI-generated. Most notably, an image at the end of the “Service model” section of the report, which delineates the 32 recommendations the review makes, features a nonbinary child with a bleached, light pink quiff haircut. This is consistent with how generative AI tends to represent queer people: namely, white people with short, textured purple or pink hair. 

A screenshot of the Cass Review's AI-generated nonbinary child.

The report’s cover image of a child doing a wall sit in jeans also appears to be AI-generated, as is evident from the child’s hands: two pairs of fingers appear to be merged, and one thumb cuts into the other. 

A screenshot of the Review's cover image.

Dr. Hilary Cass’s team, who conducted the review, told 404 Media in an email that, “The report uses Adobe stock images—some real and some AI. In selecting images the Review was conscious of the sensitive and contentious nature of the subject matter and made effort not to use images that could identify any individuals.”

Reverse-image searching the AI-generated child with pink hair leads to this AI-generated image on Adobe Stock, titled “Non binary teen in school hallway with kids in the background. Generative AI.” The image is part of a “series” on Adobe Stock containing 35 images. In every image in the series where someone has pink or purple hair, that person is labeled “nonbinary,” and in some cases “young nonbinary teen girl.” The same set of AI images also includes group shots in which all of the teens are labeled either as “nonbinary” or, separately, as “happy teenage students.” These images imagine a world in which every nonbinary person has some variation of the exact same haircut and hair color, and in which they exclusively hang out with, and pose for pictures with, other nonbinary students. The AI-generated cis students in the series, meanwhile, only hang out with other cis students. 

The review does not appear to acknowledge these AI-generated images in any way. There are no references to “AI,” “artificial intelligence,” or “generate” in the context of images, nor are there any references to Midjourney, Stable Diffusion, DALL-E, or any other common AI image generator. 

When asked for comment, the NHS, which commissioned the report on gender-affirming healthcare in the U.K., directed 404 Media to contact Dr. Cass’s team. 

AI-generated images have recently begun appearing in scientific reports. A particularly memorable study on stem cell signaling pathways featured a giant AI-generated rat penis and was quickly retracted after the image was shared online. Last year, Amnesty International used bizarre AI-generated images to depict violence and police brutality in Colombia. 

A study published in December in the Findings of the Association for Computational Linguistics found that Stable Diffusion’s depiction of a “person” was by default a “light-skinned” man. “People of nonbinary gender are farthest from this baseline depiction of ‘person,’” the study states. “The stereotypical depiction of personhood within Stable Diffusion outputs corresponds closely to Western, light-skinned men and threatens to erase from media depictions historically marginalized groups such as people of nonbinary gender and Indigenous people, among others.” 

Sourojit Ghosh, a PhD candidate at the University of Washington and the lead researcher on the project, told 404 Media in an email that this erasure of nonbinary people carries significant potential for harm. “It contributes to the historic trend of nonbinary identities being erased from mainstream representation, or nonbinary people facing oppression and/or violence simply for being nonbinary,” Ghosh said.

“I think that for AI to depict stereotypical images of what it means to ‘look trans/nonbinary’ has the potential to cause real harms upon real people,” Ghosh continued. “Especially for young people, who might be seeing such images more and more in their daily media diets, this can create an unhealthy impression that there is a ‘correct’ way to present oneself as trans/nonbinary.” 

Earlier this month, WIRED noted that generative AI has a track record of representing queer and trans people as a collage of stereotypes. It is not clear why the Cass team used AI-generated images in this report, which, again, the NHS has used as evidence to stop providing gender-affirming care to trans kids.
