Meta says it’s reviewing calls to make adult nudity policies more inclusive

Meta is reviewing a call from its oversight board to broaden its adult nudity policies, a spokesman said, after the tech giant removed two Instagram posts that showed transgender and non-binary people bare-chested.

Neither of the posts violated Meta’s adult nudity policies, and in a statement released earlier this week, the board said it had overturned the company’s decision to remove them.

A Meta spokesman told AFP Thursday that the company welcomed the board’s decision and had already restored the images, agreeing they should not have been removed.

But the board also took the opportunity to urge Meta, which also owns Facebook, to re-examine its broader adult nudity policies.

The current policy “prohibits images of female nipples except in certain circumstances, such as breastfeeding and gender confirmation surgery,” the board wrote in its decision.

These policies, the board continued, “are based on a binary view of gender and a distinction between male and female bodies” and lead to “greater barriers to expression for women, trans and gender non-binary people on its platforms”.

It called on Meta to evaluate its policies “so that all people are treated in accordance with international human rights standards, without discrimination on the basis of sex or gender.”

The Meta spokesperson said the company is reviewing the recommendation, which reflects years of calls from activists.

“We are constantly evaluating our policies to make our platforms safer for everyone,” the spokesman said. “We know more can be done to support the LGBTQ+ community and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”

“We got Meta thinking,” Helle Thorning-Schmidt, a member of the Oversight Board and a former prime minister of Denmark, said in a forum on Instagram Thursday. “It’s interesting to note that the only non-sexualized nipples are those of men or those who have had surgery.”

“Over-moderation of LGBTQ content, and trans and non-binary content in particular, is a serious problem on social media platforms,” a spokesman for advocacy group GLAAD told AFP.

“The fact that Instagram’s AI system and human moderators have repeatedly identified these posts as pornographic and as sexual solicitation indicates serious flaws in both their machine learning systems and moderator training.”

Meta said it would publicly respond to each of the board’s recommendations on the matter by mid-March.

© 2023 AFP
