Facebook and Instagram may ‘free the nipple’ – but just for some

Woman covering her breasts with her hand.
Photo credit Getty Images

In a decision published Jan. 17, the Meta Oversight Board, which reviews content rulings for Facebook and Instagram parent company Meta, overturned the company's “original decisions to remove two Instagram posts depicting transgender and non-binary people with bare chests.”

It also recommended that “Meta change its Adult Nudity and Sexual Activity Community Standard so that it is governed by clear criteria that respect international human rights standards.”

These recommendations were made in relation to two cases – one from 2021 and one from last year – regarding posts from an account maintained by a U.S.-based couple who identify as transgender and non-binary.

“Both posts feature images of the couple bare-chested with the nipples covered,” said a summary of the cases. “The image captions discuss transgender healthcare and say that one member of the couple will soon undergo top surgery (gender-affirming surgery to create a flatter chest), which the couple are fundraising to pay for.”

Automated Meta systems triggered a review of the posts for potential violations of community standards. After this review, Meta removed both posts for violating the Sexual Solicitation Community Standard, “seemingly because they contain breasts and a link to a fundraising page.”

However, when the users appealed to Meta and then to the board, Meta found it had removed the posts in error and restored them.

According to The New York Times, there was a similar case in 2018 involving Instagram user Rain Dove, who is gender-nonconforming. In one video Dove posted, they were playing basketball, and in another, they were drinking from a gallon of milk while wearing only boxer briefs.

“Dove threatened to take legal action if Instagram continued to remove their posts, and eventually, the bare-chested images were allowed to stay,” said the outlet.

“No one’s head exploded,” Dove said. “We’re all going to be fine!”

“The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values or human rights responsibilities,” the board said of the more recent cases. “These cases also highlight fundamental issues with Meta’s policies.”

It went on to explain that “Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy, or the publicly available guidance,” which in turn “creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”

Meta’s Standard prohibits images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery.

For years, women have advocated against this policy, and in 2014 the “Free the Nipple” feature film premiered.

“Censoring photography is invalidating it as an art form,” said Joanne Leah, a Brooklyn-based photographer who estimated that she has about one post removed every month, per a 2019 article in The New York Times.

“In defending its policies, the company emphasizes its vast global reach: 2.4 billion monthly users on Facebook, and over 1 billion on Instagram, both in 100 languages,” said the article. It also said that Instagram’s head of public policy, Karina Newton, said in an email that the site isn’t trying to “impose its own value judgment” on nipples.

“We’re trying to reflect the sensitivities of the broad and diverse array of cultures and countries across the world in our policies,” Newton said. She also said that Instagram’s system for censoring nipples was “imperfect” and that “mistakes may be made.”

The board acknowledged this week that Meta’s current policy is “based on a binary view of gender and a distinction between male and female bodies.”

This approach “makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale,” the board added.

Additionally, it said that the “restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people,” and that exceptions to the policy are convoluted and poorly defined.

“In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply,” said the board.

“It’s culturally specific because obviously in other countries there’s less of a fear of the nipple,” said actress and director Olivia Wilde of her support of the Free the Nipple campaign, according to Vogue. “I think that we can all really benefit from making sure that we don’t allow the stigmatization of women’s bodies to infect our own perspective of ourself.”

Actress Florence Pugh was also quoted by the outlet about her choices to free her own nipples on red carpets.

“I’ve never been scared of what’s underneath the fabric,” Pugh said. “If I’m happy in it, then I’m gonna wear it. Of course, I don’t want to offend people, but I think my point is: How can my nipples offend you that much?”

“We have this kind of puritanical perspective on nipples,” Wilde said. “I think it’s really silly.”

Ultimately, the board determined “that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies.”

Going forward, the board said that Meta “should seek to develop and implement policies that address all these concerns.”

This would include a new approach to managing nudity on its platforms with clear criteria to govern the Adult Nudity and Sexual Activity policy. With this new approach, the company would be required to ensure “all users are treated in a manner consistent with human rights standards.”

“It should also examine whether the Adult Nudity and Sexual Activity policy protects against non-consensual image sharing, and whether other policies need to be strengthened in this regard,” said the board.
