Meta Platforms Inc.’s Oversight Board said today that it’s time for Meta to change its adult nudity and sexual content policies.
It’s not the first time the board, which polices Meta’s decisions on moderation, has asked for sweeping changes. This time the board says Meta, the parent company of Facebook and Instagram, needs to be clearer about why nudity is being posted and when it should result in content being taken down. The action comes after Instagram deleted two posts of nonbinary and transgender people showing bare chests.
In both images that Meta decided to act against, the two people show their bare chests with their nipples covered. In the captions, they explain that they want to have “top surgery,” a procedure that flattens the chest. The couple was raising money so that both could undergo the surgery, framing the posts as an appeal to the public to help improve their well-being.
Some users who saw the posts were apparently upset and filed complaints; by that time, Meta’s automated systems had already flagged the posts. After human review, Meta decided the images contained breasts and so violated its Sexual Solicitation Community Standard. The two people appealed to Meta and later took their case to the Oversight Board.
The board said the removal of the posts was not in line with Meta’s Community Standards and not in line with the company’s “values or human rights responsibilities.” The board said that these cases highlight a “fundamental” flaw in Meta’s moderation policies.
“Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy or the publicly available guidance,” the board said in a post today. “This creates confusion for users and moderators and, as Meta has recognized, leads to content being wrongly removed.”
The board added that the posts being flagged by an algorithm isn’t the problem, since the system can’t make such distinctions, but it said there’s no excuse for such discrimination once a human has reviewed the images. The board suggested that Meta update its policies to do a better job of moderating content involving “intersex, non-binary and transgender people.”
The board also said it’s not practical for moderators “to make rapid and subjective assessments of sex and gender.” It acknowledged that there should be rules regarding nudity but said Meta’s rules are poorly defined.
The board recommended changes that would create more transparency about the rules and reduce discrimination against transgender and nonbinary people in the future. On its own website, Meta acknowledged that changes are needed, saying it will “implement the board’s decision once it has finished deliberating.”