Overhaul in handling of big-audience Instagram, Facebook accounts underway
Meta pledges to adopt most of the 32 recommended changes to its 'cross-check' programme made by an independent review board
SAN FRANCISCO: Meta said it will modify its criticised special handling of posts by celebrities, politicians and other big-audience Instagram or Facebook users, taking steps to prevent business interests from swaying decisions.
The tech giant promised to implement in full or in part most of the 32 changes to its "cross-check" programme recommended by an independent review board that it funds as a sort of top court for content or policy decisions.
"This will result in substantial changes to how we operate this system," Meta global affairs president Nick Clegg said in a blog post.
"These actions will improve this system to make it more effective, accountable and equitable."
Meta declined, however, to publicly label which accounts get preferred treatment when it comes to content filtering decisions, nor will it create a formal, open process for joining the programme.
Labelling users in the cross-check programme might make them targets for abuse, Meta reasoned.
The changes came in response to the oversight panel's December call for Meta to overhaul the cross-check system; the panel said the programme appeared to put business interests over human rights by giving special treatment to rule-breaking posts by certain users.
"We found that the programme appears more directly structured to satisfy business concerns," the panel said in a report at the time.
"By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm."
Meta told the board that the programme is intended to avoid content-removal mistakes by providing an additional layer of human review to posts by high-profile users that initially appear to break rules, the report said.
"We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or external pressure," Meta said in its response to the oversight board.
"While we acknowledge that business considerations will always be inherent to the overall thrust of our activities, we will continue to refine guardrails and processes to prevent bias and error in all our review pathways and decision making structures."