Instagram, Facebook: Meta is introducing changes to managing large accounts.


SAN FRANCISCO: Meta said it would modify its heavily criticized special handling of celebrities, politicians and other large-audience users on Instagram and Facebook, taking steps to prevent moderation decisions from being driven by business interests.

The tech giant has pledged to implement, in whole or in part, most of the 32 changes to its "cross-check" program recommended by the independent oversight board, which Meta funds and which serves as a kind of high court for content and policy decisions.

"This will result in significant changes to the way the system operates," Meta president of global affairs Nick Clegg said in a blog post.

“These measures will improve the system to make it more efficient, responsive and fair.”

However, Meta has declined to publicly label which accounts receive preferential treatment when it comes to content moderation decisions, nor has it created a formal, open process for joining the program.

Meta argued that labeling users in the cross-check program could expose them to abuse.

The changes came in response to a December report from the oversight panel, which called on Meta to overhaul the cross-check system, saying the program gave special treatment to rule-breaking posts from certain users and prioritized business interests over human rights.

“We found that the program appears to be structured directly to address business concerns,” the panel said in a report at the time.

"By providing additional protection to a select few users, chosen largely on the basis of business interests, cross-check allows content that would otherwise be removed quickly to remain up for a longer period, potentially causing harm."

Meta told the board that the program is intended to avoid content removal errors by providing an additional layer of human review of posts by high-profile users that initially appear to violate the rules, the report said.

“We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or outside pressure,” Meta said in its response to the oversight board.

“While we recognize that business considerations will always be inherent to the overall thrust of our activities, we will continue to improve safeguards and processes to prevent bias and error in all of our review pathways and decision-making structures.”
