“Too often, harmless content gets taken down, or restricted, and too many people get penalized unfairly,” the company's president of global affairs, Nick Clegg, told reporters on Monday, Dec. 2, according to a report in The Verge.
Meta's platforms include Facebook, Instagram, Threads, and WhatsApp.
Despite advancements in AI moderation, Clegg acknowledged there is still “work to do” in ensuring fair and accurate enforcement, the report said.
Meta has promised to refine its moderation tools and policies, but the admission underscores the challenge of managing massive online communities while protecting free expression.