Facebook said Wednesday it would place “authoritative” coronavirus content at the top of user feeds as it scrambled to keep up with increased usage and stem the flow of misinformation on its platform and its WhatsApp messaging service.
The leading social network said it has nearly doubled server capacity to power WhatsApp as people in isolation place more voice and video calls using the popular messaging service.
Facebook also donated $1 million to the International Fact-Checking Network to expand the presence of local fact-checkers and curb misinformation on WhatsApp, said Facebook head of health Kang-Xing Jin.
“Teams are hard at work to make sure all the services run smoothly, because this is clearly a time when people want to stay connected,” Facebook chief executive Mark Zuckerberg said while updating reporters on the company’s efforts.
“We want to make sure we do our part to alleviate loneliness.”
As part of an effort to be a resource for reliable information about the coronavirus crisis, Facebook is rolling out an information center that will be displayed at the top of news feeds at the social network.
The information hub was built in collaboration with health organizations and will roll out in the US and Europe through Wednesday, with plans to expand it to other locations.
“Our goal is to put authoritative information in front of everyone who uses our services,” Zuckerberg said.
The hub will display content from public health experts, celebrities, academics and others encouraging ways to reduce coronavirus risk — such as by taking social-distancing seriously, according to Zuckerberg.
Facebook has been grappling with the challenge of content moderation by workers, many of them contracted through outside companies, who are now working from home to reduce coronavirus risk.
“This is a big one we have been focused on for the past few days,” Zuckerberg said.
“There are certain kinds of content moderation that are very sensitive — such as suicide and self harm — and if you are working on that content for a long time it can be very emotionally challenging.”
Some content being checked by moderators also comes with privacy concerns.
Facebook is in the process of moving the most sensitive types of content moderation to full-time employees for now, Zuckerberg said.
“I am quite worried the isolation of people staying at home could lead to more depression or mental health issues and I want to make sure we are ahead of that with more people working on suicide and depression prevention, not less,” Zuckerberg said.
“That will mean a trade-off on content that does not represent an imminent physical risk to people.”
Facebook will continue to use artificial intelligence systems to watch for banned content.