Initially, major social networks such as Facebook and YouTube were praised for their responses to the pandemic, but they are now coming under increasing pressure. On Thursday, more than 100 international doctors and virologists, including Melanie Brinkmann of TU Braunschweig and Christian Drosten of the Charité, warned that misinformation on social media could endanger human lives. Widespread and often trivializing false reports, they said, had already led people with severe symptoms to avoid hospitals and, in some cases, to die.
Facebook and the video platform YouTube, which belongs to the Google group, point to a whole series of new, stricter guidelines for content about the virus. Both companies also acknowledge problems: at both, significantly fewer employees are currently available to review content.
Because of the pandemic, the companies sent their "moderators", most of whom work for external service providers, home to work remotely; at Facebook Germany, several hundred employees of the contractors Majorel and CCC were affected. For legal reasons, they can perform only part of their duties from home.
As a result, both Facebook and YouTube are increasingly relying on machine detection of problematic content, which is error-prone. "Because not all the people who normally review content can currently work at full capacity, we prioritize certain content and increasingly use automated systems," a Facebook spokeswoman told SPIEGEL. Permanent employees are also increasingly being deployed as reviewers. "Despite these measures, there may be longer response times and more errors in the enforcement of our rules."
Despite the bottleneck, Facebook says it applied warning labels to around 40 million pieces of content in March alone and removed "hundreds of thousands of pieces of misinformation related to Covid-19 that could cause immediate harm." Because of the staffing problems, the network has also suspended the option of appealing against deletion decisions. A return to normal conditions is not yet in sight, the network said.
Both Facebook and YouTube are also increasingly linking to official sources and prominently embedding Covid-19 information panels. The video platform deletes posts that deny the existence of the coronavirus or advise people against seeking treatment. Posts claiming that 5G mobile technology contributes to the spread of the virus are also blocked.
Recently, medical misinformation has increasingly mixed with calls to resist the pandemic-fighting strategies of the federal and state governments; among other things, people in Facebook groups and YouTube videos have called for mass demonstrations.
According to YouTube, however, the high view counts of some conspiracy theorists are not driven by its own recommendation algorithms. Instead, users search specifically for their names and channels. For the creators, even questionable coronavirus videos can apparently continue to pay off: there is no blanket decision to exclude them from monetization through advertising.
The company nevertheless emphasizes that its prioritization of reliable content is having an effect: in the first quarter, the worldwide watch time of reliable news content increased by 75 percent, and tens of thousands of problematic videos were deleted. Like Facebook, YouTube is relying more heavily on automated systems; machine-learning models are now increasingly used to delete problematic content directly, whereas until now they were primarily used to flag such content and forward it to human moderators.