UK Lords report: Facebook and Google's algorithms should be audited to curb the spread of fake news

According to media reports, the COVID-19 pandemic has shocked governments, the technology industry, and citizens alike, not only because of the virus's devastating effects but also because of the flood of misinformation that has accompanied it. How best to deal with the spread of false information has therefore become the subject of a global debate, particularly over how much responsibility should fall on the technology platforms through which it spreads.


On Monday local time, the House of Lords Committee on Democracy and Digital Technologies released a report containing 45 recommendations to help the UK government take action against widespread misinformation and disinformation. The report warns that if this threat is not taken seriously, democracy risks being undermined and rendered “insignificant”.

During the outbreak, the threat of disinformation became more urgent as conspiracy theories spread across online platforms. The most serious of these promoted dangerous “treatments” or discouraged people from taking preventive measures, directly endangering their health. Across Europe, false claims linking COVID-19 to 5G also led to damage to telecommunications infrastructure.

The report examines how false information spread during the outbreak and warns that disinformation is a crisis that is “deeper-rooted” and could last far beyond the COVID-19 pandemic.

“We are living in an era when trust is collapsing,” David Puttnam, chair of the Committee on Democracy and Digital Technologies, said in a statement. “People no longer believe they can trust the information they receive or believe what they hear. This is absolutely corrosive of democracy.”

In its recommendations, the committee calls for large web platforms, especially Google and Facebook, to be held responsible for the “black box” algorithms that decide which content users see. The report says it is simply wrong for the companies to deny that the decisions they make in shaping and training those algorithms cause harm.

The report also says the companies’ algorithms should be subject to authorized audits that reveal what they actually do. In addition, it recommends that digital platforms be more transparent in their content decisions so that users clearly understand the rules of online debate.

Neither Facebook nor Google immediately responded to requests for comment.