A UK report includes 45 recommendations to combat misinformation.

John Lamb / Getty

The global coronavirus pandemic has shaken governments, the tech industry, and citizens, not only through the devastating effects of the virus but also through the flood of misinformation that has come with it. How best to combat the spread of false information is a topic of global debate, especially in terms of how much responsibility falls on the tech platforms that host it.

In the UK, the House of Lords Committee on Democracy and Digital Technologies released a report on Monday with 45 recommendations for the UK government to take action against the "pandemic" of misinformation and disinformation. If the threat is not taken seriously, it says, democracy will be undermined and rendered "irrelevant."


During the outbreak, the risks of misinformation and disinformation have taken on new urgency as conspiracy theories have thrived on online platforms. The worst of these have directly endangered human health by promoting dangerous false remedies or discouraging people from taking precautions against the virus. Across Europe, they have also caused damage to telecommunications infrastructure after COVID-19 was incorrectly linked to 5G.

The report examines how false information has spread during the virus outbreak and warns that misinformation is a crisis "whose roots go much deeper and are likely to last much longer than COVID-19."

"We are experiencing a time when trust is breaking down," said David Puttnam, chairman of the committee, in a statement. "People no longer believe that they can rely on the information they receive or believe what they are told. It is absolutely corrosive to democracy."

Key recommendations include requiring large platforms, particularly Google and Facebook, to account for the "black box" algorithms that control what content is displayed to users. Companies that deny their decisions in designing and training those algorithms have caused harm are "simply wrong," the report said.


Companies should be required to audit their algorithms to show what steps they are taking to prevent discrimination, the report said. It also proposes greater transparency from digital platforms about content decisions, so that people have a clear understanding of the rules of online debate.

Facebook and Google did not immediately respond to requests for comment.

The prescription: the Online Harms Bill

One of the report's key recommendations is that the UK government immediately publish its draft Online Harms Bill. The bill would regulate digital platforms like Google and Facebook, hold them accountable for harmful content, and punish them for failing to meet their obligations.

Progress on the draft law has been slow. A white paper was published in May 2019, the government's initial response followed in February of this year, and the full response, due to be published in the summer, was postponed until the end of the year.

The government was unable to confirm to the committee whether it would submit a bill to parliament by the end of 2021. As a result, the bill might not enter into force until the end of 2023 or even 2024, the report says. During a briefing before the report's release, Lord Puttnam described the delay as "inexcusable."

"The challenges are moving faster than the government and the gap is widening," he said. "Far from catching up, we're actually slipping back."

The report details how Ofcom, the designated online harms regulator, should be given the legal power to hold companies accountable. It should be able to fine digital companies up to 4 percent of their global turnover or force ISPs to block serial offenders.

Online platforms are "not inherently ungovernable," the report said, urging the government "not to flinch in the face of the inevitable and powerful lobbying of big tech."

The report specifically examines the recent case in which Twitter decided to hide some of President Donald Trump's tweets that violated its guidelines, and criticizes Facebook's decision not to follow suit. Lord Puttnam said Twitter CEO Jack Dorsey had "made Facebook look seriously wrong."

That story isn't over yet, he added, but he was optimistic that Twitter's decision to take action against the president when he violated the platform's rules could have an impact.

"There is a feeling that these big companies are watching each other, and if one makes a sensible shift in a sensible direction, the others feel very constrained and under a lot of pressure to do the same," he said.

Across Europe and beyond, numerous efforts have been made to put pressure on big tech, not only to fight misinformation but also to make the companies pay more taxes and change their practices through antitrust decisions and data protection regulations. The success of these efforts to date has been mixed, but Lord Puttnam and other committee members ultimately expressed optimism that the tech industry would change for the better.

If the government, which now has two months to respond to the report, accepts the committee's recommendations, the committee believes technology could help democracy and restore public confidence instead of undermining them.
