
Today, the European Commission published the reports and its analysis of the progress made in April 2019 by Facebook, Google and Twitter in the fight against disinformation.

The three online platforms are signatories to the Code of Practice against disinformation and have committed to report monthly on measures taken ahead of the European Parliament elections in May 2019.

Vice-President for the Digital Single Market Andrus Ansip, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King, and Commissioner for the Digital Economy and Society Mariya Gabriel said in a joint statement:

“We recognise the continued progress made by Facebook, Google and Twitter on their commitments to increase transparency and protect the integrity of the upcoming elections.

“We welcome the robust measures that all three platforms have taken against manipulative behaviour on their services, including coordinated disinformation operations. They have also provided data on measures to improve the scrutiny of ad placements. However, more needs to be done to strengthen the integrity of their services, including advertising services.

“Moreover, the data provided still lacks the level of detail necessary to allow for an independent and accurate assessment of how the platforms’ policies have actually contributed to reducing the spread of disinformation in the EU.

“All three signatories have now created publicly accessible political ad libraries and enabled searches through APIs, which is a clear improvement. We regret, however, that Google and Twitter were not able to develop and implement policies for the identification and public disclosure of issue-based ads, which can be sources of divisive public debate during elections and are hence prone to disinformation.

“Looking beyond the European elections, all signatories should now step up their efforts to broaden cooperation with fact checkers in all Member States as well as to empower users and the research community. In particular, online platforms need to put into practice their broader set of commitments under the Code of Practice, notably by engaging with traditional media to develop transparency and trustworthiness indicators for information sources so that users are offered a fair choice of relevant, verified information.

“We also call upon platforms to cooperate more closely with the research community to identify and access relevant datasets, which would enable better detection and analysis of disinformation campaigns, sound monitoring of the Code’s implementation and impact, and independent oversight of the functioning of algorithms, for the benefit of all citizens.

Finally, following statements from Microsoft that it also plans to subscribe to the Code, we encourage even wider take-up of the Code amongst the online platforms as well as advertisers and ad-network operators, so that the Code can reach its full potential.”

Main outcomes of the reports:

1. Google

Google reported on additional measures taken to improve scrutiny of ad placements in the EU, including a breakdown per Member State. It noted the availability of the EU Transparency Report on political advertising and its searchable ad library, including the possibility to use Google Cloud’s BigQuery application programming interface to run customised queries. Google reported on its ongoing efforts to provide transparency around issue-based advertising, but announced that a solution would not be in place before the European elections. Global data was again provided on the removal of a significant number of YouTube channels for violation of its policies on spam, deceptive practices and scams, and impersonation.
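The searchable ad library mentioned above can be queried programmatically through BigQuery. The following is a minimal sketch, not taken from Google’s report: it assumes the public dataset `bigquery-public-data.google_political_ads` with an `advertiser_stats` table and columns such as `advertiser_name`, `regions` and `spend_eur`; these names should be verified against the current schema before use.

```python
# Hedged sketch: run a customised query against Google's public political-ads
# dataset in BigQuery using the official Python client. Dataset, table and
# column names below are assumptions -- check the live schema before relying
# on them.
from google.cloud import bigquery


def top_eu_political_advertisers(limit: int = 10):
    client = bigquery.Client()  # uses application-default credentials
    sql = f"""
        SELECT advertiser_name, regions, spend_eur
        FROM `bigquery-public-data.google_political_ads.advertiser_stats`
        WHERE regions LIKE '%EU%'
        ORDER BY spend_eur DESC
        LIMIT {limit}
    """
    # client.query() submits the job; iterating it waits for and yields rows.
    for row in client.query(sql):
        print(row.advertiser_name, row.regions, row.spend_eur)


if __name__ == "__main__":
    top_eu_political_advertisers()
```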

2. Facebook

Facebook reported on measures taken in the EU against ads that violated its policies by containing low quality, disruptive, misleading or false content or by trying to circumvent its systems. It started enforcing its policy on political and issue-based advertising in mid-April and began removing non-compliant ads from Facebook and Instagram. The April report also provided information on the opening of its elections operation centre in Dublin, which involves specialists covering all EU Member States and languages. Facebook reported taking down a coordinated inauthentic behaviour network originating in Russia and focused on Ukraine. The report did not state whether this network also affected users in the EU. Furthermore, Facebook reported on new access for researchers to its CrowdTangle application programming interface and its URLs Data Set.
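As an illustration of the researcher access referred to above, the sketch below shows how the CrowdTangle API could be called from Python. The endpoint, parameter names and response shape (`api.crowdtangle.com/posts`, a `token` query parameter, posts nested under `result.posts`) follow CrowdTangle’s public documentation at the time and are assumptions here, not details taken from Facebook’s report.

```python
# Hedged sketch of pulling recent posts via the CrowdTangle API.
# Endpoint and parameter names are assumptions based on CrowdTangle's
# public documentation; an API token is issued per research account.
import os

import requests

API_TOKEN = os.environ["CROWDTANGLE_TOKEN"]  # hypothetical env var for the token


def recent_posts(count: int = 20):
    resp = requests.get(
        "https://api.crowdtangle.com/posts",
        params={"token": API_TOKEN, "count": count},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumes posts are returned under result.posts in the JSON payload.
    return resp.json().get("result", {}).get("posts", [])


if __name__ == "__main__":
    for post in recent_posts():
        print(post.get("date"), post.get("link"))
```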

3. Twitter

Twitter reported on ads rejected for not complying with its policies on unacceptable business practices and quality ads. It provided information on ads not served because of an incomplete certification process, which is obligatory for political campaign advertisers. Twitter reported on a new election integrity policy prohibiting specific categories of manipulative behaviour and content, such as misleading information about how to participate in the elections and voter intimidation. Twitter provided figures on measures against spam and fake accounts, but did not provide further insight into these measures, for example how they relate to activity in the EU.

Next steps

Today’s reports cover measures taken by online companies in April 2019. They allow the Commission to verify that effective policies to ensure the integrity of electoral processes are in place before the European elections in May 2019.

The monthly reporting agreed under the Code of Practice lasts until the European elections. The last set of reports by the platforms will be published in June. As agreed in March, EU leaders will come back to the issue of disinformation at the June European Council. The Commission’s assessment will feed into these discussions.

By the end of 2019, the Commission will carry out a comprehensive assessment of the Code’s initial 12-month period. Should the results prove unsatisfactory, the Commission may propose further measures, including measures of a regulatory nature.

Background

The Code of Practice has been translated into all official EU languages, which will facilitate implementation at national level, make it more accessible to all citizens and further increase its uptake.

The monthly reporting cycle builds on the Code of Practice, and is part of the Action Plan against disinformation that the European Union adopted last December to build up capabilities and strengthen cooperation between Member States and EU institutions to proactively address the threats posed by disinformation.

The reporting signatories committed to the Code of Practice in October 2018 on a voluntary basis. The Code aims to reach the objectives set out by the Commission’s Communication presented in April 2018 by setting a wide range of commitments:

  • Disrupt advertising revenue for accounts and websites misrepresenting information and provide advertisers with adequate safety tools and information about websites purveying disinformation.
  • Enable public disclosure of political advertising and make efforts towards disclosing issue-based advertising.
  • Have a clear and publicly available policy on identity and online bots and take measures to close fake accounts.
  • Offer information and tools to help people make informed decisions, and facilitate access to diverse perspectives about topics of public interest, while giving prominence to reliable sources.
  • Provide researchers with privacy-compliant access to data to track and better understand the spread and impact of disinformation.

Ahead of the European elections in May 2019, the Commission is monitoring the progress of the platforms towards meeting the commitments that are most relevant and urgent ahead of the election campaign: scrutiny of ad placements; political and issue-based advertising; and integrity of services. Such monitoring is conducted in cooperation with the European Regulators Group for Audiovisual Media Services (ERGA).

The Code of Practice goes hand-in-hand with the Recommendation included in the election package announced by President Juncker in the 2018 State of the Union Address to ensure free, fair and secure European Parliament elections. The measures include greater transparency in online political advertisements and the possibility to impose sanctions for the illegal use of personal data to deliberately influence the outcome of the European elections.

Member States were also advised to set up a national election cooperation network of relevant authorities – such as electoral, cybersecurity, data protection and law enforcement authorities – and to appoint a contact point to participate in a European-level election cooperation network. The first meeting at the European level took place on 21 January 2019, the second one on 27 February, and the third on 4 April.

The Code of Practice also complements a number of measures to support media literacy as well as the creation of a network of independent fact-checkers and researchers. In particular, the Commission is funding the project Social Observatory for Disinformation and Social Media Analysis (SOMA), which provides a collaborative platform for independent European fact-checkers.

The ultimate objective of all these measures is to strengthen the ability of independent civil society organisations and users to flag disinformation threats and to cooperate with the platforms on better detection and analysis of disinformation patterns and trends.
