On 29 January, the European Commission published the first reports submitted by signatories of the Code of Practice on Disinformation, which was signed in October 2018.
What do the published reports cover?
As part of the Action Plan against disinformation that the European Commission and the High Representative presented last December, the online platforms and the advertising sector that have signed the Code of Practice on Disinformation have been required to provide reports with up-to-date and complete information on the actions they took in 2018 to comply with their commitments. These reports were published today on the Commission's website, in parallel with the conference on “Countering Online Disinformation” organised by the Commission in Brussels.
The Code of Practice is a set of industry self-regulatory commitments to fight disinformation on a voluntary basis. Signatories of the Code have committed to taking precise, measurable and concrete measures to fight online disinformation.
How will the Code of Practice contribute to tackling disinformation before the European elections in May?
By implementing the commitments set out in the Code, the signatories will increase transparency for European citizens about political advertising and will limit techniques such as the malicious use of bots and fake accounts.
Through the Code, the signatories have committed to help counter mass online disinformation campaigns intended to polarise public opinion or sow distrust in the European institutions, especially in relation to national elections in Member States and the European Parliament elections.
Can you describe the online companies’ first concrete actions?
The four online companies that are signatories of the Code of Practice have taken or are taking measures to help them meet their commitments. However, complying with all the commitments in the Code of Practice still requires significant efforts.
Work is more advanced and comprehensive in some areas, for instance the takedown of fake accounts, the transparency of political ads and the demonetisation of some purveyors of disinformation, all of which are of key concern for elections across the EU. However, more significant progress is required in other key parts of the Code, such as the transparency of issue-based ads and operational cooperation with fact-checkers and the research community. The availability of consumer empowerment tools also appears to be limited to a number of Member States.
For instance, Facebook is implementing measures to make political advertising more transparent, to take action against fake accounts and malicious automated systems, to provide users with contextual information, tools and support that empower them in their online experience, and to encourage research into disinformation. Facebook's political ads transparency system will be available across the European Union in advance of the EU elections. In its report, the social network also provides insights into a number of tools that help consumers make decisions when they encounter online news that may be false, or that make it easier to find diverse information. However, some consumer empowerment tools, such as the context button or cooperation with fact-checkers, are not yet available throughout the EU, and more clarity on deployment plans across the EU would be welcome. Actions to support research also appear to be of limited scope, and the implementation of research activities has not yet started.
Google is making progress with regard to the scrutiny of ad placements through its AdSense network, the fight against fake accounts and impostor websites, and the transparency of political ads, which will be rolled out in advance of the May 2019 European elections and will include an Election Ads Transparency Report. Google has also taken steps to provide users with information, tools and support to empower them in their online experience: its ranking algorithms prioritise relevant, authentic and authoritative information in search results, and the Fact Check label is available to users in all 28 EU Member States. However, other tools that may help improve users' online experience, such as Breaking News and Top News, are available only in a small number of Member States, and more clarity about future deployment plans is needed. Google's actions to support research are developing, including through its participation in the Trust Project and the Credibility Coalition, but they are still of limited scope and should be broadened to a wider research community.
Twitter has prioritised new measures designed to act against malicious actors harnessing the vulnerabilities of its services, in particular the closure of fake or suspicious accounts and automated systems/bots used to spam or increase the distribution of inauthentic content and disinformation. The social network provides some illustrative data, which suggest significant progress on this front. On the other hand, its report does not sufficiently discuss how its advertising policies restrict persistent purveyors of disinformation from promoting their tweets and achieving prominence in user timelines.
Mozilla is about to launch an upgraded version of its browser to block cross-site tracking by default. This will limit the information revealed about users’ browsing activity, which may be harnessed in support of disinformation campaigns. The upgraded browser will be available across the EU. The Firefox EU Election Promo will promote transparency of political advertising. Its availability across the EU will depend in part on Mozilla’s ability to localise features by Member State. It would be useful for Mozilla to provide further details on the features, availability and timing for the rollout of the Firefox EU Election Promo.
In conclusion, the Commission expects the companies to put in place a more systematic approach that will allow compliance with their commitments under the Code to be properly monitored and assessed on the basis of appropriate performance data. The companies have committed to continue working on these issues. The Commission will closely follow their progress and expects them to deliver substantially before the European Parliament elections.
What about the progress made by the advertising sector?
The Commission welcomes the efforts of the trade associations representing the advertising sector to raise awareness about the Code and promote its uptake among their respective memberships. The Commission notes that four national associations have now subscribed to the Code.
The Commission notes, however, the absence of corporate signatories and stresses the important role brands and advertisers play in the efforts to demonetise purveyors of disinformation. The Commission therefore expects brands and advertisers to step up and commit to the effort to counter disinformation, in particular ahead of the 2019 EU elections. The Commission will remain in contact with the trade association signatories, which will provide aggregated reporting in September 2019.
Can you provide examples of what will actually change with the implementation of the Code of Practice?
With this year's European elections approaching, online political advertising distributed through social media should be clearly marked as such and should be distinguishable from other types of sponsored content on social networks. The entity that has paid for the advertisement should also be identified.
There should also be a reduction in the number of fake websites, i.e. sites designed to look like those of a particular media outlet or legitimate political candidate and intended to promote disinformation.
The Code should also contribute to a reduction of fake accounts that manipulate public opinion by spreading and amplifying disinformation.
Consumers should also be able to easily identify and report information they receive as disinformation, and platforms will take action to reduce the visibility and dissemination of this content.
Will the Code of Practice be enough?
Disinformation is a very complex issue that requires a comprehensive and inclusive approach. A single solution cannot address all challenges related to disinformation.
The Code is just one element of the toolbox proposed by the Commission on 26 April 2018 in its Communication on tackling online disinformation.
The Communication includes other relevant actions, such as the creation of an independent European network of fact-checkers and academic researchers, the use of new technological tools to detect, report and counter false information, and measures to support quality journalism. The aim is also to empower citizens, notably by promoting initiatives on media literacy such as the Media Literacy Week starting on 18 March and by making sure that Member States also promote similar programmes.
As part of the Action Plan presented by the Commission and the High Representative in December 2018, the Strategic Communication Task Forces and the EU Hybrid Fusion Cell in the European External Action Service (EEAS), as well as EU delegations in the neighbourhood countries, are being reinforced with additional specialised staff and data analysis tools. The EEAS' strategic communication budget for addressing disinformation and raising awareness about its adverse impact has increased significantly, from EUR 1.9 million in 2018 to EUR 5 million in 2019. EU Member States should complement these measures by reinforcing their own means to deal with disinformation and by participating in the Rapid Alert System that is being set up (more details below).
How does this Code relate to the measures linked with the European elections announced by President Juncker in his State of the Union speech in September 2018?
The Recommendation included in the election package is addressed to Member States and European and national political parties, foundations and campaign organisations.
The Code of Practice sets out self-regulatory commitments for online companies and the advertising industry to fight disinformation on a voluntary basis.
The Code of Practice and the Recommendation go hand-in-hand. European and national political parties would be required to make available on their websites the same sort of information on political advertising that platforms have committed to making available for online political ads distributed over their services.
What is the European election cooperation network and who is a member?
The European election cooperation network brings together national election authorities, audiovisual media regulators, cybersecurity and data protection authorities, as well as relevant expert groups, for example on media literacy. The network was convened for the first time in January 2019. The outcome of the work of the Rapid Alert System will be shared with the European election cooperation network, in order to exchange information on threats relevant to elections and support the possible application of sanctions.
What does the Code of Practice set out as next steps?
Today's reports are just a first step. Ahead of the European Parliament elections, online platforms and the advertising sector have committed to providing complete information on a monthly basis about how they are implementing the commitments to which they subscribed in the Code of Practice, including by replying to the Commission's specific requests starting in January 2019. This information will be made public.
In addition, the Code of Practice requires that signatories provide a full report after twelve months. These reports should include complete data and information to enable a thorough assessment by the Commission. On this basis, the Commission, assisted by independent expertise and with the help of the European Regulators Group for Audiovisual Media Services (ERGA), will assess the overall effectiveness of the Code of Practice. The Commission may also seek the assistance of the European Audiovisual Observatory as well as leading international researchers and experts.
What is the Rapid Alert System and how will it work?
As part of the Action Plan against disinformation presented by the European Commission and the High Representative in December 2018, the Rapid Alert System will be a hub for Member States, EU institutions and partners to share information on ongoing disinformation campaigns and to coordinate their responses. The Rapid Alert System embodies the European approach, in that its purpose is to protect fundamental freedoms and open, democratic debate.
The system will be based on open-source and unclassified information only. As the Rapid Alert System should be set up by March 2019, Member States are currently working urgently to designate national contact points, map their capacities and draw up collective workflows.
What is the role of the European network of fact-checkers and researchers in tackling online disinformation, and when will it be launched?
The role of fact-checkers is essential in tackling disinformation. Their work contributes to making the information ecosystem more robust by verifying and assessing the veracity of content on the basis of facts and evidence. The Commission's aim is to facilitate cooperation among European fact-checkers through the creation of a dedicated network.
The network will gather fact-checkers operating on the basis of high standards and will be editorially independent.
The Commission supports the Social Observatory for Disinformation and Social Media Analysis (SOMA) project with EUR 1 million; the project started its work in November 2018. It is developing a platform, operational since 1 November 2018, that will facilitate cooperation amongst fact-checkers in view of the European elections.
In March, the project will organise a meeting with European fact-checkers to foster cooperation ahead of the European elections.
The Commission will also provide additional funding for the platform (EUR 2.5 million under the Connecting Europe Facility, CEF), which, building on the experience gained with SOMA, will scale up the joint work between fact-checkers and researchers and provide additional tools for fact-checking and network analysis.
This digital service infrastructure should scale up the collaboration between fact-checkers and academic researchers in order to ensure full coverage of the Union territory and facilitate the build-up and interconnection of relevant national organisations.
Meanwhile, the Horizon 2020 support action SOMA is providing a platform to create a multidisciplinary community, including fact-checkers and academic researchers, in order to enhance detection and analytical capabilities and to better understand the various types of disinformation threats.
What is the Commission doing to support media?
The Commission supports quality news media and journalism as an essential element of a democratic society. As confirmed in the progress report of December 2018, the Commission wants to enhance the transparency and predictability of State aid rules for the media sector; it has also launched a call worth about EUR 1.9 million for the production and dissemination of quality news content.
Building on initiatives of the European Parliament, the Commission co-funds independent projects in the field of media freedom and pluralism. These projects, among other actions, monitor risks to media pluralism across Europe, map violations of media freedom, fund cross-border investigative journalism and support journalists under threat. New calls for projects are expected in the coming weeks.
To support quality journalism, media freedom, media literacy and media pluralism, the Commission has proposed a dedicated budget of EUR 61 million in the 2021-2027 Creative Europe programme.
In addition, in its proposal for the Horizon Europe programme (2021-2027), the Commission has foreseen funding for the development of new tools to combat online disinformation, for better understanding the role of journalistic standards and user-generated content, and for supporting next-generation internet applications and services, including immersive and trustworthy media, social media and social networking. So far, around EUR 40 million has been invested in EU projects in this area.
Source: European Commission