(BRUSSELS) – The EU’s future Digital Services Act can set the global standard for transparency, oversight and enforcement, Facebook whistleblower Frances Haugen told MEPs at a public hearing on Monday.
The special hearing was organised by several committees of the European Parliament to examine the negative impact that big tech companies’ products and business models have on users, and how EU digital rules can address these issues.
Former Facebook employee Ms Haugen said the Digital Services Act (DSA) has the potential to be a “global gold standard” and inspire other countries to “pursue new rules that would safeguard our democracies.”
She warned, however, that rules need to be strong on transparency, oversight and enforcement, otherwise “we will lose this once-in-a-generation opportunity to align the future of technology and democracy”.
Several MEPs voiced concerns about Ms Haugen’s revelations on Facebook’s practices and how they affect users and their fundamental rights, particularly the harm done to children’s and teenagers’ mental health and the use of micro-targeting, including for political purposes.
Questions focused on how to make the platforms more accountable and how to ensure that the risk assessment and risk mitigation provisions in the proposed DSA are strong enough to prevent abuse and polarisation and to address risks to democracy.
Members also asked Ms Haugen for her views on regulating not only illegal but also harmful content, on content moderation tools, and on whether targeted advertising should be banned. They also wanted to know what safeguards she would like to see included in EU digital laws, and whether the package currently on the table was sufficient. Other issues addressed at the hearing included enforcement tools to make sure the DSA has teeth, the transparency of algorithms, and giving academic researchers, NGOs and investigative journalists access to platforms’ data.
In her replies, Ms Haugen emphasised the importance of ensuring that companies like Facebook publicly disclose data and how they collect it (on content ranking, advertising and scoring parameters, for example), to allow people to make informed decisions, and of prohibiting “dark patterns” online. Individuals in these companies, not committees, should personally be held accountable for the decisions they make, she added.
On countering disinformation and demoting harmful content, Ms Haugen stressed that Facebook is substantially less transparent than other platforms and could do much more to make its algorithms safer: setting limits on how many times content can be reshared, expanding its services to support more languages, conducting transparent risk assessments, making platforms more human-scaled, and finding ways for users to moderate each other rather than being moderated by artificial intelligence. She commended lawmakers for their content-neutral approach, but warned against possible loopholes and exemptions for media organisations and trade secrets.
During her presentation, Ms Haugen also stressed how crucial it is for governments to protect tech whistleblowers, as their testimonies will be key to protecting people from harm caused by digital technologies in the future.
The Internal Market and Consumer Protection Committee is currently discussing how the proposal on the Digital Services Act, presented by the European Commission in December 2020, should be amended and improved.
Ms Frances Haugen is a former Facebook employee with a background in computer engineering and, specifically, in algorithmic product management. At Facebook, Ms Haugen worked as Lead Product Manager on the Civic Misinformation team, which looked at election interference around the world and worked on issues related to democracy and misinformation. Facebook dissolved this team after the 2020 U.S. election, and Ms Haugen contacted the Wall Street Journal shortly afterwards.

Ms Haugen disclosed thousands of internal documents that she collected while working for Facebook. Among the most striking findings backed by the leaked documents is that the use of Instagram seriously damages teenagers’ mental health, particularly by fostering eating and body image disorders. More generally, the leaked documents show how Facebook’s public claims on a variety of topics – including, beyond mental health, its work on hate speech and freedom of speech – often contradict its internal research. Overall, Ms Haugen claims that Facebook (which also owns other widely used social media platforms such as Instagram) intentionally does not make these platforms safer for users because doing so would affect its profits.