A historic chance to reduce illegal content on social media platforms!

Illegal content on social media platforms


Out of about 1,100 victims registered with EFRI, more than 620 (about 57%) told us that they were lured into investment scams by fake or misleading ads on social media platforms such as Facebook or Instagram. These findings correlate with the British Action Fraud report from the end of May 2021[1], which summarizes that within 12 months, 5,039 reports of investment fraud referenced a social media platform, with 44.7 per cent of those reports relating to cryptocurrency scams. Instagram was referenced in 35.2 per cent of reports, followed by Facebook. Illegal content on social media platforms therefore has to be reduced!

While the number of victims of online investment fraud has risen sharply over the past years, the advertising revenue of the big social media platforms and online marketplaces has exploded over the same period. Facebook alone more than doubled its worldwide advertising revenue, from USD 39 billion in 2017 to more than USD 84 billion in 2020.

Social media enables fraudsters to reach large numbers of people with minimum effort. It’s easy for fraudsters to make their messages look credible and often hard for consumers to tell the difference between fact and fiction. Fraudsters typically present professional and credible-looking online adverts, emails, and websites that advertise fake investment opportunities in cryptocurrency, foreign exchange trading, or bonds. Often, fake testimonials come with pictures of well-known personalities.

Consumer associations globally keep uncovering illegal activities online and find that the extent of illegal content and fraudulent ads on social media platforms is overwhelming.

According to whistleblower and former Facebook employee Frances Haugen, testifying before a US Senate subcommittee on Oct. 5, 2021, and before the European Parliament’s Committee on the Internal Market and Consumer Protection on Nov. 8, 2021, Facebook puts profits before people’s safety, thereby posing a threat to democracy and society.

Social media platforms offer numerous benefits. But social media presents even more significant challenges for consumer protection and safety. Victims are in a weak position when they want to hold platforms accountable for supporting scammers who rip off millions of European consumers.


Civil liability for fraud on online marketplaces

The question of whether social media companies can be held civilly liable for negligently failing to prevent fraud depends on the facts of a given case. What is clear is that they profit from it: an investigation by BuzzFeed found that social media companies very often keep the money earned from investment scam ads.

For many years now, US legislation (Section 230 of the US 1996 Telecommunications Act) and European legislation (Directive 2000/31/EC, the “e-Commerce Directive”) have fostered platforms by shielding “interactive computer services” from legal liability for what users post on them. One of the reasons for these laws was to protect and grow the nascent internet and world wide web – a goal that was reached, with digitalization now shaping our daily lives.

The coronavirus crisis has shown the importance of digital technologies in all aspects of modern life. It has also clearly demonstrated the dependency of our economy and society on digital services and highlighted both the benefits and the risks stemming from the current framework for the functioning of digital services.

This “exemption from liability” for platforms has resulted in gigantic tech companies with enormous gains on the one side and an ever-growing mass of disinformation, hate speech, fake news, illegal content, and harmed consumers on the other.

Platforms must be held accountable for the consequences of actions they have taken (or not taken). In addition, it is essential that the content moderation measures undertaken are proven effective. Today, when consumer organizations notify platforms of illegal activities online, the response they often get is that the platforms have voluntarily put filters and human reviewers in place but still did not catch these specific instances – in some cases even after the same types of listings have been flagged repeatedly, year after year.

Very often, the platforms argue that it is hard for them to identify “bad actors” because those actors are eager to bypass the rules and vetting processes. This excuse is not good enough, as the platforms at the same time earn quite a lot of money from those very bad actors. Any action taken by social media platforms must seek to protect consumers while minimizing the risks for other fundamental rights, freedoms, and principles under the EU Charter of Fundamental Rights.

Recognizing the rise in online investment fraud, subsidiary liability must be established, particularly for online marketplaces, as an incentive to reduce the scale of illegal activities on their sites and apps. It is simply not appropriate for social media platforms to profit from unlawful activities on their services.


Consumers have to be protected.

So far, self- and co-regulatory initiatives that have tried to address illegal content have failed to protect consumers effectively and to achieve a level playing field between businesses that try to respect the law and those that neglect it. Failure to comply with voluntary commitments does not lead to legal consequences such as sanctions or consumer redress. This reality is slowly but surely being recognized by stakeholders, too.

Stakeholders also broadly agree on the need to upgrade the framework considering today’s challenges by establishing clear and robust obligations for service providers harmonized across the EU.

Consumers must have effective (and proportionate) remedies, including repair, replacement, price reduction, contract termination or reimbursement of the price paid, as well as compensation for material and immaterial damages arising from illegal content on social media websites.

The Digital Services Act must outline regulatory obligations and requirements for social media companies to address investment fraud. Properly vetting content (specifically investment ads) before publication must become a legal requirement for social media companies.


Europe has a historic chance to set the framework for the future!


The rules governing the provision of digital services in the EU have remained essentially unchanged since the adoption of the e-Commerce Directive in 2000. On Dec. 15, 2020, the European Commission tabled a new legislative proposal for a Digital Services Act (DSA) to update the current EU legal framework governing digital services. The Digital Services Act proposal (COM(2020) 825 final) will define the online legal framework for years to come.

The proposal aims to create a safer digital space in which users’ rights are protected, including rules to tackle illegal content online, enhance the accountability and transparency of algorithms, and deal with content moderation and targeted advertising. The proposal nevertheless keeps a content-neutral approach in addressing the systemic risks and harms of the big platforms’ business model.

Right now, the Internal Market and Consumer Protection Committee is discussing how the DSA proposal should be amended and improved.

It is a one-time chance to demand a powerful and clear accountability framework for online platforms regarding action against illegal activities and illegal content like investment scam ads, ensuring transparency and, above all, safeguarding users’ and consumers’ rights. In her testimony before the EU authorities, Frances Haugen also warned that the Digital Services Act has to be “strong and the enforcement firm”; “otherwise, we will lose this once-in-a-generation opportunity to align the future of technology and democracy.”

The Digital Services Act proposal provides a series of due diligence obligations for online platforms regarding advertising. It mandates information requirements for targeted advertising and additional transparency obligations for very large online platforms regarding ad repositories and recommender systems. It also imposes on very large online platforms the duty to carry out a risk assessment and to provide mitigation measures to safeguard the rights and freedoms of their users. Finally, it gives the European Commission investigation powers to monitor and request information on data handling and algorithmic practices. The European Parliament is considering amendments concerning recommender systems: requiring consent for any profiling by such systems, strengthening data subjects’ rights to access and delete their profiles and to obtain information about the use of such profiles, and prohibiting misleading and manipulative algorithmic practices.

So far, however, the draft Digital Services Act fails to find clear words on the civil liability of social media platforms when they fail to meet their defined obligations. In contrast to recital 28 of the proposed Digital Services Act, we request clear and strict liability for illegal content as well as the imposition of general monitoring obligations or active fact-finding obligations on online marketplaces with respect to unlawful advertising content.

Facebook whistleblower Frances Haugen challenged the European Parliament by telling them that the Digital Services Act “can be a global gold standard. It can inspire other countries, including my own, to pursue new rules that would safeguard our democracies.” Taking up this challenge, we have identified the following necessary amendments to the drafted Act to help ensure that the web becomes a safer place under the Digital Services Act.


Specifically, our requests for changing the draft Digital Services Act include the following amendments:

Amendment of Article 1: consumer protection must be defined as an explicit legal objective in Article 1.2, building on recital 34.

Amendment of Article 5: In general, the DSA should establish that consumers can exercise against the intermediary service provider all the rights and remedies that would be available against the trader, including compensation for damages, repair, replacement, price reduction, contract termination, or reimbursement of the price paid. In addition, specific remedies for consumers shall be foreseen in case the intermediary service provider is in breach of its obligations listed in this Regulation.

Specifically, an amendment of Article 5.3 is necessary to determine that online marketplaces and traders can be jointly and severally liable

  • if they do not adequately fulfill their due diligence obligations,
  • if they fail to act upon obtaining credible evidence of illegal activities, or
  • if they provide misleading information, guarantees, or statements.

In addition, the benchmark of an “average and reasonably well-informed consumer” must be deleted, as this limitation is not appropriate and would not ensure adequate consumer protection.

Amendments of Articles 7 and 22 and recitals 28 and 50 are necessary to require online marketplaces to undertake spot checks on trader accounts and on the products and services whose offering they facilitate.

Amendment of Article 21: the duty to promptly inform law enforcement or judicial authorities when the life or safety of individuals is threatened under criminal law should be extended to instances where online platforms become aware of other illegal activities such as fraud, scams, and the sale of illicit products online.

Amendment of Article 22: Article 22.1 needs to clarify that online platforms shall only allow legitimate traders on their services; in the UK, for example, it is proposed that only FCA-registered companies may offer investment services. Platforms covered under the scope must verify that a third-country trader has a European branch or representative, in line with existing legislation, or is registered with the appropriate authority (e.g., market surveillance)[1].

Amendment of Article 22.2: platforms need to conduct regular and diligent checks on traders’ legitimacy and on the information traders provide, as soon as they receive it. Relying on self-certification by the trader will not be enough.

Amendment of Article 22: if platforms fail to meet the obligations under Article 22, they should be held liable towards consumers because of their non-compliance with this DSA obligation.

Amendment of Article 26.1: it should contain a specific risk assessment for online marketplaces. Tackling the systemic risks that online marketplaces incur when hosting offers for unsafe products or non-compliant services should be of paramount importance.

Amendment of Article 26.2: the risk assessment must also include any potential infringement of consumer rights by businesses active on the platforms and by the platforms themselves, including consumer manipulation and the unfair subversion or impairment of consumers’ autonomy, decision-making, or choice.

Amendment of Article 27: it should also include mitigation measures for online marketplaces, including random checks on the products and services whose offering or promotion they facilitate.

Amendment of Article 40: the country-of-origin jurisdiction principle does not lead to efficient enforcement of consumer and NGO complaints. In specific situations (criminal and civil cases brought by consumers and NGOs), authorities from destination countries should be allowed to take action against larger platforms.

If you share our request, please sign our petition here.

[1] In the United Kingdom, Charles Randell, then chair of the FCA, told the Cambridge International Symposium on Economic Crime on Sept. 6, 2021: “Google has committed to stop promoting advertisements for financial products unless an FCA-authorized firm has cleared them. Google is doing the right thing. We now need other online platforms – Facebook, Microsoft, Twitter, TikTok – to do the right thing. And we think that a permanent and consistent solution requires legislation.”

Mr Randell welcomed the government’s proposals to include a limited number of financial harms in its draft Online Safety Bill, such as where a fraudster messages a victim online. However, he noted that “paid-for advertising, the main source of online investment scams, is still not covered – we consider it should be.”