Out of about 1,100 victims registered with EFRI, more than 620 (about 57%) told us that they were lured into investment scams by fake or misleading ads on Facebook. This correlates with findings published by the British Action Fraud at the end of May 2021: over a 12-month period, 5,039 reports of investment fraud referenced a social media platform, with 44.7 percent of such reports relating to cryptocurrency scams. Instagram was referenced in 35.2 percent of reports, followed by Facebook, which was mentioned in 18.4 percent of investment fraud cases.
With the number of victims of online investment fraud rising exorbitantly in recent years, the advertising revenue of the big social media platforms and online marketplaces has exploded. Facebook in particular more than doubled its worldwide advertising revenue, from USD 39 billion in 2017 to more than USD 84 billion in 2020.
As a matter of fact, social media enables fraudsters to reach large numbers of people with minimal effort. It is easy for fraudsters to make their messages look credible and often hard for consumers to tell the difference between fact and fiction. Fraudsters typically present professional and credible-looking online adverts, emails, and websites advertising fake investment opportunities in cryptocurrency, foreign exchange trading, or bonds. Often, fake testimonials are accompanied by a picture of a well-known personality.
Consumer associations around the world keep uncovering illegal activities online and find that the extent of illegal content and fraudulent ads on social media platforms is overwhelming.
According to whistleblower and former Facebook employee Frances Haugen, testifying during a US Senate subcommittee hearing on October 5, 2021 and before the European Parliament's Committee on the Internal Market and Consumer Protection on November 8, 2021, Facebook definitely puts profits before people's safety, thereby posing a threat to democracy and society.
Social media platforms certainly offer numerous benefits, but they evidently also present even larger challenges for consumer protection and safety. As a matter of fact, victims are in a very weak position when they want to hold platforms accountable for supporting scammers who rip off millions of European consumers.
Civil Liability for fraud on online marketplaces
The question of whether social media companies can be held civilly liable for negligently failing to prevent fraud depends on the facts of a given case. An investigation by BuzzFeed found that social media companies very often keep the money earned from investment scam ads.
For many years now, US legislation (Section 230 of the Communications Decency Act, part of the 1996 Telecommunications Act) and European legislation (Directive 2000/31/EC, the "e-Commerce Directive") have fostered platforms by shielding "interactive computer services" from legal liability for what users post on them. One of the reasons for these laws was to protect and grow the nascent internet and world wide web – a goal that was reached, with digitalization now shaping our daily lives.
The coronavirus crisis has shown the importance of digital technologies in all aspects of modern life. It has also clearly shown the dependency of our economy and society on digital services and highlighted both the benefits and the risks stemming from the current framework for the functioning of digital services.
The "exemption of liability" law for platforms resulted in gigantic tech companies with enormous gains on the one side, and an increasing mass of disinformation, hate speech, fake news, illegal content, and harmed consumers on the other.
It is important that platforms are held accountable for the consequences of actions they have taken (or failed to take). In addition, it is important that the moderation measures they undertake are proven effective. Today, when consumer organizations notify platforms of illegal activities online, the response they often get is that the platforms voluntarily put filters and human resources in place but did not catch these specific instances – in some cases even after the same types of listings have been flagged repeatedly, year after year.
Very often the platforms argue that it is hard for them to identify "bad actors", as those are eager to bypass the rules and vetting processes. This excuse is just not good enough, given that the platforms at the same time earn quite a lot of money from those bad actors. Any action taken by platforms must seek to protect consumers effectively while minimizing the risks for other fundamental rights, freedoms, and principles under the EU Charter of Fundamental Rights.
Recognizing the rise in online investment fraud, the establishment of subsidiary liability, particularly for online marketplaces, is needed as an incentive to ensure that the scale of illegal activities on their sites and apps is severely reduced. It is simply not appropriate for social media platforms to profit from illegal activities on their services.
What must be done?
So far, self- and co-regulatory initiatives that have tried to address illegal content issues have failed to effectively protect consumers and to achieve a level playing field between businesses that try to respect the law and those that neglect it. In fact, failure to comply with voluntary commitments does not lead to legal consequences such as sanctions or consumer redress. This reality is slowly but surely being recognized by stakeholders as well.
Stakeholders also broadly agree on the need to upgrade the framework considering today’s challenges by establishing clear and strong obligations for service providers, harmonized across the EU.
Consumers must have effective (and proportionate) remedies – including repair, replacement, price reduction, contract termination or reimbursement of the price paid – as well as compensation for material and immaterial damages arising from illegal content on social media websites.
More specifically, the ultimate solution for reducing investment fraud is, in our opinion, clear and strong regulatory obligations and requirements for social media companies to properly vet content (in particular investment ads) before publishing it, while also holding them liable to consumers for losses resulting from any failure to do so.
With the Digital Services Act, Europe has a historic chance to set the framework for the future!
With the rules governing the provision of digital services in the EU having remained largely unchanged since the adoption of the e-Commerce Directive in 2000, the European Commission tabled, on 15 December 2020, a new legislative proposal for a Digital Services Act (DSA) to update the current EU legal framework governing digital services. The proposal for a Digital Services Act (COM(2020) 825 final) is supposed to define the online legal framework for years to come.
This draft law aims to create a safer digital space in which users' rights are protected. It includes rules to tackle illegal content online, enhance the accountability and transparency of algorithms, and deal with content moderation and targeted advertising, while keeping a content-neutral approach in addressing the systemic risks and harms of the big platforms' business model.
Right now, the Internal Market and Consumer Protection Committee is discussing how the DSA proposal should be amended and improved.
It is a one-time chance to demand a powerful and clear accountability framework for online platforms regarding action against illegal activities and illegal content such as investment scam ads, as well as rules on how to ensure transparency and, above all, how to safeguard users' and consumers' rights. Frances Haugen also warned in her testimony before the EU authorities that the Digital Services Act has to be "strong and the enforcement firm" – "otherwise, we will lose this once-in-a-generation opportunity to align the future of technology and democracy."
Regarding the advertising issue, the Digital Services Act proposal provides a series of due diligence obligations for online platforms. It mandates information requirements for targeted advertising on online platforms, and additional transparency obligations on very large online platforms with regard to ad repositories and recommendation systems. It also imposes on very large online platforms the duty to carry out a risk assessment and provide mitigation measures to safeguard the rights and freedoms of their users. Finally, it grants the European Commission investigation powers to monitor and request information on data handling and algorithmic practices. The European Parliament is considering amendments concerning recommender systems, in particular requiring consent for any profiling by such systems, strengthening data subjects' rights to access and delete their profiles and to obtain information about the use of such profiles, and prohibiting misleading and manipulative algorithmic practices.
But so far, the draft Digital Services Act fails to find clear words on the civil liability of social media platforms in case they fail to meet their defined obligations. In contrast to recital 28 of the proposed Digital Services Act, we request clear and strict liability obligations for illegal content, as well as the imposition of general monitoring obligations or active fact-finding obligations on online marketplaces with regard to illegal advertising content.
Facebook whistleblower Frances Haugen challenged the European Parliament by telling them that "the Digital Services Act has the potential to be a global gold standard. It can inspire other countries, including my own, to pursue new rules that would safeguard our democracies." With this in mind, we identified the following necessary amendments to the draft Act, to ensure that the web will actually become a safer place with the Digital Services Act.
Specifically, our request for changes to the draft Digital Services Act includes the following amendments:
Amendment of Article 1: Consumer protection must be defined as an explicit legal objective in Article 1.2, building on recital
Amendment of Article 5: In general, the DSA should establish that consumers can exercise against the intermediary service provider all the rights and remedies that would be available against the trader, including compensation for damages, repair, replacement, price reduction, contract termination, or reimbursement of the price paid. In addition, specific remedies for consumers shall be foreseen in case the intermediary service provider is in breach of its own obligations listed in this Regulation.
Specifically, an amendment of Article 5.3 is necessary, determining that online marketplaces and traders can be held jointly and severally liable
- if they do not properly fulfil their due diligence obligations,
- if they fail to act upon obtaining credible evidence of illegal activities,
- if they provide misleading information, guarantees, or statements.
The benchmark of an "average and reasonably well-informed consumer" limitation must be deleted, as it is not appropriate and would not ensure effective consumer protection.
Amendments of Articles 7 and 22, as well as recitals 28 and 50, are necessary to require online marketplaces to undertake spot checks on trader accounts and on the products and services they facilitate offering.
Amendment of Article 21: a duty to promptly inform law enforcement or judicial authorities is needed not only when the life or safety of individuals is threatened under criminal law, but also when online platforms become aware of other illegal activities, such as fraudulent and scam ads or the sale of illegal products online.
Amendment of Article 22: Article 22.1 needs to clarify that online platforms shall only allow legitimate traders on their platforms; in the UK, for example, it has been proposed that only FCA-registered companies may offer investment services. Platforms covered under the scope must verify that a third-country trader has a European branch or representative, in line with existing legislation, or is registered with an appropriate authority (e.g., market surveillance).
Amendment of Article 22.2: it must be ensured that platforms conduct regular and diligent checks on traders' legitimacy and on the information traders provide, as soon as they receive it. Relying on self-certification by the trader will not be enough.
Amendment of Article 22: if platforms fail to meet the obligations under Article 22, they must be liable towards consumers for their non-compliance with this DSA obligation.
Article 26.1 should contain a specific risk assessment for online marketplaces. Tackling systemic risks that online marketplaces incur when hosting offers selling unsafe products or non-compliant services should be of paramount importance.
Article 26.2 should specify that the risk assessment must also include any potential infringement of consumer rights by businesses active on the platforms and by the platforms themselves, including consumer manipulation and the unfair subversion or impairment of consumers' autonomy, decision-making, or choice.
Article 27 should also include mitigation measures for online marketplaces, including random checks on the products and services they facilitate offering or promoting.
Amendment of Article 40, as the country-of-origin jurisdiction principle does not lead to efficient enforcement of consumer and NGO complaints. In specific situations (criminal and civil cases brought by consumers and NGOs), authorities of destination countries should be allowed to take action against larger platforms.
If you share our request, please sign our petition here.
In the United Kingdom, Charles Randell, ex-chair of the FCA, told the Cambridge International Symposium on Economic Crime on September 6, 2021 that "Google has committed to stop promoting advertisements for financial products unless an FCA-authorized firm has cleared them. Google is doing the right thing … We now need other online platforms – Facebook, Microsoft, Twitter, TikTok – to do the right thing too. And we think that a permanent and consistent solution requires legislation."
Mr. Randell welcomed the government's proposals to include a limited number of financial harms in its draft Online Safety Bill, such as where a fraudster messages a victim online. However, he noted that "paid-for advertising, the main source of online investment scams, is still not covered – we consider it should be."