Consumer law reform

Mandatory processes for scams, harmful apps and fake news

The ACCC recommends digital platforms be required to implement processes to prevent and remove scams, harmful apps and fake reviews on the platforms’ services and considers such measures should apply, at a minimum, to:

  • search, social media, online private messaging, app store, online retail marketplace and digital advertising services, in respect of scams;
  • app stores in respect of harmful apps; and
  • search, social media, app stores, online retail marketplace and digital advertising services, in respect of fake reviews.

Harms observed by the ACCC

The ACCC has found that providers of digital platform services of all sizes have engaged in conduct that has resulted in a wide range of harms to consumers, including through increased exposure to:

  • online scams, including phishing, fraudulent webpages and fake social media accounts for businesses or individuals;
  • harmful apps, including apps that target children but contain age-inappropriate content, or apps that make fraudulent representations to consumers for the benefit of the app developer, despite app store operators' app-review processes;
  • dark patterns, including choice frameworks that confuse consumers, make choices more difficult or manipulate the choices consumers make;
  • fake reviews, including where reviews are written to be deliberately misleading, where they are purchased or incentivised without the knowledge of consumers, or where negative reviews are moderated and withheld from consumers;
  • inadequate identity verification, often linked to the posting of fake reviews; and
  • online tracking, which has led to a reduction in consumer privacy and data security.

Digital platforms, cybercriminals and other dishonest actors can exploit user vulnerabilities, the power imbalances that exist between users and digital platforms, and poor protections against such conduct. While scams occur throughout the economy, some digital platform services have characteristics that make them an attractive mode of contact for scammers: in particular, the ability to reach many victims at low cost and the ability to target consumers based on specific vulnerabilities.

Inadequate processes observed by the ACCC

The ACCC is particularly concerned about its findings that:

  • platforms have at times failed to remove scams, harmful apps and fake reviews when notified;
  • scammers continue to proliferate fraudulent pages on digital platforms;
  • digital platforms continue to host insufficiently vetted ads that direct consumers to investment scams;
  • many platforms do not inform consumers about whether they have measures in place to check or verify the legitimacy of reviews and, if so, what those measures are; and
  • digital platforms’ voluntary transparency reports do not allow consumer advocacy groups or regulators to effectively evaluate their consumer protection strategies or provide sufficient accountability to users.

ACCC's recommendations

The ACCC considers these harms and inadequate processes warrant new digital platform-specific regulation to reduce scams, harmful apps and fake reviews. It recommends digital platforms be required to implement mandatory processes to prevent and remove scams, harmful apps, and fake reviews on the platforms’ services, including:

  • A notice-and-action mechanism with the following elements:
    • Notice: platforms must adopt user-friendly mechanisms for users to report scams, harmful apps, or suspected review manipulation;
    • Action: platforms must promptly respond by, for example, removing the reported content or explaining why it is permitted to remain;
    • Communication: platforms must promptly notify the reporting person and potentially affected consumers of processes and actions undertaken;
    • Information sharing: platforms must promptly share information about identified issues with other platforms and relevant agencies to aid consumer protection efforts; and
    • Redress: platforms should be required to provide redress to users who have been harmed by the platform failing to meet its obligations under these measures.
  • Verification of certain business users, including advertisers, app developers and merchants. For example, a digital platform that hosts ads should be required to obtain identifying documentation and business details from prospective advertisers, and take steps to verify these documents, before hosting paid promotions.
  • Additional verification of advertisers of financial services and products (including crypto-assets). At a minimum, this should require platforms to check that a prospective advertiser of financial products and services holds an appropriate licence from the Australian Securities and Investments Commission (ASIC).
  • Improved review verification disclosures where platforms display reviews and ratings of products or services, including information about the steps the digital platform takes to help ensure reviews are legitimate, or a clear disclosure that no such steps have been taken.
  • Public and comprehensive reporting on mitigation efforts, with the regulator given powers to specify the mandatory information to be included in public reports and to request that certain detailed information be provided confidentially.

All firms providing the relevant platform services in Australia would be required to meet this minimum standard. At a minimum, these mandatory processes would apply to:

  • Search, social media, online private messaging, app store, online retail marketplace and digital advertising services, in respect of scams;
  • App stores in respect of harmful apps; and
  • Search, social media, app stores, online retail marketplace and digital advertising services, in respect of fake reviews.

New obligations for digital platforms would apply in addition to the existing (and any future) general provisions of the ACL and should be designed to address specific issues that are not efficiently and effectively addressed under existing economy-wide legislation.

Potential implications

  • If implemented, the mandatory processes and public reporting obligations are likely to be significant for firms providing the relevant platform services in Australia. However, the ACCC considers that the compliance burden of the obligations would be proportionate to a platform's size, and that the additional measures should build on digital platforms' existing processes to establish a common minimum standard of protection for consumers and business users across different platforms.
  • Combined with the information-sharing element of the notice-and-action recommendation, the identity verification recommendation is intended to reduce the capacity of unscrupulous actors to proliferate scams or harmful apps across platforms.
  • The identity verification recommendation is also intended to reduce scammers’ capacity to appropriate the identities of public figures to mislead consumers.
  • Verification of the legitimacy of financial advertisers is intended to reduce the proliferation of investment scams on digital platforms.