INSIGHT

Have your say: government opens consultation on ADM and AI regulatory frameworks

By Michael Park, Haddon Chang, Kevin Fan
Data & Privacy | Digital Transformation | Legal Technology | Technology & Outsourcing

Positioning Australia as a leader in digital economy regulation by maximising use of ADM and AI

The Department of the Prime Minister and Cabinet is inviting responses to its recently released issues paper (Issues Paper) to inform future digital economy regulation in the areas of automated decision making (ADM) and artificial intelligence (AI).

Key takeaways

  • As part of its Digital Economy Strategy, the Federal Government is exploring the 'safe and responsible development and deployment of new and emerging technologies' such as ADM and AI. By modernising the relevant legal frameworks, the goal is to increase uptake, investment and consumer confidence.
  • The Issues Paper highlights the pressing issues with regulating ADM and AI, and seeks industry feedback on 10 questions to identify opportunities for improvement.
  • Stakeholders in ADM and AI, or organisations that may be affected by regulatory changes, should consider making a submission to the Digital Technology Taskforce by 22 April 2022. Following consultation, the Federal Government expects to release a discussion paper in the second half of 2022 identifying possible reforms and action.

Background

Last year, the Federal Government released Australia's Digital Economy Strategy, setting out an ambitious vision for Australia to be a top 10 digital economy by 2030. Positioning Australia as a leader in digital economy regulation is key to this vision and will keep the nation at the forefront of emerging technologies such as ADM and AI.

However, rapid development in these technologies poses challenges to existing regulatory approaches. To maximise opportunities and manage risks, the Government needs fit-for-purpose regulation that is agile in responding to technological change. This means finding a balance between overregulation, which risks stifling technological innovation, and a laissez-faire approach that permits unchecked and potentially unethical development of emerging technologies.

In recent years, the Government and various international organisations have conducted numerous reviews of ADM and AI, including the AI Ethics Framework (2019), ADM Better Practice Guide (2019), AI Action Plan (2021), Human Rights and Technology Final Report (2021) and Blueprint for Critical Technologies (2021).

Building on this body of work, the Federal Government released its Issues Paper in March 2022 to consider whether any changes are required to existing legal settings and frameworks addressing ADM and AI.

What issues are raised?

The Issues Paper provides a high-level overview of existing and potential issues in digital economy regulation that apply to ADM and AI, including:

  • Regulatory uncertainty and complexity: Existing regulatory frameworks, not designed to handle the challenges of emerging technologies, can impose complex and uncertain (and at times overlapping and inconsistent) obligations on businesses and governments, which deters innovation and growth.
  • Rapidly evolving international developments: With the EU, US and China all publishing proposed AI regulations in 2021, Australia’s challenge is to keep pace with international regulatory developments. This is crucial not only to ensuring global consistency and interoperability, but also to becoming a leader in digital economy regulation by 2030.
  • Public trust and confidence: Lack of public trust, confidence and understanding is a key barrier to uptake of ADM and AI. Government regulation can build trust and confidence by identifying and addressing emerging risks, clarifying responsible parties, allocating liability and providing clear parameters for implementation and use.
  • Potential for bias and discrimination: ADM and AI algorithms can reflect the biases of their programmers, while the process of gathering data and learning from the environment may also produce social biases.
  • Transparency and explainability: The ability to understand and explain decisions or outcomes from ADM or AI can impact public trust and confidence in these technologies. Particularly in the context of government decision-making, transparency is key to promoting accountability, lawfulness and procedural fairness of administrative decisions.
  • Exercise of discretion: Decision-making under Commonwealth laws and in the private sector will often require discretion and a degree of judgement to weigh up various factors. The challenge of applying discretion through ADM often results in a trade-off between efficiency and fairness.
  • Privacy: ADM and AI can have significant impacts on privacy, as data about individuals is used for inferences and decision-making in a range of contexts.

Where is there room to improve?

The Issues Paper seeks industry feedback on 10 questions and any other areas of opportunity for change and improvement.

Broadly, these opportunities include:

  • clarifying and providing additional guidance on the application of existing regulations;
  • addressing inconsistent or overlapping regulation;
  • ensuring existing and new regulations are technology neutral;
  • identifying existing and emerging risks; and
  • promoting industry and government best practices and implementation.

Have your say

Submissions on the Issues Paper are due by 22 April 2022. Following the consultation period, the Government intends to release a discussion paper outlining areas of possible reform and action in the second half of 2022.

Submissions and discussions on these issues will influence the development of an overarching Digital Age Policy Framework. This is anticipated to provide principles, guidance and best practices for future digital regulation development, with further consultation expected later this year.

We recommend that stakeholders in emerging technologies such as ADM and AI, and organisations that may be affected by regulatory changes, consider the Issues Paper and make submissions on the questions and issues raised.

The 10 questions for industry feedback
  1. What are the most significant regulatory barriers to achieving the potential offered by AI and ADM? How can those barriers be overcome?
  2. Are there specific examples of regulatory overlap or duplication that create a barrier to the adoption of AI or ADM? If so, how could that overlap or duplication be addressed?
  3. What specific regulatory changes could the Commonwealth implement to promote increased adoption of AI and ADM? What are the costs and benefits (in general terms) of any suggested policy change?
  4. Are there specific examples where regulations have limited opportunities to innovate through the adoption of AI or ADM?
  5. Are there opportunities to make regulation more technology neutral, so that it will apply more appropriately to AI, ADM and future changes in technology?
  6. Are there actions that regulators could be taking to facilitate the adoption of AI and ADM?
  7. Is there a need for new regulation or guidance to minimise existing and emerging risks of adopting AI and ADM?
  8. Would increased automation of decision making have adverse implications for vulnerable groups? How could any adverse implications be ameliorated?
  9. Are there specific circumstances in which AI or ADM are not appropriate?
  10. Are there international policy measures, legal frameworks or proposals on AI or ADM that should be considered for adoption in Australia? Is consistency or interoperability with foreign approaches desirable?