Research Streams

Stream Leader Monika Zalnieriute

The ‘AI Decision Making and the Law’ research stream explores the relationship between various automation and inference techniques, popularly known as AI, and the increasing technologisation of decision-making in governments, courts, corporations and society.
In this stream, Monika leads several cutting-edge projects with direct relevance and impact on policymaking in Australia and internationally. The first project of this stream is Monika’s Discovery Early Career Award ‘AI Decision-Making, Data Privacy and Discrimination Laws’, funded by the Australian Research Council (2021–2024), which draws on a mixed-methods research design to question the effectiveness of discrimination and data privacy laws in preventing AI discrimination and the further isolation of historically discriminated groups, such as LGBTI people, women and racialised communities.

The stream is also working with researchers at the Law Institute of Lithuania on the research project ‘Government Use of Facial Recognition Technologies: Legal Challenges and Solutions’, funded by the Lithuanian Research Council (2021–2023, $232,450 (EUR 150,000)). Collaborating with project partners the London School of Economics (UK) and the Georgia Institute of Technology (USA), the project will examine the ways governments are deploying automated facial recognition technologies to assist decision-making in Lithuania, the USA, the UK, Russia and Germany.

The AI Decision Making and the Law stream is also working with the Australasian Institute of Judicial Administration (AIJA) and, together with the Hub Director Lyria Bennett Moses and FLIP researchers Michael Legg and Felicity Bell, was awarded an AIJA research grant, ‘AI Decision-Making and the Courts’, to prepare a guide for AIJA members and judges in the Asia-Pacific region, which will set out the key challenges and opportunities that automated decision-making presents for courts and judges.

The AI Decision Making and the Law stream is part of the $71 million ARC Centre of Excellence for ‘Automated Decision-Making and Society’ ($31.8 million awarded by the ARC, $39.2 million committed by the partners), where Monika is an Associate Investigator. Launched in 2020, the Centre will run for seven years to create the knowledge and strategies necessary for responsible, ethical and inclusive automated decision-making.

The Centre combines social and technological disciplines in an international industry, research and civil society network that brings together experts from Australia, Europe, Asia and America. If you are interested in collaborating with AI Decision Making and the Law, please get in touch with the Stream Leader, Monika Zalnieriute.

Co Stream Leads Danielle Hynes and Bronwyn Miller

Data justice is increasingly recognised as an important aspect of the datafication and automation of decision making. Yet the meaning of data justice is often context dependent. Clarifying both how data justice arises as a problem in need of addressing and how it can be defined, measured and delivered requires an in-depth investigation of the operation and consequences of a variety of data analytic tools in different social and organisational settings. While regulation is an important strategy for seeking just outcomes, an appreciation of the dynamics of technological change in specific application settings is crucial to the design of regulation and its effectiveness.

The stream is already engaging with academic staff from ADA, Computer Science and Engineering, Faculty of Medicine, civil society organisations and government organisations.

Stream Leader Kayleen Manwaring

The last three decades have seen substantial development and commercial and consumer use of previously unconventional forms of distributed information technologies, in which sensors and microprocessors with internetworking capabilities are embedded in everyday objects and environments not previously computerised, such as cars, fridges, people and animals. The growth in the use of cyber-physical systems and Internet-enhanced objects has already brought about significant sociotechnical change, and this is unlikely to come to an end any time soon. Cyber-physical systems and connected devices have become essential in industries from manufacturing and healthcare to agriculture and environmental management, to smart homes and cities. This change brings with it some significant benefits for society, particularly in the areas of: assisting those with disabilities to live more independent lives; reducing traffic congestion; improving waste management, urban safety and bushfire control; increasing the availability of remote healthcare and education; and supporting more efficient and sustainable infrastructure, transport, agriculture and industry.

However, while cyber-physical objects and systems may lead to benefits in our daily lives, they also expose individuals and societies to a number of risks, including disclosure of private information; unwanted surveillance of adults and children by the state and by corporate interests; physical injury; harassment and stalking; safety and other defects; exacerbation of existing inequalities by a growing ‘digital divide’; discrimination; and barriers to the right to freedom of expression. Alongside this sociotechnical change has come the potential for ‘regulatory disconnection’, that is, where existing regulatory frameworks become disconnected from societal expectations due to the new things, behaviours and relationships made possible by these technologies.

This stream is designed to investigate some of the potential areas of regulatory disconnection. This has now become a matter of some urgency in Australia and globally, as shown in the concern about legal and regulatory issues displayed in policy documents such as the World Economic Forum State of the Connected World report 2020, and the Chief Scientist-commissioned report by the Australian Council of Learned Academies on the Internet of Things, also published in late 2020.

The stream’s objective is to develop an Australian centre of excellence around legal challenges for cyber-physical systems and connected devices. While these systems share many issues with conventional information technologies, such as data protection, the ‘physical’ element brings additional issues into play. To solve problems of regulatory disconnection, we need to investigate areas of law not traditionally associated with information technology, such as tort, product liability, rights to repair, land law, personal property and insolvency.

Stream Leader: Ross Buckley

Technology creates incredible potential for the development of finance, including (i) disintermediation of traditional methods of delivery of financial services, (ii) lower barriers to entry, (iii) more efficient and affordable financial services, and (iv) delocalisation of financial products. Innovation in financial services is occurring at an increasing pace, and regulators are struggling to keep up.

The technological boom in finance creates both opportunities (eg financial inclusion in developing countries with large numbers of unbanked people, and automation of routine processes such as fraud detection and prevention) and challenges (eg a lack of legal certainty as to how new developments fit within the existing regulatory framework, limited consumer understanding of the new technologies promoting uninformed decision-making, increased cybercrime risks, and the anonymity of transactions with its resulting money laundering implications).

The FinTech Stream seeks to explore the impact of technological disruption on financial markets in Australia and across the globe, by analysing existing challenges, creating a discussion platform to exchange ideas among various stakeholders, and developing workable solutions. The research will examine the legal implications of a variety of technology-driven developments in finance, including (but not limited to):

  • Existing practices and regulation of Initial Coin Offerings (ICOs) – a financing model at the intersection of blockchain and crowdfunding;
  • Application of RegTech (ie use of technology for regulatory monitoring, reporting and compliance) in finance;
  • Evolution of decentralised virtual currencies and emergence of government or central bank backed cryptocurrencies;
  • Funds and financial technology;
  • Emergence of data-driven finance models;
  • Use of artificial intelligence and algorithmic decision-making models in finance.

Stream Leader Bronwen Morgan and Zsofia Korosy

Green technology (“green tech”) is a broad term describing the recent emergence of products that seek to protect and/or repair the environment and foster more sustainable relationships between nature and society. Examples include technologies for renewable energies, marine and terrestrial transport, waste recycling, building construction and management, water, air and soil purification, food production and natural resources conservation. The use of green technologies occurs across a wide range of diverse economic sectors (e.g. agriculture, fishing, energy, transportation, construction, water management, resource extraction, and industrial production), with each sector having differing regulatory environments, capacities, capital and investment needs.

These conditions have accordingly produced a complex array of laws and regulations applicable to green tech, embracing a mix of government intervention, self-regulation, and a range of related laws and policies that create supportive or unsupportive investment environments (e.g. macroeconomic policies, and proposals for green new deals), or impose restrictions on the impacts and development of technologies. There is a need to better understand and map this web of green tech law and policy, examining differences across sectors while also identifying common features and ways in which externalities and social costs can be addressed, allowing for learning, support and facilitation of the burgeoning green tech industry. This will be a focus of the green tech stream’s work.

Key questions will include:

  • What is meant by the term “green tech” across different sectors (e.g. oceans, water, shipping, energy, city planning, conservation)?
  • What are the main legal and regulatory approaches across these sectors? Where and how do they interact?
  • What are common regulatory innovations and blind spots?
  • How can the regulation of green tech be improved to ensure more sustainable and just outcomes for society?

The green tech stream has published ‘Green technology and regulatory settings in Australia: an overview and discussion paper’. The paper considers four green technologies and investigates their potential to deliver carbon abatement and other environmental benefits. These four technologies are:

  • Electric vehicles;
  • Renewable energy storage systems, particularly batteries;
  • Hydrogen; and
  • Waste to energy.

The driving purpose of this discussion paper is to evaluate the regulatory settings related to each technology and initiate essential further discourse. To this end, alongside the achievement of carbon abatement, the paper proposes three principles that should inform any regulatory approach:

  • Energy justice;
  • Just access to space and mobility; and
  • Respect for the totality of ecological limits.

Stream Leader Marc De Leeuw, supported by Research Assistant Simon Taylor

New technologies rapidly blur the distinction between people and things. This stream seeks to understand the changing role for legal and regulatory frameworks in response to shifts in legal personalities driven by 21st century technological, bioscientific and economic developments. In particular, it focuses on the emergence of robotic technologies and the challenges this brings to the question of legal personhood. Artificial intelligence, driverless cars, care robots, and synthetically created life forms increasingly undermine the standard binary of organic and inorganic life, giving rise to questions of legal responsibility, ownership over hybrid entities, and the beginning and end of human or artificial life forms. 

This stream tackles these issues and engages with the empirical and theoretical problems of disruption that cyber-physical systems bring to nation-states and their legal institutions.  

Specifics on the stream's 2021 Robotics Project (a work in progress) can be found on the Hub site.

Stream Leader Bronwen Morgan

This stream aims to continue to support the work of Regen Sydney and to leverage that work to explore key issues at the intersection of law, technology and regenerative economic development. Regen Sydney is a network of organisations and individuals across Greater Sydney exploring a regenerative future and connecting across silos and sectors to reimagine a new economic narrative for the city. It is part of a wider network of ‘regen city’ initiatives that emerged in Melbourne, Sydney, Brisbane and Adelaide during 2021, all exploring ways to connect city-level policymaking and vision with community demand, grass-roots initiatives and professional networks, using the lens of Kate Raworth’s doughnut economics approach and building on the emergent experience of multiple cities globally (especially Amsterdam) sharing their experience through the global Doughnut Economics Action Lab. Law and technology are core to the innovation needed to make effective connections between the data underpinning urban policymaking, professional expertise and new directions in economic development: data-sharing platforms, hybrid legal enterprise structures, social procurement innovation, and possible legal personality for decentralised autonomous organisations set up on blockchain are all elements of future pathways for cooperative and regenerative platform economies.

Specific opportunities for this work to make a difference in the context of Sydney in 2022:

  • New mandate for city council after December 5 election: existing strong relationships with two city councillors highly likely to be re-elected, and relationships with City of Sydney CEO from prior Allens-Hub supported work with the Sydney Commons Lab
  • 2022 will be the year the City of Sydney finalises its Sustainable Sydney 2050 Vision
  • Opportunity to seek co-funding from the Sustainable Development Reform Hub to develop ‘threaded case studies’ on the right to repair (and food) that will connect two Allens Hub projects with two topical issues (the current Productivity Commission inquiry into the right to repair, and the recently concluded UN Food Systems Summit, which will now be implemented at the domestic level – the Sydney Commons Lab work of 2019 also focused on food and can be built upon here)

Stream Leader Matthew Kearnes

In recent years debate concerning the right to repair has emerged as a critical issue at the interface between law, technology and society.

Commencing in 2021, this stream's current and in-progress outputs address the intersections between repair practices in specific contexts and the wider legal and regulatory debate concerning the need to reflect rights to repair in contemporary intellectual property and consumer law.

Current work in this space has focused specifically on debate concerning the right to repair in Australia, contributing to public commentary and consideration of a series of proposals made by the Productivity Commission, with additional detailed work focused particularly on repair practices engaged in the off-grid solar market.

Stream Leader Lisa Burton Crawford

This stream examines the relationship between technologies and the rule of law. In particular, it examines whether the increasing use of technologies by government can enhance or diminish the rule of law and the more specific public law values or principles it entails, such as predictability, transparency, accountability, and the relationship of reciprocity between those who wield public power and those who are subject to it. The stream brings together the expertise of leaders in law and technology with those with expertise in constitutional and administrative law, legislation and statutory interpretation, in order to elucidate the public law implications of this trend.

The use of technology by government is the subject of a burgeoning body of academic literature. A unique contribution of this stream, and its focus in 2022, will be to examine one comparatively new and unexplored development: the potential use of technology to assist in the design of legislation. This includes the idea of ‘rules as code’, and other more specific technologies used to draft, navigate and interpret increasingly complex legislation. While the rules as code movement is a broad church, its central tenet is that legislative rules which are to be implemented by automated systems should be designed with that goal in mind. This has the potential to alter the nature of legislation and destabilise dominant theories of its role. (Stream members will continue to examine broader rule of law issues raised by government use of other technologies as they arise.)
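As a purely illustrative sketch of the rules-as-code idea, consider a hypothetical statutory eligibility test transposed into an executable function. The rule, the thresholds and all names below are invented for illustration and are not drawn from any actual legislation or from the stream's work:

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    """Facts about a person, as the (hypothetical) statute defines them."""
    age: int
    annual_income: float
    is_resident: bool

# Hypothetical rule, written as if transposed from legislative drafting:
# "A person is eligible if they are a resident, aged 18 or over,
#  and their annual income does not exceed $50,000."
MIN_AGE = 18
INCOME_CAP = 50_000.0

def is_eligible(a: Applicant) -> bool:
    """Machine-executable rendering of the hypothetical statutory test."""
    return a.is_resident and a.age >= MIN_AGE and a.annual_income <= INCOME_CAP

print(is_eligible(Applicant(age=30, annual_income=42_000.0, is_resident=True)))  # True
print(is_eligible(Applicant(age=17, annual_income=42_000.0, is_resident=True)))  # False
```

The point of the sketch is the design discipline, not the code itself: if the rule is drafted with automated implementation in mind, every term ("resident", "annual income") must be given a precise, computable definition, which is exactly the shift in the nature of legislation that the stream examines.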

Stream Leaders Susanne Lloyd-Jones and Sophie Vivian 

Trust and trustworthiness have emerged in recent law, policy and technology contexts as powerful explanations for the behaviour of phenomena as diverse as states, individuals, organisations and technology. In these contexts, trust and trustworthiness are being used as theoretical or technical terms without proper explication of the meaning or parameters of the concepts. It is fashionable to talk about ‘loss of trust’ and ‘trust deficits’ in democratic institutions, generated by the erosion of the public sphere, the hollowing out of state functions, privacy violations, and dis- and mis-information on social media and other digital platforms. It is also common to refer to ‘trust deficits’ around the development and deployment of technology, including new technological advances.

Arguably, trust and trustworthiness have become catchwords for other concepts, such as reliability, relevance, consistency, fairness, integrity, cooperation, accountability and transparency.

Citizen and civil society trust narratives often concern accountability, transparency and democratic legitimacy. These narratives may relate to trust in business organisations, governments and the media. Narratives of distrust and untrustworthiness may be fuelled by fear of government overreach or misuse of law enforcement and security agencies’ powers and functions. In consumer narratives, trust and trustworthiness relate to interactions with the products and services customers use or consume, and to whether they experience adverse incidents, such as being scammed or phished, or privacy and cyber-bullying issues in online contexts. In business, trust and trustworthiness are critical elements, especially in critical infrastructure industries and their supply chains. Here the trust narrative is about mutual trust and sharing confidential information, trade secrets and know-how with government and with each other; it concerns the trustworthiness of the links in the supply chain. It has been argued that trust is not an appropriate concept to apply in a business context and that it is reliability, not trust, that is required.

The Stream is designed to interrogate the use of the concepts of trust and trustworthiness in cyber security contexts. Rather than relying on ‘trust’ and ‘trustworthiness’ as grand descriptors for different relationships, contexts and perspectives, the stream seeks to discover whether other concepts, such as cooperation, reliance, reliability, transparency, accountability, effectiveness, or simply a privacy-enhancing technology that satisfies consumer expectations, more accurately define the context or relationship. Explicating the nuance embedded in the concepts of trust and trustworthiness is not just a theoretical exercise: it will enhance the predictive power, that is, the ability to predict outcomes, of the relevant disciplines that routinely use these concepts.

The Stream’s objective is to critically analyse the use of trust and trustworthiness concepts and narratives in a cyber security context. It will explore the implications of their use, especially in contexts where there is potential for exploitation and manipulation, or less monitoring or oversight, because an agency, entity, product, business or platform holds itself out as ‘trustworthy’, or enjoys trust, or responds to scrutiny with ‘trust us’. A trust narrative of this sort may make review and monitoring difficult if there are actors and institutions that believe in the inviolability of their trust status.