Regulation in the public interest: Disruptive technologies in peace and security

Research strand coordinator: Dr Berenice Boutin

This research strand addresses regulation to safeguard and promote public interests. It focuses, in particular, on the development of the international regulatory framework for the military and security applications of disruptive technologies, and on the arms race in conventional and non-conventional weapons. The public interests of peace and security serve as the prime conceptual framework of the strand.

Research themes
Research in this strand addresses disruptive technologies in peace and security (DTPS), defined as technological developments that can have disruptive implications for international security and international law. These include military artificial intelligence (AI), data-driven warfare, biochemical weapons, and conventional weapons or dual-use technologies with disruptive potential (e.g. small arms, drones, cyber-surveillance).

Two main lines of enquiry guide research within this strand. On the one hand, we ask how legal norms and ethical values can shape technologies; on the other hand, we analyse how technologies challenge those very norms and values.

1) Identifying and promoting alignment of DTPS with shared public values through international law and ethics. The first line of research focuses on mapping relevant values and principles in law and ethics, identifying possible conflicts of values, reflecting on the balance of public and private interests, and exploring the interface of legal principles, legally embedded values, ethical values, and public interests. For instance, in the DILEMA project, research seeks to identify and safeguard fundamental values (e.g. human dignity, human agency, accountability), and to translate values and principles into requirements and processes for military AI in order to promote alignment. Recognising the intersubjective and heterogeneous nature of values, a critical questioning of the very notions of ‘values’ and ‘shared public values’ is an integral part of this line of research. It explores how public interests and values are constructed and justified in the international discourse on the regulation of technology, and how notions of public interest relate to expressions of public values.

2) Exploring the adaptability, limits, and transformation of existing international law in the face of change and novelty, with DTPS as a case study. The second line of research explores how international law addresses change, in particular technological change that is disruptive to international security. Recurring questions include whether new laws are needed for new technologies, how existing laws are interpreted in new contexts, how actors do or do not reach agreement on both substance and process, how international law evolves through these interpretative exercises, and how new norms are developed. This line of research analyses how technological developments affect the processes and concepts of international law, and asks to what extent, and in what sense, technologies impact public interests.

Research in the strand is supported and structured by a number of research projects and initiatives, including the DILEMA project (Designing International Law and Ethics into Military Artificial Intelligence), the I2RAMP project (Implementing International Responsibility for AI in Military Practice), the ELSA Lab Defence Project (Ethical, Legal and Societal Aspects of AI in Defence), and the International Arms Control Law Hub.

Individual research topics in the strand include:

  • The meaning of human agency and human judgement
  • The role of such notions in enabling compliance with international law
  • The modelling and embedding of legal norms in technologies
  • Questions of accountability and responsibility
  • Implementation in practice of frameworks for the responsible use of technologies, from both a technical and a policy perspective