New Publication on Human Machine Interaction, Human Agency, and Proportionality

Published 18 December 2023

Taylor Woodcock has published a new article entitled ‘Human/Machine(-Learning) Interactions, Human Agency and the International Humanitarian Law Proportionality Standard’. The piece appears in a Global Society Special Issue on The Algorithmic Turn in Security and Warfare.

Abstract

Developments in machine learning prompt questions about algorithmic decision-support systems (DSS) in warfare. This article explores how the use of these technologies impacts practices of legal reasoning in military targeting. International Humanitarian Law (IHL) requires assessment of the proportionality of attacks, namely whether the expected incidental harm to civilians and civilian objects is excessive compared to the anticipated military advantage. Situating human agency in this practice of legal reasoning, the article considers whether the interaction between commanders (and the teams that support them) and algorithmic DSS for proportionality assessments alters this practice and displaces the exercise of human agency. Because DSS that purport to provide recommendations on proportionality generate output in a manner substantively different from proportionality assessments, these systems are not fit for purpose. Moreover, legal reasoning may be shaped by DSS that provide intelligence information, owing to the limits of reliability, the biases and the opacity characteristic of machine learning.
Read the full article: https://doi.org/10.1080/13600826.2023.2267592