Gender and Artificial Intelligence in Justice, Peace and Security
Published 20 June 2023
On 1 June 2023, Klaudia Klonowska took part as a panellist in the event Gender and Artificial Intelligence in Justice, Peace and Security, organized by the International Development Law Organization (IDLO), the Embassy of the Republic of Cyprus in the Netherlands, and the Embassy of Switzerland in the Netherlands. Klaudia represented the DILEMA Project on the panel alongside Dr. Allison Gardner (Co-founder of Women Leading in AI) and Irakli Beridze (Head of UNICRI’s Centre for AI and Robotics), with Dr. Fabricio Guariglia (Director of the IDLO Branch Office in The Hague) as moderator.
The event was organized to discuss the risks of biased AI in the peace and security sectors, such as the military, justice, and law enforcement. The discussion centered specifically on issues of gender bias and aimed to highlight good practices that actors in these sectors can implement to improve AI design and ensure non-discrimination.
Key take-aways from the panel on how to tackle gender bias in the use of AI in the military context:
- Addressing biases in AI requires a crucial first step: examining the gendered impacts arising from historical and contemporary military practices. Failing to acknowledge and rectify these gendered impacts perpetuates disparate effects on women and men in warfare. There is a risk that using (archived) data from these practices to train machine learning algorithms will embed biased practices within the technologies themselves.
- Biases in datasets can also arise from current gaps in data collection concerning women's involvement in conflict, including participation in hostilities, human trafficking, war-related mortality, and sexual and gender-based violence. To mitigate this, it is imperative to support comprehensive data collection during armed conflicts that encompasses both genders.
- Given the predominance of male representation in military institutions, identifying and mitigating gender biases in practice, as well as in the design and development of AI, may be hindered. Moreover, military technologies are predominantly designed around male-centric conditions. To address this challenge, military institutions may consider including gender advisors, whose role would be to raise awareness of gender biases and to promote gender equality in both practice and technology design.