Publication
Open Access

Ensuring the exercise of human agency in AI-based military systems: concerns across the lifecycle

Files
Ensuring_exercise_2025.pdf (1 MB)
Full-text in Open Access, Published version
License
Attribution 4.0 International
ISSN
1388-1957; 1572-8439
Citation
Ethics and information technology, 2025, Vol. 27, No. 50, OnlineFirst
BODE, Ingvild, NADIBAIDZE, Anna, WATTS, Tom, ZHANG, Qiaochu, Ensuring the exercise of human agency in AI-based military systems: concerns across the lifecycle, Ethics and information technology, 2025, Vol. 27, No. 50, OnlineFirst - https://hdl.handle.net/1814/93924
Abstract
The question of governing military applications of artificial intelligence technologies (AIT) has gained in salience in the early 2020s (Bode et al., 2023; Garcia, 2023). This is due to a combination of reported advances in AIT and the simultaneous growing use of these technologies in conflicts such as the latest Israel-Hamas war (2023-) and the Russia-Ukraine war (2022-). International initiatives, such as the Group of Governmental Experts on lethal autonomous weapons systems (GGE on LAWS) at the United Nations (UN), the Responsible AI in the Military Domain (REAIM) Summits, or the 2023 US Political Declaration on Responsible Military Use of AI and Autonomy (US Declaration), have put forward overlapping, non-binding lists of principles to guide and influence military practices in this space.
Additional Information
Published online: 06 October 2025
Sponsorship and Funder Information
This research is part of the AutoNorms project which has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No. 852123).