Artificial Intelligence in Autonomous Weapon Systems: Legal Accountability and Ethical Challenges
DOI: https://doi.org/10.48112/jestt.v2i1.9

Keywords: Artificial Intelligence, Autonomous Weapons System, International Law, Proliferation

Abstract
Autonomous Weapon Systems (AWS) are reshaping modern warfare, offering enhanced operational efficiency but raising significant legal, ethical, and regulatory concerns. Their capacity to engage targets without human intervention creates an accountability gap that challenges the application of International Humanitarian Law (IHL). Current legal frameworks fail to define meaningful human control, complicating the attribution of responsibility when AWS violate human rights. Ethical challenges, including the dehumanization of warfare, algorithmic bias, and indiscriminate targeting, jeopardize civilian protection. Moreover, the proliferation of AWS amplifies global security risks, particularly given their potential misuse by non-state actors. This paper critically examines these challenges, evaluating current legal frameworks, ethical considerations, and regulatory inconsistencies. It proposes war torts, corporate accountability, transparency measures, and binding international treaties to address governance gaps. Supporting international cooperation and oversight mechanisms is essential to ensuring that AWS comply with IHL and human rights law. This research contributes to the global discourse on autonomous warfare, offering practical policy recommendations for ethical and legal governance.
License
Copyright (c) 2025 Journal of Engineering, Science and Technological Trends

This work is licensed under a Creative Commons Attribution 4.0 International License.