A new type of warfare is emerging in Gaza, characterized by the use of advanced artificial intelligence (AI) systems. The SETA report, “Deadly Algorithms: Destructive Role of Artificial Intelligence in Gaza War,” examines Israel’s reliance on AI in military operations and its significant impact on civilian populations. AI tools such as Lavender and Habsora automate critical targeting decisions, often resulting in high civilian casualties and violations of international law. Lavender, for instance, can approve bombing targets in as little as 20 seconds, frequently misidentifying civilians as threats. This automation creates a “responsibility gap,” blurring accountability when civilian deaths occur. Israel’s extensive surveillance systems track Palestinian activity to inform military strikes, raising further human rights concerns. The international community’s silence on these practices is alarming, as existing laws have failed to keep pace with the rapid evolution of warfare technology. The report argues that urgent reforms are needed to hold individuals accountable and to prioritize human judgment over autonomous systems.