Israel's Unit 8200 Uses Machine Learning to Identify Hamas Targets in Gaza

In a recently surfaced video, a senior official from Israel’s cyber intelligence agency, Unit 8200, discussed the use of machine learning to identify Hamas targets in Gaza. The footage raises questions about the Israel Defense Forces’ (IDF) recent statement denying the use of artificial intelligence (AI) to identify terrorists. In the video, the head of data science and AI at Unit 8200, known as “Colonel Yoav”, explains how the unit’s “magic powder” of machine learning was used to find new terrorists during Israel’s offensive in Gaza in May 2021. He described a process that starts from information about known members of a group and applies data science techniques to locate the rest of the group.

This revelation aligns with recent testimonies from members of the IDF about their use of an AI tool called “Lavender” during the offensive against Hamas. According to these accounts, the AI-generated database assisted intelligence officers in identifying tens of thousands of potential human targets in Gaza. The IDF has responded by calling some of the claims “baseless,” but the testimonies are consistent with Colonel Yoav’s remarks at an AI conference last year. The video, which had previously gone unnoticed with fewer than 100 views, was hosted on the YouTube channel of Tel Aviv University.

Colonel Yoav’s presentation offered rare insight into how secretive military and intelligence bodies use opaque AI systems. He explained that Unit 8200 uses AI to predict whether someone is a terrorist by analyzing information about individuals believed to belong to terrorist groups and using that data to find the rest of the group. He highlighted a machine learning technique called “positive unlabelled learning” and emphasized that human intelligence officers provide feedback to improve the algorithm and make the final decisions. Colonel Yoav stressed that the tools are meant to assist human decision-making, not replace it. Using these techniques, Unit 8200 identified over 200 new targets during the May 2021 offensive, a process that previously would have taken almost a year.
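To make the named technique concrete: in positive-unlabelled (PU) learning, a model is trained from a small set of confirmed positives plus a large pool of unlabelled examples, then used to rank the unlabelled pool. The sketch below uses synthetic data and the classic Elkan–Noto score correction, which is one common PU approach; it is only a generic illustration of the idea, not a reconstruction of Unit 8200’s system, and every dataset, feature, and parameter in it is invented for demonstration.

```python
# Minimal, hypothetical sketch of positive-unlabelled (PU) learning on synthetic data.
# This illustrates the general technique only; it reflects nothing about any real system.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic ground truth: y == 1 is the class of interest, y == 0 is everyone else.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

# PU setting: only a fraction of the true positives are labelled (s == 1, "known members");
# the rest of the population is unlabelled (s == 0), a mix of positives and negatives.
label_frac = 0.3
s = np.zeros_like(y)
pos_idx = np.flatnonzero(y == 1)
s[rng.choice(pos_idx, size=int(label_frac * len(pos_idx)), replace=False)] = 1

# Step 1: train a classifier to predict the *label* s rather than the hidden class y.
X_tr, X_val, s_tr, s_val = train_test_split(X, s, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, s_tr)

# Step 2: estimate c = P(s = 1 | y = 1) as the mean score on held-out labelled positives.
c = clf.predict_proba(X_val[s_val == 1])[:, 1].mean()

# Step 3: correct the scores: P(y = 1 | x) is approximately P(s = 1 | x) / c, clipped to [0, 1].
p_y = np.clip(clf.predict_proba(X)[:, 1] / c, 0.0, 1.0)

# Rank the unlabelled pool by corrected score; a human reviewer would examine the
# top of this list, which mirrors the human-in-the-loop feedback described in the talk.
unlabeled_idx = np.flatnonzero(s == 0)
ranked = unlabeled_idx[np.argsort(-p_y[unlabeled_idx])]
print("Top-ranked unlabelled examples:", ranked[:10])
```

The score correction is what lets a small set of confirmed positives stand in for a fully labelled training set, which matches the described workflow of starting from known individuals; per the article, the final decisions are said to remain with human officers.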

In response to the video, the IDF said Colonel Yoav’s participation in the conference had been approved and denied any conflict with its recent statement. It clarified that its AI systems do not select targets for attack and that the database in question is not a list of operatives eligible to be attacked; the IDF does not deny that a database of operatives in terrorist organizations exists, but says its AI systems are not used to choose targets. The recent testimonies from intelligence officers confirmed that the Lavender tool was used to assist targeting operations, with humans involved in the authorization process, though some officers described that human oversight as minimal.

The vision of integrating AI and intelligence personnel described by Colonel Yoav aligns with the model for targeting operations envisioned by Unit 8200 chief Yossi Sariel in his book “The Human Machine Team.” The colonel highlighted how the IDF is transitioning from the “postcard age to the digital era,” in which data-science-driven solutions enable on-the-fly responses during battle, and he expressed curiosity about how future operations would unfold in the digital realm.

While the video and testimonies have raised questions and controversy surrounding the IDF’s use of AI to target terrorists, both Colonel Yoav and the IDF maintain that these tools are intended to assist human decision-making rather than replace it. The complex and evolving nature of warfare in the digital age is driving the adoption of advanced technologies, but their ethical implementation remains a crucial consideration. As AI advances, striking a balance between technological capability and human judgment is essential to ensure responsible use in military and intelligence operations.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.