Shortly after the Hamas attack on Israel on October 7, when an estimated 2,000 Hamas fighters breached the Israel-Gaza border in 29 different places, the Israeli security establishment allowed a narrative to form that it had very little or no intelligence about the invasion.

In the immediate aftermath of the invasion, in which 1,200 people were killed, this seemed surprising, particularly because of the reputation of Israel’s intelligence services.

An intelligence agency’s failure to collect information on an enemy’s specific intentions can have huge consequences. These failures can be excusable when the hostility between the two sides is so stark that it hinders collection and makes it vulnerable to misdirection.

However, it is becoming clear that Israeli military intelligence had collected specific information on how Hamas could invade. Additionally, they had evidence of what assets and techniques Hamas were likely to use, and which Israeli facilities and possessions would be targeted. From observing rehearsals, they also had information about the level of violence Hamas terrorists were willing to inflict.

Evidence suggests that about a year ago Israeli analysts had a copy of the Hamas attack playbook, the Jericho Wall document. This detailed how Hamas fighters would breach the border using paragliders, drones, and rockets, and what they would seek to attack. The October 7 invasion was a very close copy of this plan.

An intelligence analyst had also observed a rehearsal exercise in Gaza City, and drew the document and the exercise together to correctly assess the relevance of both. She showed remarkable insight when she suggested to her superiors that the rehearsal was not for a raid but for an invasion, according to evidence collected by the New York Times.

Why Was the Correct Assessment Overlooked?

The assessment about Hamas appears to have been dismissed for three key reasons.

First, a belief that Hamas had neither the capabilities to carry out the attack nor the intention to do so, because it would fall outside their historic pattern of behavior.

Second, these beliefs about Hamas were not thoroughly challenged within Israeli intelligence nor through sharing the assessment with international partners who might have had useful intelligence on this.

Third, Israeli defenses, be they deeply buried sensors, walls, or automatically defended sections of the border, were considered to be too strong for Hamas.

It is not clear from the information that has come to light so far whether the accurate intelligence assessment about Hamas was shared with the prime minister’s office, or with international allies, such as the U.S.

What Can Intelligence Agencies Learn From This?

This Israeli failure will be relatable for intelligence agencies around the world. Intelligence analysis is difficult to do accurately, and failures can have huge consequences.

Assumptions and biases need to be constantly challenged. The question that needs to be asked is: under what circumstances could this group mount such an attack?

This form of oppositional thinking is both basic and essential. Historical examples, such as Stalin’s rejection of clear intelligence about Germany’s imminent invasion (Operation Barbarossa), or U.S. groupthink around the attempt to unseat Fidel Castro, known as the Bay of Pigs, are prominent cases.

There are two elements of an intelligence assessment: the first is what people, skills, finance, and equipment the adversary possesses (capabilities), and the second is what the adversary wants to do with these assets (intentions).

If analysts believed that Hamas could not behave in this way, they would tend to look for intelligence that reinforced that view and exclude intelligence that refuted it. Some of this “groupthink” might have been diluted if Israel’s intelligence had been shared with allies who have larger groups of analysts and different intelligence sources.

The Five Eyes intelligence network (security information shared by Australia, New Zealand, the U.S., the UK, and Canada) was formed to provide this challenge function.

Overconfidence?

Intelligence failures often rest on human and technological frailties. Israel’s border defense gave analysts too much confidence in their ability to defend, regardless of whether they could identify the threats ahead of time. Similarly, failing to see that an adversary is adapting and evolving, and what that might mean for the threat it poses, is a significant danger to national security.

Getting too comfortable with biases and assumptions is dangerous. There are well-worn methods that create systems to ensure constant challenge. It is not clear whether these were employed in this case, but the lesson is worth rehearsing regardless.

The purpose of intelligence is to provide support to decision-makers. Good intelligence is clear and provides strong evidence for decisions. Failures often occur through a lack of coordinated information across agencies, as was the case before 9/11.

The response in the U.S., UK, and Western security organizations has been to create intelligence fusion centers, where representatives from all the relevant agencies work in one umbrella organization. This overcomes the problem of coordinating information between agencies.

There is no suggestion that this is a problem only in the Israeli intelligence community. Quite the opposite: it is an issue in the world of intelligence generally.

Intelligence failures can show where agencies have over-invested in some capabilities while under-investing in human intelligence or analytical techniques. These lessons are hard-won, and it is a sad reality of intelligence history that strategic failures often act as drivers of reform and improvement.

All intelligence agencies seek to avoid intelligence failures ahead of time with considerable investment in recruitment, training, and techniques. It remains the case that “black box thinking” — which seeks to make improvements from systematically understanding failures — is a feature of intelligence and security, even in the most capable agencies today.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Robert M. Dover is a professor of intelligence and national security at the University of Hull.
