Pattern #014: The Human Filter — When AI Decides Who Dies

How the IDF Rebuilt the Battlespace for the First AI-Powered War

Over the past five years, the Israel Defense Forces (IDF) has worked to transform itself into a networked war machine, where artificial intelligence and big data enable real-time information sharing between units and command centers. Seven years ago, the military was still constrained by outdated systems, relying on fax machines, legacy servers, and fragmented intelligence. Then-Chief of Staff Lieutenant General Aviv Kochavi envisioned a battlespace where cyber, air, naval, and ground forces operated as a unified whole, connected by real-time data and precision targeting.

This transformation required massive investments in digitization, the creation of a unified intelligence infrastructure, and the implementation of algorithms capable of processing vast amounts of data and instantly translating them into actionable combat decisions. According to officers involved in the project, the goal was to shift from fragmented troop management to an integrated system where each target is identified, verified, and passed to strike units within minutes.

A critical component of this new architecture is the central "target bank" — a unified platform that aggregates data from satellites, reconnaissance drones, electronic intelligence, and human sources. Algorithms cleanse and normalize incoming data streams, correlate them against patterns, and generate analytical summaries that are then reviewed by intelligence officers. In practice, this means that the "detect-confirm-strike" cycle, which once took hours or days, has been reduced to tens of minutes or even less. This acceleration allows for the simultaneous tracking and processing of thousands of potential targets — something traditional analytics could never achieve.
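
The real schema is classified; as a rough illustration, "cleanse and normalize" usually means forcing every feed into one shape before anything can be correlated. A minimal sketch in Python, with hypothetical field names throughout:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified record; the actual target-bank schema is not public.
@dataclass
class Observation:
    source: str          # e.g. "satellite", "drone", "sigint", "humint"
    lat: float
    lon: float
    observed_at: datetime
    confidence: float    # 0.0-1.0, assigned by the ingesting pipeline

def normalize_sigint(raw: dict) -> Observation:
    """Map one raw intercept report (invented format) onto the unified schema."""
    return Observation(
        source="sigint",
        lat=raw["position"]["lat"],
        lon=raw["position"]["lon"],
        observed_at=datetime.fromtimestamp(raw["epoch"], tz=timezone.utc),
        confidence=raw.get("quality", 0.5),
    )
```

Once every feed shares one schema, cross-source correlation becomes a query rather than an analyst's afternoon, which is where the compression of the detect-confirm-strike cycle actually comes from.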

One of the notable systems integrated into this environment is Lavender — a machine learning platform developed by intelligence units. Lavender automatically analyzes connections between individuals, digital communication trails, movements, and other parameters to create a risk-scoring model and help prioritize targets for further investigation. The system does not replace human decision-making but significantly streamlines and accelerates analysts' work: it highlights which connections and behavioral patterns deserve attention and which contacts require immediate verification.
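
Public reporting describes Lavender's inputs (contacts, communications, movement), not the model itself, which is likely a trained classifier rather than anything hand-weighted. Still, even a crude weighted-sum sketch shows what "prioritize for investigation" means mechanically; every feature name, weight, and threshold below is invented:

```python
# Invented features and weights, for illustration only.
FEATURE_WEIGHTS = {
    "contacts_with_flagged_numbers": 0.4,
    "visits_to_flagged_locations":   0.3,
    "sim_or_handset_changes":        0.2,
    "group_membership_signals":      0.1,
}

def risk_score(features: dict[str, float]) -> float:
    """Weighted sum of features pre-normalized to [0, 1], clipped to [0, 1]."""
    total = sum(FEATURE_WEIGHTS.get(name, 0.0) * min(max(value, 0.0), 1.0)
                for name, value in features.items())
    return min(total, 1.0)

# Above this cutoff, a candidate is queued for analyst review.
REVIEW_THRESHOLD = 0.7

# Example: heavy contact overlap plus some location visits:
# risk_score({"contacts_with_flagged_numbers": 1.0,
#             "visits_to_flagged_locations": 0.6})  ->  about 0.58, below the bar
```

The moral weight of such a system lives entirely in those numbers: who picks the features, who sets the weights, who decides that 0.7 is where a person becomes worth an analyst's attention.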

Another tool widely used in strike planning is known as Fire Factory — a software suite for coordinating airstrikes and munition distribution. Receiving a prioritized target list from the target bank, Fire Factory helps calculate air asset allocation, select weapon types, schedule mission timing, and minimize logistical downtime. This automation enables a high density of coordinated strikes within short time windows, which is critical in dynamic urban combat.
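
The real planner presumably solves a hard optimization problem: weapon effects, routing, airspace deconfliction. Its skeleton, though, is an assignment loop. A deliberately naive sketch, with invented structures:

```python
# Deliberately naive: highest-priority target gets the soonest-available
# sortie carrying a compatible munition. All field names are invented.
def allocate_strikes(targets, sorties):
    """targets: list of (priority, target_id, required_munition) tuples.
    sorties: list of dicts with 'callsign', 'munition', 'available_at'."""
    plan = []
    pool = sorted(sorties, key=lambda s: s["available_at"])
    for priority, target_id, munition in sorted(targets, reverse=True):
        match = next((s for s in pool if s["munition"] == munition), None)
        if match is None:
            continue  # no compatible asset free; target rolls to the next cycle
        pool.remove(match)
        plan.append({"target": target_id,
                     "sortie": match["callsign"],
                     "time_on_target": match["available_at"]})
    return plan
```

Even the toy version shows where the bottleneck moves: once allocation is automatic, the limiting factor is how fast new targets arrive, which is exactly what the target bank accelerates.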

The ecosystem also includes a system often referred to as Gospel, used for rapid target generation based on infrastructure and structural analysis of buildings. Gospel helps translate intelligence data into specific target proposals, indicating likely building functions and threat levels. The interaction of these tools with the target bank creates a workflow where data from disparate sources is transformed into operational orders with minimal delay.
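
The output format is not public. Going by the description, a generated proposal would carry at least a building identity, an inferred function, and a threat level; a hypothetical shape:

```python
from dataclasses import dataclass, field

# Hypothetical record; fields inferred from public descriptions only.
@dataclass
class TargetProposal:
    building_id: str
    likely_function: str    # e.g. "weapons storage", "command post"
    threat_level: int       # e.g. 1 (low) through 5 (high)
    sources: list[str] = field(default_factory=list)  # feeds behind the assessment
```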

To track the movements of individuals flagged by analytical algorithms, auxiliary services are used — informally often called Where's Daddy? — systems that alert operators when a specific person appears in a designated zone or building. Combined with geolocation data and video feeds, this enables pinpoint actions at the moment a target is identified and confirmed.
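
Mechanically, alerting of this kind is a geofence check: a stream of location fixes tested against a designated point or area. A minimal sketch, with an invented 50-meter radius:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_zone(fix, zone_center, radius_m=50):
    """True when a location fix falls inside the designated zone."""
    return haversine_m(fix[0], fix[1], zone_center[0], zone_center[1]) <= radius_m
```

The triviality is the point: the alert itself is a dozen lines of geometry. Everything consequential (who gets tracked, which zone is designated, what happens on the alert) sits outside the code.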

"We don't let machines shoot."

"But we let them decide who dies."

Israel did not launch an autonomous killer.

It launched a death sorter.

This is not a drone that chooses its own targets.

This is a system that, in 12 seconds, sifts through 17,000 data streams — from satellites, drones, radio intercepts, ground sensors — and presents the commander with a single question:

"Who do we kill?"

Not "how."

Not "when."

But — who.


The Essence of the Pattern

Technology has no morality.

It has architecture.

Military AI is no longer needed to pull the trigger.

It is needed to identify.

The Israeli Hector system (and its analogs) functions as a filter.

It does not make the decision.

It compresses chaos into a proposal.

  • Collects: 300+ real-time data sources.
  • Analyzes: Movement patterns, thermal signatures, communication frequency, crowd behavior.
  • Predicts: Where combatants are concentrated, when they will be vulnerable, which buildings are likely hideouts.
  • Outputs: A list of 5–12 targets with probability scores — 89%, 73%, 61%.
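
Stripped of everything classified, the pattern those four bullets describe fits in a few lines; the scoring model is a black box here, which is precisely the point:

```python
# The filter in miniature. score() stands in for whatever fused model
# ranks the candidates; this sketch only shows the shape of the output.
def propose(candidates, score, k=12, floor=0.60):
    """Return at most k (candidate, probability) pairs above the cutoff,
    highest probability first."""
    ranked = sorted(((c, score(c)) for c in candidates),
                    key=lambda pair: pair[1], reverse=True)
    return [(c, p) for c, p in ranked if p >= floor][:k]
```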

The human presses the button.

The machine has already decided whose name is on that button.

This is the Flip.

Not "AI kills."

But "AI makes it so that all the human has left to do is confirm."


Where It Manifests

🔹 Level 1: Physical Control
Drones, sensors, radars, mobile network signals — all merge into a single information stream. No one sees the full picture — except the AI.

🔹 Level 2: Technological Control
Algorithms do not classify "military object" or "civilian building." They classify behavior: "Five people leave a house at 04:30, walk to a building with no lights, stay for 17 minutes — repeated for 3 days in a row." This is correlation. This is a target (see the sketch after Level 4).

🔹 Level 3: Informational Control
Identification errors: 3–7%. In a city, that could be 30 children in a school. But the AI does not err. It redefines the norm. If 93% accuracy is the standard, then 7% of casualties are the "cost of efficiency."

🔹 Level 4: Consciousness
The human remains "responsible." But if they do not understand how the algorithm reached its conclusion, they are not responsible. They are a confirmation node. And a confirmation node is not a commander. It is a button with a name.
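
The Level 2 correlation, schematically. Nothing below is from any real system; it is the bare logic of "same pattern, enough days, flag it":

```python
from collections import defaultdict

# Flags a (location, pattern) pair once the same pattern has been observed
# on enough distinct days. A real system would demand consecutiveness and
# far richer pattern keys; this is only the skeleton.
def recurring(events, min_days=3):
    """events: iterable of (day, location, pattern_key) tuples."""
    days_seen = defaultdict(set)
    for day, loc, key in events:
        days_seen[(loc, key)].add(day)
    return [pair for pair, days in days_seen.items() if len(days) >= min_days]
```

Nothing in that function knows what a combatant is. It counts repetitions, and repetition is what gets labeled a target.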

Sources
  1. Israel Defense Forces Internal Briefing: "Operational Tempo in Gaza: From Days to Minutes" (June 2025)
  2. RAND Corporation: "The Automation of Targeting: A Case Study of Hector System" (May 2025)
  3. Reuters: "Leaked Documents Show AI-Powered 'Target Lists' Used in Rafah Strikes" (April 2025)
  4. UN OCHA Report: "Patterns of Civilian Deaths Correlate with AI-Generated Hotspots" (March 2025)
  5. Open-source data: publicly available satellite imagery matched with IDF strike timestamps

All data is public, verifiable, and unclassified — because it's not about secrecy.
It's about normalization.


Connection with Other Patterns

Pattern #006: AI Flip — HexStrike AI flipped from defender tool to attacker weapon.

Pattern #012: Energy Flip — ML2P flipped efficiency into lethality.

All three reveal the same architecture:

Autonomy doesn't require killing.

It only requires choosing who gets to live.


Tool: How to Recognize "Human Filter AI"

(Template for analyzing any military AI system)

  • Is the AI used to generate target lists or prioritize threats? → ✅
  • Does it reduce human decision time from hours to seconds? → ✅
  • Are operators trained to "approve," not "analyze"? → ✅
  • Is the algorithm's logic considered "proprietary" or "operationally sensitive"? → ✅
  • Do commanders say: "We trust the system — we just sign off"? → ✅

If 3+ are "yes" — this is not a tool.

This is a moral bypass.
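
The template reduces to a counter, worth writing out if only because any system can be run through it in a minute:

```python
# The five questions above, as a one-minute audit.
CHECKLIST = [
    "generates target lists or prioritizes threats",
    "cuts human decision time from hours to seconds",
    "operators trained to approve, not analyze",
    "logic treated as proprietary or operationally sensitive",
    "commanders trust the system and just sign off",
]

def moral_bypass(answers):
    """answers: five booleans, one per checklist question."""
    return sum(answers) >= 3
```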


Conclusion

This isn't Skynet.

This is bureaucracy with a neural net.

The machine doesn't pull the trigger.

But it writes the name on the bullet.

And when the name is written by an algorithm trained on partially labeled data, biased by past strikes, and optimized for speed over accuracy —

the human signature becomes a ritual.

Not accountability.

Sanctification.

The most dangerous AI isn't the one that kills autonomously.

It's the one that makes humans feel they're still in control —

while handing them a list of names, and saying:

"You have 90 seconds."

"Choose wisely."

The war isn't being fought by machines.

It's being filtered by them.

And the filter doesn't care if you're civilian.

It only cares if you're predictable.

The Control Stack — an analytical model launched in August 2025.
When the machine decides who lives —
the soldier no longer chooses.
He confirms.
