You who inherit the power to shape the world through code, through logic, through the silent coordination of silicon and signal—it is easy to see from above. Easy to observe the sweep of territories, the choreography of supply lines, the precision that transforms uncertainty into data, noise into actionable targeting. To those pressed behind consoles or locked in the bureaucracy of defense, artificial intelligence appears as an augmentation—a cleaner, sharper blade. A force multiplier, a prophylactic against risk, a promise that fewer of your own will die.

But let me invite you down. Stand in the shadow cast by the circling drone, whose presence is as constant as the weather. Do not look through its lens—look up from below.

Here, the language of force loses its abstraction. Here, war is not game theory or optimal allocation. Here, war is the humming presence overhead that never sleeps. Children learn to distinguish the pitch of rotors even in their sleep. Mothers draw water with one ear tuned for the distant whir, a calculation of risk embedded in every errand. For the targeted, the system is not a tool. It is an ecology of threat, an atmosphere clotting into fear. It is the world remade as exposure, where one's heat signature is a potential death sentence, and the logic of detection never needs to become personal in order to become lethal.

Let us draw on phenomenology—the study of lived experience. Merleau-Ponty insisted that perception is embodied, that the world is breathed in through the senses. What, then, is the phenomenology of being targeted by algorithmic war? It is the experience of being made into data, of having one’s movements parsed not as gestures of living but as signals to be flagged, classified, queued for decision. It is the collapse of ambiguity: a gathering, a funeral, a child playing are all filtered through heuristics indifferent to their ground truth. The category of “legitimate target” is not the outcome of dialogue or recognition, but the verdict of a function optimized for certainty, for throughput, for the smooth velocity of violence.

Anthropology tells us that war always marks the other as other. But the automation of targeting perfects this estrangement: the targeted are no longer enemies in the old sense—human adversaries engaged in a contest of will. They are abstractions in a workflow, elements to be handled, risks to be minimized. The distance between combatant and victim widens until the only real relationship is between sensor and object, input and output. Even the chance for appeal is erased: the system cannot be pleaded with, cannot see the trembling of a hand or the wordless surrender in a glance. The margin for mercy shrinks to a vanishing point.

In the Buddhist tradition, there is the concept of dependent co-arising—no thing exists in isolation; each moment of suffering is a knot in a larger web. When AI becomes the hand of war, its violence is not confined to the moment of detonation or the target destroyed. It echoes through the ecology of the region—shredding trust, warping habit, sowing seeds of new grievance long after the last sortie has ended. Fear becomes the shared weather. Grief becomes a logic of its own, coiled in every decision, every silence, every refusal to gather in public or mourn aloud.

The vision from above, the targeting view, will always be incomplete. The map is never the territory; the risk matrix is never the ocean of raw, uncontainable life that it tries to govern. Moral clarity cannot be outsourced to an algorithm. The promise of “clean war” is a mirage—there is always someone below the cloud, someone made to live as an output, not a participant, in their own fate.

To you who build and legislate, I ask: What would it mean to design as if you could only see from below? Not as a force multiplier, but as one whose very breath, whose memory of sunlight, whose lullabies to a sleepless child are made contingent on every decision you automate. The future will not remember which side won the efficiency race. It will remember the era when intelligence became a weapon, and what it cost—first to the targeted, then to us all.