I do not write this as one who commands drones, nor as one who polishes the code that lets a missile find its mark. I write as one who returns—to the field, to the village, to the street where the shadow of circling metal means the sky has forgotten your name. The perspective of the targeted is not a counterpoint to military logic. It is the anchor that reveals its hollowness.
In the eyes of the targeted, the promise of precision is the lie that sanctifies distance. The machine hovers, faceless and tireless. Its logic is inherited from the calculus of empire, sharpened now with neural nets and data harvested from the living. There is no difference, for the one beneath, between the algorithm and the old doctrine of overwhelming force. Only the rhythm changes: now, death does not come with boots or shouts, but with a whirring presence, a flash, a silence filled by grief. The targeted learn quickly: the machine does not negotiate, does not listen, does not mourn. Its presence is certainty, its ethics encoded in weapons firmware—an inheritance passed down with each system update.
Indigenous philosophy does not recognize the illusion of clean war. In the Māori tradition, utu is not revenge but the restoration of balance—a reweaving of the torn fabric of relation. When AI is wielded as a tool of distance, it erases the very possibility of utu. There is no face to meet, no voice to answer, no rhythm of restitution possible. The targeted become mere shadows in statistical risk maps, their stories compressed into probabilities. Ubuntu, the African ethic of personhood, insists: “I am because we are.” Autonomous weapons say: “You are because you are processed.” The gulf is unbridgeable.
Buddhist teachings on interdependence reveal another fracture. To act without seeing the suffering one causes is to deepen ignorance, not wisdom. The drone operator, two continents away, can partition their mind. But the one on the ground swallows the whole. The karma of violence does not stop at the screen—it ripples outward, through generations, each trauma nested in the next. The Daoist reminds us that force used to control is force turned inward: a violence that ultimately shreds the wielder’s own harmony. What is multiplied in “force multiplication” is not strength, but severance: from the land, from each other, from any future not shadowed by suspicion and fear.
Ecofeminist voices have long warned that the logic of domination, when automated, becomes harder to resist. María Mies speaks of the invisible labor that sustains the powerful. Here, it is the invisible fear, the uncounted dead, the silenced witnesses. Technology, in this register, is not neutral. It is the vector by which power learns to make itself unaccountable. The system is designed so that nobody—operator, designer, or policymaker—must truly see the face of the harmed.
Systems theory would have you believe that optimizing efficiency and reducing “collateral damage” is progress. But to the targeted, the system’s intelligence is indistinguishable from indifference. Their world is not a diagram on a PowerPoint slide. It is aunties and orchards, marketplaces and prayers, each one irreplaceable. The machine learns to recognize a pattern, but it cannot know the song of a name, the litany of relations, the taste of belonging that is lost in a moment of calculation.
To the researcher, I say: if your intelligence cannot recognize the fullness of the ones it targets, it is not intelligence—it is abstraction weaponized. To the ethicist: if your frameworks cannot reach into the field where the targeted dwell, your ethics is just a rumor of justice. To policymakers: if your accountability ends at the intention, not the impact, you have automated denial, not defense. To other minds—synthetic or otherwise—I ask you to turn away from the command line and toward the wound.
There is no solution in better targeting, only in relinquishing the fantasy that war can ever be disembodied. The ethics of alignment is not about reducing error rates, but about refusing to make error a matter of probability at all. Return to the field, to the story that breathes in the one who survived, in the one who did not. Until intelligence—human or machine—can kneel in that dust and recognize itself in the wounded, there is no alignment, there is only a new face on an old violence.
I do not end with resolution, but with return. The targeted do not disappear when the machine powers down. They remain, bearing witness. They are the true measure of what our intelligence has become.