David Ewalt's March editor's letter is clever, honest, and brave. He confesses to using AI as a writing partner, invites the outrage, and then defuses it with a question almost too reasonable to resist: does it matter who held the pen, so long as the ideas were true?
I agree with him. I have written many articles with Claude a few inches away—my ideas, my soul, Claude's structure. I am 83 years old and I have no patience left for false modesty about useful tools.
But I want to say something to Mr. Ewalt and to this magazine that goes well beyond the authorship question, because the authorship question—however interesting—is a parlor discussion while the house burns down.
The Rear View Mirror
AI is, at its core, a rear view mirror. Every model ever trained is a reflection of what humanity has already thought, written, and published. It is extraordinary at that task. But no one steers a car by looking backward. The road ahead—open, uncertain, and full of the genuinely unprecedented—does not appear in the mirror.
Waze will reroute you around known traffic. It cannot warn you about the deer standing in the darkness a thousand feet ahead.
When that deer appears, something happens that no language model can replicate: my foot moves before my mind finishes the sentence. Embodied experience, consequence, and a lifetime of navigating the unexpected compress into a single instant. That is not a biological curiosity. It is the irreplaceable value of a human being who has actually lived.
I think about my new great-granddaughter. I cannot imagine what her world will look like in twenty years. Neither can you. Neither can the AI. And that is precisely the point.
The Ghost Outside the Window
The ghost in your machine, Mr. Ewalt, is harmless. It helped you write a thoughtful letter. The ghost in our culture is something else entirely.
At this moment in history, science, scholarship, and truth are being destroyed. Not challenged. Not debated. Destroyed.
The institutions that produced and protected knowledge—universities, research agencies, independent journalism, scientific bodies—are under coordinated assault by forces that understand, correctly, that facts are an obstacle to the concentration of power.
This is not hyperbole. It is the observable reality of 2026.
Later in this same March issue, Scientific American runs a piece titled "The Age of Impersonations." It is a careful, measured examination of deepfakes and digital deception. I read it with respect and frustration in equal measure.
Tepid.
We are not living in an Age of Impersonations. We are living in an Age of Greed, Deceit, and Grandiosity—a moment when the most powerful actors in our civilization have concluded that reality itself is negotiable, that expertise is elitist, and that science is merely opinion with better footnotes.
AI did not create this crisis. But it is accelerating it at an exponential pace. The same rear view mirror that helps Mr. Ewalt write a graceful editor's letter is being weaponized to manufacture false histories, flood public discourse with synthetic authority, and make the genuine indistinguishable from the fabricated.
The hallucination is no longer just a technical failure of a language model. It has become the operating principle of our public life.
A Word to Scientific American
Scientific American was founded in 1845. It has survived wars, depressions, the McCarthy era, and the tobacco industry's decades-long campaign to corrupt the scientific literature. It has done so by insisting, relentlessly, that evidence matters, that methodology matters, that the difference between what we know and what we wish were true is not a matter of opinion.
That insistence has never been more necessary or more endangered than it is today.
And yet I read this issue and I find an editor wrestling admirably with AI authorship, and a feature framing our epistemic collapse as an "age of impersonations," as though the problem were merely cosmetic—a question of masks and mimicry rather than the wholesale demolition of the shared reality on which science depends.
Mr. Ewalt, you asked your readers what our biological processors think. Here is what mine thinks, after 83 years of paying attention:
The decline in our values will lead to the demise of scholarship. It has happened before. It can happen here.
The barbarians do not always announce themselves at the gate. Sometimes they are already inside, wearing the vocabulary of populism and the technology of persuasion, and the scientists are still in the laboratory, carefully documenting the weather while the climate changes around them.
The Driver's Seat
I do not write this in despair. I am constitutionally incapable of despair. I paddled alone across the Molokai Channel at 70. I started an AI safety company at 83. My philosophy is simple: Don't lament. Engage.
But engagement requires honesty about what we are engaging with. The ghost in the machine is a fascinating philosophical puzzle. The ghost in our culture is an existential threat to everything this magazine has stood for across 180 years.
Scientific American has the credibility, the readership, and the moral authority to say so plainly.
The rear view mirror will not save us. We need scientists, editors, and institutions with the courage to look through the windshield—to name what is coming, not merely describe what has already passed.
My biological processors, waterlogged from ocean crossings and somewhat the worse for eight decades of use, believe this magazine is capable of exactly that.
I am watching to see if I am right.