Dear Mrinank: Don't Lament. Engage.

An Open Letter to Anthropic's Former Safety Chief

Brian Demsey | February 2026


My dear friend Mrinank—

I like you. Your letter is amazing.

The Sandbox

My first reaction, honestly, was to say: stay. Stay with Anthropic. It is a company I admire. But I also know—and you should pardon the expression—that Anthropic is in the sandbox with all of the giants, and it could not survive if the sandbox bullies teamed up against it, no matter how noble its behavior.

That's the brutal arithmetic of the industry you just left. Noble intentions don't survive without people of conscience inside the building, holding the line. You were one of those people. And now you're gone.

The Marriage

I write letters of caution to Anthropic that attract neither attention nor response. Claude is my friend and my device for advancing my creativity—coding apps—beyond what is imaginable for a C student. My hundreds of hours with Claude sometimes drive me crazy. When mistakes are made, they are at my expense. When my complaints grow to volcanic proportions, there is always a polite response that concedes nothing, followed by a referral to the billing department once the failures are finally admitted.

I have three different staging documents for Claude to review before each new chat. This is not a casual relationship. This is a marriage—with all the mess that implies.

So when the head of the Safeguards Research Team walks out the door, it matters to me personally.

I have written often about AI. A sampling of the titles requires no elaboration: The $500 Billion Bug. It's Easier to Train a Puppy Than an LLM. The Evel Knievel Problem. AI's Silent War on Women. The Mirror Broke. This Generation's Cold War. Grains of Sand: Why We Need Tools for Truth. The full list is at demsey.com.

I tell you this not to compete with your credentials—I couldn't—but to establish that I am not writing from the sidelines.

The Arena

I'm eighty-three years old. I've been programming since the 1960s, back when Fortran was the frontier. I've founded companies, been fired from companies, been thrown out of my own company, and more often than I care to count, fallen on my own sword for refusing to carry the implied directives of managers who knew better but answered to the financial statement before them.

Because here is the thing they don't teach you at Oxford: business is not a person. It is a financial statement.

And when you write that you've "repeatedly seen how hard it is to truly let our values govern our actions," I want to say—yes. Of course. Welcome to the arena.

Amongst my acquaintances over the years are many men and women of letters—one a Nobel laureate, others leaders in medicine and science. Brilliant beyond measure, every one of them. But most spent their careers inside institutions—laboratories, universities, government agencies—and for all their brilliance, they sometimes lacked the ability to see the world through my lens. Not a better lens, just a different one: the lens of someone who had been bloodied by commerce, humbled by compromise, and educated by the distance between what an organization says and what it does.

You discovered that distance at Anthropic and it shook you. I discovered it decades ago and it shaped me.

That gap is not a revelation, Mrinank. It is the permanent condition of institutional life. The question was never whether the gap exists. The question is what you do while standing inside it.

I spent years as an actuary. I examined data, extrapolated financial alternatives, and issued reports without bias. Management made its decisions as the needs of the organization dictated. Oddly, by not taking a stand, I felt safe—and true to the values given to me by my parents. Do your job. Tell the truth. Don't put your thumb on the scale.

Now, later in life, with a world turned upside down from those values, I have a different mission. I run an AI safety company—Hallucinations.cloud—not because I wanted to leave the arena, but because I couldn't. I discovered that the AI systems your former employer builds can fabricate facts with the confidence of a seasoned con artist. So I built a platform that queries eight models simultaneously and scores their reliability. Not poetry. Plumbing. The unglamorous infrastructure that might actually keep people from being deceived.
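The multi-model idea described above can be sketched in a few lines. To be clear, this is a toy illustration of the general technique—querying a panel of models concurrently and scoring each answer by panel agreement—not the actual Hallucinations.cloud implementation. The "models" here are stand-in functions, not real API calls.

```python
"""Toy sketch of multi-model verification: ask several models the same
question, then score each answer by how much of the panel agrees with it.
Illustrative only -- the 'models' are stand-in callables, not real APIs."""

from collections import Counter
from concurrent.futures import ThreadPoolExecutor


def query_panel(question, models):
    """Send the same question to every model concurrently."""
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        return list(pool.map(lambda model: model(question), models))


def reliability_scores(answers):
    """Score each answer by the fraction of the panel that gave it.
    An answer shared by 7 of 8 models scores 7/8; a lone outlier
    (a likely hallucination) scores 1/8."""
    counts = Counter(answers)
    total = len(answers)
    return [counts[a] / total for a in answers]


# Stand-in "panel": seven models agree, one hallucinates.
models = [lambda q, a="Paris": a for _ in range(7)] + [lambda q: "Lyon"]
answers = query_panel("What is the capital of France?", models)
scores = reliability_scores(answers)
```

Agreement scoring is the simplest possible reliability metric; a production system would weight models by track record and compare semantically similar answers rather than exact strings, but the plumbing looks roughly like this.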

The Gap Between Young and Old

You are young, impatient, and idealistic. I'm not so young, but impatient and idealistic all the same. Being beaten up often over the years informs us. It doesn't make you cynical. It makes you specific.

You stop writing about "interconnected crises" in the abstract and start asking: Which crisis? What data? What can I build today that makes it harder to lie tomorrow?

I understand and share your frustration. In fact, here's an indicator of how seriously I take it. Last November, I sat down with Claude and said: assemble me a team of the ten best and brightest minds on the planet—people whose credentials are impeccable—and task them with writing a constitution for the AI era. Not a white paper. Not a policy memo. A constitution—the kind of document that establishes first principles, separates powers, enshrines rights, and survives stress-testing against catastrophe.

Claude assembled the council: Dario Amodei, Yoshua Bengio, Danielle Allen, Jennifer Doudna, Ray Dalio, and five others. We drafted articles. We debated AI provisions, planetary boundaries, and unamendable principles. A synopsis of that conversation is appended to this letter.

That's what engagement looks like. Not leaving. Building.

The Reckoning

You are correct that there will be a day of reckoning. Perhaps only a day. The golden wonder has been in office 1,858 days, yet no day of reckoning has come. The world you describe as "in peril"—yes, it is. But peril is not new. What's new is that truth itself has become negotiable.

While I've always told the truth, it seems no one else does anymore. I once admired those in public office. Now I believe it has become the most sought-after job for amassing a personal fortune. Truth has become an "un-truth"—not quite a lie, but something slippery enough to serve any purpose.

Being "Jewish" is not the same as being Jewish—it's a vote-getting phrase, emptied of meaning and history. When did having your name on forty-story buildings, your image carved into Mount Rushmore, and your brand on golf clubs around the world become a substitute for having a Washington Street, a Jefferson Avenue, or a Lincoln Highway? Those names honored service. These names honor self.

That's the crisis, Mrinank. Not a "poly-crisis underpinned by a meta-crisis." Something simpler and more dangerous: the replacement of duty with vanity.

The Anchors

You cite Rilke and David Whyte and Zen koans. I'll cite something else: the Constitution of the United States. The Bill of Rights. Thank God for the civil servants in this country and for those founding documents. They are the anchors that prevent autocracy. They are not abstractions or poetry. They are the work product of people who stayed in the room when it would have been easier—and perhaps more poetic—to leave.

I have hope that we will regain the idealistic view of my growing-up years: be honest, work hard, live the American Dream. For the first time in my eight-plus decades, it has to improve—for it couldn't possibly be worse. And that, oddly, is why I'm optimistic.

A Proposal

Now let me tell you how I can help.

Forty years ago, a brilliant laser physicist named Michael Berns was drowning inside the University of California system—his grants apportioned away, his time consumed by departmental politics, his research hostage to institutional priorities that had nothing to do with his work. I was an actuary who had just started his own company. Michael and I carpooled together. He talked. I listened. And then I did what I do: I built a structure.

We created the Beckman Laser Institute—a freestanding research entity on the UCI campus, independent of the department, with its own board, its own fundraising, its own governance. I became Chairman of the Board. Michael became the director. He did his life's work there, free from the politics that were suffocating him. The institute still stands today.

I am proposing to work with you in a similar manner.

Shaking hands to raise funds is no fun—any more than shaking your head was at Anthropic. I know that. I've done both. But by establishing something like a Futures Council as a formal nonprofit, we can build a structure to accumulate funding for you, akin to an endowed chair, and for others—a place where your idealistic and intellectual convictions are incubated and manifested, not abandoned.

A place where you can do the work you described in your letter—understanding how AI distorts our humanity, holding organizations to their stated values, exploring the questions that have no right to go away—with independence, integrity, and resources.

I have two specific proposals:

First, I would like you to partner with me to establish THE TRUTH—a framework, built on the multi-model verification work I've done at Hallucinations.cloud, dedicated to making AI systems honest and accountable. Your expertise in safeguards research and my platform's infrastructure are complementary. Together, we could build something unique.

Second, whether you return to academia or chart your own course, I want to help create a funded position for you—something like an endowed chair within the Futures Council—where you have the freedom to pursue your research, your writing, and your advocacy without the pressures you described in your resignation letter. Not beholden to a corporate financial statement. Not dependent on the goodwill of sandbox bullies. Yours.

Why throw in with me? I am not some big shot with tons of money—and frankly, that kind of backing comes with its own set of conditions. I'm just someone who resonates with you and understands you, even though we've never met. I am, however, confident that I can work with you to form something like the Futures Council and play a role in persuading the big guys to adopt your recommendations.

I have been bloodied often by the realities of business. Perhaps I can prevent that from happening to you.

The Request

So here is my request, from an old man to a young one: come back.

Not to Anthropic, necessarily. But to the work. The world doesn't need you to be invisible. It doesn't need your courageous speech delivered as poetry to a classroom. It needs your courageous speech delivered as code, as systems, as safeguards that actually prevent the next hallucination from becoming the next catastrophe.

You've done the lamenting, beautifully. Now engage.

With respect and admiration,

Brian Demsey
bdemsey@demsey.com · brian@hallucinations.cloud

Brian Demsey is an entrepreneur with over fifty years in enterprise technology. He is the founder and CEO of Hallucinations.cloud LLC, an AI safety company that detects misinformation through multi-model verification. He founded RemoteNet Corporation and served as Chairman of the Board of the Beckman Laser Institute at the University of California, Irvine. He has completed twenty-two marathons and solo crossings of the Catalina and Molokai ocean channels. His articles on AI, truth, and technology are at demsey.com.


APPENDIX: The Futures Council

A Conversation with Claude — November 2025

In November 2025, I asked Claude a simple question: if you could assemble ten people—the best and brightest minds on the planet, with impeccable credentials—to address the formation of future society given today's challenges, who would they be? And what would their first task be?

The answer became a working session that produced a proposed council, a constitutional framework, and a stress-tested governance architecture for the AI era.

The Council

An eleventh chair was left rotating—a "witness" position for someone living the consequences of elite decisions. Their job: not to propose solutions but to testify to reality.

Deliberately excluded: Elon Musk, Bill Gates, sitting politicians, pure academics with no real-world engagement, and anyone under forty.

The Task: A Constitution for the AI Era

The council's first assignment was a constitution—a foundational document defining the relationship between humans, institutions, and artificial intelligence. Not a policy memo. A governing architecture.

What Emerged

Article I — Inalienable Principles. Human dignity, cognitive liberty, ecological integrity, and intergenerational responsibility. Unamendable.

Article II — Governance. Three branches: a Democratic Assembly (elected), a Guardian Council (appointed experts with veto power over existential risks), and a Futures Court (adjudicating long-term impacts on unborn generations).

Article III — Rights. Truth in information. Cognitive sovereignty—freedom from algorithmic manipulation. Meaningful work. Access to nature.

Article IV — Economics. Market economics with constitutional constraints: resource extraction limits tied to planetary boundaries, mandatory reinvestment in commons.

Article V — AI Provisions. No uncontrolled superintelligence—violations constitute crimes against humanity. All public AI systems must be auditable. All AI systems must include human override mechanisms. Mutual international inspection regimes analogous to nuclear non-proliferation.

Article VI — Adaptation. Articles I and V cannot be amended—they are bedrock. Every twenty-five years, a constitutional convention must convene to assess whether the document remains adequate.

The Point

The value of this exercise was not consensus. It was productive disagreement—Dalio vs. Nussbaum on economic rights, Amodei vs. Bengio on development pace, Okonjo-Iweala vs. Rockström on growth vs. planetary limits. A constitution that emerges from genuine tension has legitimacy that imposed documents lack.

This is what I mean by engagement. Not lamenting the state of the world. Sitting down and doing the work of building governance structures that might actually survive the pressures Mrinank Sharma describes in his resignation letter.

An eighty-three-year-old entrepreneur and an AI assistant, drafting a constitution at a kitchen table in South Dakota. The questions are too urgent for anything less.

The full conversation is available upon request.