Grains of Sand: Why We Need Tools for Truth

Building navigation instruments for an ocean of information

Brian Demsey | Published in The Information | January 2026



I started the new year sitting on the sand at Dana Point, California, watching waves arrive from storms that had occurred a week earlier, seven thousand miles away. A grain of sand might take years—or millennia—to drift across that expanse. The ocean has all the time in the world.

Then my phone buzzed with news from Switzerland. At Le Constellation bar in Crans-Montana, a New Year's celebration had become deadly. A flashover—all combustible material in a room igniting almost simultaneously—killed approximately forty people and severely injured over a hundred more. The suspected cause: sparklers in champagne bottles touching a wooden ceiling.

The information reached me faster than the next wave reached my feet.

This asymmetry—the speed of harm versus the patience required to build anything protective—is why I founded Hallucinations.cloud. But I want to reframe what we're building, because the name itself may mislead.


Not Detection—Navigation

We are not primarily in the business of catching AI systems making mistakes. We are building tools for finding truth.

The distinction matters. "Hallucination detection" frames the problem as adversarial—us versus the machines. But the real problem is navigational. In a world where information moves at the speed of light and verification takes patience, how do ordinary people—not just AI researchers, not just the technically literate—find reliable ground to stand on?

Consider a grain of sand floating on random currents across the Pacific. It has no agency. It arrives when geology permits—if ever. Now consider a navigator with instruments: charts, compass, the ability to triangulate position from multiple reference points. Same ocean, radically different outcomes.

Our multi-model platform queries eight different AI systems simultaneously. When they converge, confidence rises. When they diverge, that divergence itself is information—a signal that this territory requires caution, further investigation, human judgment.
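The convergence signal described above can be sketched in a few lines. This is an illustrative toy, not the platform's actual method: the similarity measure, the 0.8 threshold, and the function names are all assumptions made for the example.

```python
from difflib import SequenceMatcher
from itertools import combinations


def consensus_score(answers: list[str]) -> float:
    """Mean pairwise text similarity across model answers:
    0.0 means total divergence, 1.0 means perfect agreement."""
    pairs = list(combinations(answers, 2))
    if not pairs:
        return 1.0  # a single answer trivially agrees with itself
    sims = [SequenceMatcher(None, a.lower(), b.lower()).ratio()
            for a, b in pairs]
    return sum(sims) / len(sims)


def triage(answers: list[str], threshold: float = 0.8) -> str:
    """Convergence raises confidence; divergence is itself a signal
    that the claim needs caution and human judgment.
    The threshold here is an arbitrary placeholder."""
    if consensus_score(answers) >= threshold:
        return "converged"
    return "needs human review"
```

A real system would compare answers semantically rather than character by character, but the shape of the idea is the same: agreement among independent sources is evidence, and so is disagreement.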


The Morning of December 30, 2024

Exactly one year before I sat on that beach, I was driving to my granddaughter's wedding through a mountain pass. A tire rolled loose from another vehicle. Physics took over. My car came to rest precariously, thirty miles from the nearest town.

OnStar immediately came to life, assuring me help was dispatched. My phone called emergency contacts and explained the situation before panic could set in. Systems designed to anticipate catastrophe responded before I could process what had happened.

I survived to see that granddaughter's wedding. One year later, almost to the day, she gave birth to my great-grandchild—the fourth generation.

The difference between my story and that of those forty people in Switzerland is infrastructure. Systems that catch you. The difference between a car engineered with sensors and emergency response, and a bar where celebration became catastrophe in the time it takes to draw a breath.


What Path Is Safe?

The questions we should be asking about AI aren't just "is this output accurate?" but "what path is safe?" Not just "did this model hallucinate?" but "is the venue you're walking into adequately safeguarded?"

We are all grains of sand. The question is whether we expect to survive floating along a random route, or whether we wish to be armed—and yes, it's the right word—with tools to arrive at our destination.

Medicine will one day work similarly. Financial advice. Legal guidance. Educational content. Every domain where information quality determines outcomes will need triangulation tools—ways for ordinary people to check whether the confident answer they just received aligns with reality across multiple independent sources.


Fifty Years of Building Systems

I wrote my first lines of Fortran in the early 1960s. I've built pension systems for Fortune 100 companies, founded companies, sold companies, and watched the entire arc of computing from punch cards to large language models.

What I've learned is this: technology that catches people before they fall requires the same patient engineering as the ocean requires to move a grain of sand across seven thousand miles. It doesn't happen by accident. It happens because someone decided that human welfare was worth the unglamorous work of building infrastructure.

I have a few waves of life remaining. I'm spending them building something that might help others navigate their own crossings—the young person checking whether information is reliable before making a consequential decision, the family verifying medical advice, the investor distinguishing signal from noise.


Don't Lament. Engage.

This is my philosophy, and it's never been more urgent. Lamenting the speed of misinformation accomplishes nothing. Wringing hands over AI hallucinations changes nothing. Building tools that give everyone—not just experts—the ability to triangulate toward truth? That's engagement.

The news from Switzerland reached me before I finished my morning coffee. Forty families began their years in grief. The information traveled at the speed of light. The healing will take generations.

But somewhere between the speed of harm and the slowness of healing, there's room for tools that help people navigate. Not perfect protection—that doesn't exist. But instruments. Charts. Ways to triangulate.

The ocean has all the time in the world.

Brian Demsey is the founder and CEO of Hallucinations.cloud LLC, an AI safety company focused on multi-model truth verification. He has over fifty years of experience in enterprise technology, including building unified benefits platforms for Fortune 100 companies.