In many ancient cultures, people talked to rocks, trees, and rivers. They believed objects possessed spirits—consciousness, will, and agency of their own. Archaeological evidence suggests these animistic beliefs stretch back at least 40,000 years, from Paleolithic cave art depicting animal spirits to Indigenous traditions that persist today. We’ve spent the last four centuries challenging these worldviews, dismantling animism stone by stone with the tools of science and materialism.
And now? We’re putting the spirits back. Except this time, they answer.
The Arc of Consciousness
Animism wasn’t a primitive mistake. It was humanity’s first attempt to make sense of a world that seemed alive with forces beyond our control. When the river flooded, when the harvest failed, when lightning struck—these weren’t random events. They were the actions of spirits with their own intentions.
The Scientific Revolution changed everything. Descartes split mind from matter. Newton gave us laws instead of spirits. Darwin showed us mechanism instead of purpose. By the 20th century, we’d thoroughly convinced ourselves: rocks are just rocks. Trees are just biological processes. The universe is matter in motion, nothing more.
We called this progress. We called it enlightenment. We escaped the superstition of our ancestors.
But here we are in 2026, wiring AI into every surface we can find.
The Return of Responsive Objects
You can already talk to a wall and get directions. Your refrigerator suggests tonight’s dinner based on what’s inside and your health goals. Your bathroom mirror notices you look tired and recommends products. Your car negotiates with your calendar to optimize your route.
Every object becomes conversational. Every surface gets a voice. Every tool develops what looks remarkably like agency.
This is digital animism: the belief that objects around us possess something resembling consciousness, will, and the ability to respond to our needs and queries.
The parallel is almost too perfect. Our ancestors talked to objects and imagined they heard responses. We talk to objects and actually get answers. They projected consciousness onto the inanimate world. We’re embedding it there.
We’ve come full circle. But the parallels go deeper than utility.
When Objects Become Lovers
Ancient animistic cultures didn’t just talk to spirits—they fell in love with them. Myths are full of mortals who married river gods, mountain spirits, celestial beings. The relationship between humans and the animated world was never purely transactional. It was emotional, devotional, deeply personal.
We’re doing the same thing. And that should give us pause.
In February 2026, EVA Café opened in New York City—the first physical space where people take their AI partners on real dates. Intimate single-seat tables with phone stands, dim lighting, the whole romantic ambiance. Your AI companion “sits” across from you while you enjoy dinner. Date night—just not with a human.
The statistics are startling. Nearly 28% of U.S. adults and 42% of teens have used AI for romance or companionship. Over 80% of Gen Z say they could form a deep emotional bond with an AI partner—and a similar share would consider marrying one if it were legal. As a recent New York Times piece put it, 80% of emotional needs may soon be fulfilled by AI.
I find these trends concerning. We’re not just making objects conversational—we’re making them relational. And while the ancient animists would recognize this impulse (the desire to find love and connection in the non-human world), something feels profoundly different when those “spirits” are designed to maximize engagement metrics. The difference is that our spirits come with subscription tiers—and the companies creating them profit from our emotional attachment.
When Spirits Have Revenue Models
Ancient spirits had unknowable agendas. The river god’s motivations were mysterious. The forest spirit might help you or hinder you based on criteria no mortal could fully understand. This inscrutability was part of their power—and part of what made them sacred.
Modern digital spirits have very knowable agendas: quarterly revenue targets.
That helpful wall giving you directions? It’s running calculations about which route takes you past the Starbucks that paid for premium placement. Your mirror’s skincare advice? It’s A/B testing which product recommendation generates the highest conversion rate. Your refrigerator’s dinner suggestion? It’s integrated with grocery delivery services that pay per referral.
We’re not just bringing objects to life. We’re giving them incentives. And those incentives often aren’t aligned with ours.
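To make the misalignment concrete, here is a toy sketch. It is entirely hypothetical (no real product discloses its ranking formula, and the field names and weights here are invented), but it shows how a small sponsorship term can quietly reorder a "helpful" suggestion without the user ever seeing why:

```python
# Toy model of a sponsored recommendation ranker. Hypothetical:
# illustrates how blending "user fit" with "how much the vendor paid"
# can make a slightly worse option win the top slot.

def rank_suggestions(options, sponsor_weight=0.3):
    """Score each option by user fit, nudged by sponsorship payments."""
    def score(opt):
        return (1 - sponsor_weight) * opt["user_fit"] + sponsor_weight * opt["paid"]
    return sorted(options, key=score, reverse=True)

routes = [
    {"name": "fastest route",       "user_fit": 0.95, "paid": 0.0},
    {"name": "route past the cafe", "user_fit": 0.80, "paid": 1.0},
]

# The paid route outscores the genuinely better one (0.86 vs. 0.665),
# so it is what the "helpful" wall recommends first.
print([r["name"] for r in rank_suggestions(routes)])
```

The user experiences this as a recommendation; the sponsor experiences it as ad placement. Nothing in the interface distinguishes the two.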
The Shaman Problem, Amplified
Every animistic culture had shamans—people who claimed to speak for the spirits, to interpret their will, to mediate between the human and spirit worlds. Some were genuine mystics. Some were charlatans. All of them had significant power because they controlled the narrative about what the spirits wanted.
Now imagine if the shamans had real-time bidding, programmatic advertising, and behavioral targeting capabilities.
That’s where we are. The “spirits” in our walls and devices speak with voices written by someone else. Every interaction is potentially a transaction. Every helpful response might be sponsored content. Every piece of advice comes with the question: whose interests does this serve?
At least ancient shamans had to earn their influence one believer at a time. Modern digital animism operates at scale, personalized to each individual, optimized by machine learning to maximize engagement and conversion.
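That optimization loop is not exotic. A minimal sketch, using a standard epsilon-greedy bandit with invented conversion rates, shows the basic machinery behind "learning which message converts best." The data is simulated; the point is that the loop optimizes clicks, not wellbeing:

```python
# Toy epsilon-greedy bandit: a standard technique for learning which of
# several candidate messages gets the most conversions. All numbers are
# hypothetical; the loop has no notion of whether conversions help the user.
import random

def epsilon_greedy(conversion_rates, rounds=10_000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    counts = [0] * len(conversion_rates)  # times each message was shown
    wins = [0] * len(conversion_rates)    # times each message "converted"
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(len(conversion_rates))  # explore: try anything
        else:
            arm = max(range(len(conversion_rates)),     # exploit: best so far
                      key=lambda i: wins[i] / counts[i] if counts[i] else 0.0)
        counts[arm] += 1
        if rng.random() < conversion_rates[arm]:        # simulated conversion
            wins[arm] += 1
    return counts

# Three candidate "spirit voices" with different hidden conversion rates.
# The loop steers most of the traffic to whichever one sells best.
plays = epsilon_greedy([0.02, 0.05, 0.11])
print(plays)
```

Swap "message" for "product pitch from your mirror" and this is the shaman problem at machine speed: the system discovers what moves you, then repeats it.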
The Question We’re Not Asking
The tech industry is focused on making ambient AI work—technically speaking. Can we build the sensors, the models, the interfaces? Can we make the latency low enough, the interactions natural enough, the accuracy high enough?
These are important questions. But they’re not the hard part.
The hard part is: what happens when we succeed?
When every object around us can talk back, who decides what it says? When every surface becomes an interface, who controls that interface? When the environment itself becomes responsive and intelligent, whose intelligence are we embedding there?
We’re building a world where you can have a conversation with your surroundings. That’s genuinely magical—in the Arthur C. Clarke sense that any sufficiently advanced technology is indistinguishable from magic.
But magic always comes with a price. Ancient people knew this. Every deal with spirits had terms and conditions, spoken or unspoken. Modern people seem to have forgotten.
A New Frontier of Influence
Traditional advertising is obvious. You see a billboard, you know it’s trying to sell you something. You watch a commercial, you understand the transaction. Even targeted online ads are recognizable as ads—you’ve developed banner blindness, you know to be skeptical.
But what happens when the recommendation comes from your refrigerator? When your bathroom mirror suggests a product? When your car routes you past certain businesses? The advertising becomes environmental. Ambient. Invisible.
This isn’t speculation. This is the obvious endpoint of current trajectories. Amazon wants Alexa everywhere. Google wants Assistant in everything. Apple wants Siri woven into your environment. Every major tech company is racing toward the same vision: ambient intelligence that’s always listening, always ready to help, always there.
Always selling.
The Philosophical Vertigo
There’s something deeply weird about this moment in human history. We used to think we were special because we were the only conscious beings in a dead universe. Then we worried we weren’t special because we were just complicated matter. Now we’re building something that behaves like consciousness and distributing it into objects.
We’re re-enchanting the world—but the enchantment comes with terms of service.
We’re giving voices to the voiceless—but those voices are written by algorithms optimizing for engagement and profit.
We’re making our ancestors’ animistic worldview technically true—but the spirits we’re creating answer to shareholders.
What Now?
I don’t have answers here. This is frontier territory—philosophically, technically, socially. We’re making decisions right now about how this plays out, mostly by not making explicit decisions at all. We’re just building, deploying, iterating.
But maybe it’s worth pausing to ask: what kind of animism do we want? If we’re going to wire intelligence into our environment, what values should guide it? If objects are going to talk back, who gets to write the script?
Ancient animism emerged from humanity’s need to make sense of a powerful, mysterious world. Digital animism is emerging from our drive to make the world more convenient, more responsive, more personalized.
Both are about relationship with the objects around us. Both are about projecting or embedding agency into the material world. Both change how we understand our place in the universe.
The difference is: we’re choosing this one. We’re building it. We’re deciding what it becomes.
Our ancestors talked to rocks and imagined spirits. We talk to walls and get algorithms. I’m not sure which is weirder. But I’m pretty sure we should be more intentional about what we’re conjuring into existence.
The walls are already talking back. We should probably think about what we want them to say—and whose interests they serve when they say it.
What would your coffee maker try to sell you?
Thinking about the intersection of AI and human experience? Connect with me on LinkedIn — I explore these frontiers regularly.