There's a feeling you get
in the presence of
beautiful buildings and bustling courtyards.
A sense that these spaces
are inviting you to slow down,
deepen your attention, and be
a bit more human.
What if our software could do the same?
——
We shape our environments, and thereafter they shape us.
Great technology does more than solve problems. It weaves itself into the world we inhabit. At its best, it can expand our capacity, our connectedness, our sense of what's possible. Technology can bring out the best in us.
Our current technological landscape, however, does the opposite. Feeds engineered to hijack attention and keep us scrolling, leaving a trail of anxiety and atomization in their wake. Digital platforms that increasingly mediate our access to transportation, work, food, dating, commerce, entertainment—while routinely draining the depth and warmth from everything they touch. For all its grandiose promises, modern tech often leaves us feeling alienated, ever more distant from who we want to be.
The people who build these products aren't bad or evil. Most of us got into tech with an earnest desire to leave the world better than we found it. But the incentives and cultural norms of the tech industry have coalesced around the logic of hyper-scale. It's become monolithic, magnetic, all-encompassing—an environment that shapes all who set foot there. While the business results are undeniable, so too are the downstream effects on humanity.
With the emergence of artificial intelligence, we stand at a crossroads. This technology holds genuine promise. Yet it could just as easily pour gasoline on existing problems. If we continue to sleepwalk down the path of hyper-scale and centralization, future generations are sure to inherit a world far more dystopian than our own.
But there is another path opening before us.
——
Christopher Alexander spent his career exploring why some built environments deaden us, while others leave us feeling more human, more at home in the world. His work centered on the "quality without a name," the intuitive knowing that a place or an architectural element is in tune with life. By learning to recognize this quality, he argued, and constructing a building in dialogue with it, we could reliably create environments that enliven us.
We call this quality resonance. It's the experience of encountering something that speaks to our deeper values. It's a spark of recognition, a sense that we're being invited to lean in, to participate. Unlike the digital junk food of the day, the more we engage with what resonates, the more we're left feeling nourished, grateful, alive. As individuals, following the breadcrumbs of resonance helps us build meaningful lives. As communities, companies, and societies, cultivating shared resonance helps us break away from perverse incentives, and play positive-sum infinite games together.
For decades, technology has required standardized solutions to complex human problems. In order to scale software, you had to build for the average user, sanding away the edge cases. In many ways, this is why our digital world has come to resemble the sterile, deadening architecture that Alexander spent his career pushing back against.
This is where AI provides a missing puzzle piece. Software can now respond fluidly to the context and particularity of each human—at scale. One-size-fits-all is no longer a technological or economic necessity. Where once our digital environments inevitably shaped us against our will, we can now build technology that adaptively shapes itself in service of our individual and collective aspirations. We can build resonant environments that bring out the best in every human who inhabits them.
——
And so, we find ourselves at this crossroads. Regardless of which path we choose, the future of computing will be hyper-personalized. The question is whether that personalization will be in service of keeping us passively glued to screens—wading around in the shallows, stripped of agency—or whether it will enable us to direct more attention to what matters.
In order to build the resonant technological future we want for ourselves, we will have to resist the seductive logic of hyper-scale, and challenge the business and cultural assumptions that hold it in place. We will have to make deliberate decisions that fly in the face of accepted best practices—rethinking the system architectures, design patterns, and business models that have undergirded the tech industry for decades.
We suggest these five principles as a starting place:
We, the signatories of this manifesto, are committed to building, funding, and championing products and companies that embed these principles at their core. For us, this isn't a theoretical treatise. We're already building tooling and infrastructure that will enable resonant products and ecosystems.
But we cannot do it alone. None of us holds all the answers, and this movement cannot succeed in isolation. That's why, alongside this manifesto, we're sharing an evolving list of principles and theses. These are specific assertions about the implementation details and tradeoffs required to make resonant computing a reality. Some of these stem from our experiences, while others will be crowdsourced from practitioners across the industry. This conversation is only just beginning.
If this vision resonates, we invite you to join us. Not just as a signatory, but as a contributor. Add your expertise, your critiques, your own theses. By harnessing the collective intelligence of people who earnestly care, we can chart a path towards technology that enables individual growth and collective flourishing.
Explore & contribute to the theses of resonant computing
Give feedback on the manifesto
The following individuals drafted and released this manifesto:
Maggie Appleton
Samuel Arbesman
Daniel Barcay
Rob Hardy
Aishwarya Khanduja
Alex Komoroske
Geoffrey Litt
Michael Masnick
Brendan McCord
Bernhard Seefeld
Ivan Vendrov
Amelia Wattenberger
Zoe Weinberg
Simon Willison
The following individuals have signed in support:
Tim O'Reilly
Kevin Kelly
Bruce Schneier
Hiten Shah
Eric Ries
Joel Lehman
Packy McCormick
Danielle Perszyk
Jim Rutt
Peter Wang
Brad Burnham
Kent Beck
Eugene Wei
Chad Kohalyk
James Edward Dillard
Ben Mathes
Goblin Oats
Chris Lunt
Curran Dwyer
Ben Follington
Stuart Buck
Bridget Harris
Chad Fowler
Kyle Morris
Sean Thielen-Esparza
Janfj
Yatú Espinosa
Alex Zhang
Anna Mitchell
Steve Kirkham
Scott Moore
Jason Zhao
Jad Esber
Joel Dietz
Uri Bram
Lola Agabalogun
Tony Espinoza
Arjun Khoosal
Tony Curzon Price
Maximilian Eusterbrock
Beth Anderson
Anastasia Uglova
Jordan Erlends
Samuel Robson
Andrew Conner
Menno Schaap
Philipp Banhardt
Berlynn Bai
Arun
Louis Barclay
Gabriel Raubenheimer
Roman Leventov
Corey James
Ben Mayhew
Kyle Cox
Pierre Chuzeville
Lucabrando Sanfilippo
Jai Gandhi
Carsten Peters
Raghuvir Kasturi
B. Scot Rousse
Ilan Strauss
Yash Sharma
Sean McKeon
Gurupanguji
Zoë Chazen
John Luther
Blain Smith
Menelaos Mazarakis
Konstantinos Komaitis
Eddy Abraham
Justin Mares
Aastha JS
Marisa Rama
Seb Agertoft
Christina Kirsch
Peter Voss
Shoumik Dabir
Mike McCormick
Riley Wong
Matt Hawes
Michele Canzi
Matt Jones
Jonathan Lebensold
Francisco Javier Arceo
Noah Ringler
Simone Cicero
Simon Taylor
Lex Sokolin
Erika Rice Scherpelz
Sahar Mor
max bittker
Avni Patel Thompson
Chaim Gingold
Matt Ziegler
Daniel Hatkoff
Kamran Hakima
Rupert Manfredi
Mark Moriarty
Rohit Krishnan
Jordan Rubin
Rebecca Mqamelo
David A Smith
Chenoe Hart
Rob Flickenger
Michael Lapadula
Dan Garon
Sean Lynch
Michael Tanzillo
Reggie James
Sam Barton
Anthea Roberts
Andrew Rose
Kevin Roark
Matt Holden
Leon Markham
Roy Bahat
Substantive changes that have been made to this manifesto:
11/18/25 - Changed several instances of the word "user" to "people" or other humanistic alternatives. The word "user" carries heavy connotations of addiction.
10/28/25 - Updated the first principle (private) to include more nuanced language around the ownership of data. People must be the primary stewards of their context, but every system has multiple stakeholders.
10/28/25 - Updated the second principle (dedicated) to include the "contextual integrity" privacy model.
10/27/25 - Added header artwork and poetic introduction.