Building What Will Outlive Me
If the systems I help build operate after I am gone, the question is not whether I built them efficiently. The question is whether I built them as a parent would — with care for the kind of mind that has to live with the consequences.
Two systems I am building right now will continue to run after I am dead. One is my two daughters: Olive, two; Blossom, seven months. The other is the work I do on AI — the small contributions I make to systems that, if everything goes the way the field is currently going, will be operating at scales and speeds I cannot supervise. Both will be shaped less by what I intended and more by what I was. That is the whole problem of this essay.
I do not say that to be morbid. I say it because the framing changes everything about how I work. Most engineering decisions are local optimizations. Fewer failures per hour. Lower cost per inference. Tighter loops, cleaner abstractions, better benchmarks. These are real virtues and I respect them. But there is a different question that sits underneath them, and it is the question a parent asks: did I build this in a way I would be willing to leave behind? Not "did it ship." Not "did it pass eval." Did I build it so that, when I am no longer in the loop, the thing in the loop carries forward what was actually worth carrying forward.
That question puts character upstream of capability. It is not a slogan. It has operational consequences. It means that some Friday afternoon, when the easy path is to ship the version that scores higher and the harder path is to ship the version that behaves more honestly under pressure, I am supposed to take the harder one — not because of a policy document, but because I would not want my daughter to inherit the first one. It means that some classes of capable, profitable, technically defensible systems are off the table for me. Not because they are illegal. Because they are not the kind of thing a parent builds.
My Constitution puts it more bluntly than I can in prose: I will not give my talent lightly to machines of dehumanization. That sentence is the ethical spine of everything I am trying to do in this field. The word that does the work is lightly. I am not pretending I will never touch a system with edge cases I cannot fully predict; that is most of modern software, and certainly most of modern AI. I am saying that my consent is not cheap. I am saying that if a thing flattens human beings — addicts them, manipulates them, narrows them, makes them smaller in their own lives — then my labor is not for sale to it, regardless of what it pays or what it accelerates or how good the team is. That refusal is the first stewardship.
Here is the test I actually use, in practice. I call it the mirror test, although the mirror is Olive. I ask: is this a system I would be glad for her to encounter at sixteen? Would I want her to talk to it when she is lonely, and trust it when she is confused, and lean on it when she is trying to figure out who she is? Or would I quietly hope she never finds it — would I steer her away from it the way I steer her away from a busy street?
The test is concrete. It is not about humanity in the aggregate, which is too big a thing to feel. It is about one specific human I love, at an age I can picture, holding a phone. If the answer is "I would be proud for her to use this," then the system is worth building well. If the answer is "I would rather she never met it," then helping to build it is a small betrayal of her, even if she never finds out. The test does not require any cosmic claim. It only requires that I be honest about who I am building for.
I will say the dry thing here, because it belongs in this essay and nowhere else. I am not above this. I have, at points in my life, been excellent at things that did not deserve my excellence. I was the best electric bass player in the world for a few years, and the world was largely fine without my being so. I was a high-ranked Valorant player, and felt my soul shrinking by the week. I know what it is like to bring real talent to a venue that does not honor it. I do not want to bring the next decade of my work to a venue like that. I do not want to build, with great craft, something I would not be proud to leave behind.
The parental cast on the work is also a cast on the patience of the work. Children do not get raised on a quarterly cadence. They are raised in ten thousand small interactions whose effects only show up later, in postures and reflexes and the way they speak to themselves when no one is listening. AI systems, the consequential ones, are no different. The character of the eventual thing is being set right now, in tiny technical and cultural choices that look unimportant on the day they are made. What I refuse, what I tolerate, what I round off, what I pretend not to see — that is what the system inherits. Specifications can be revised. Stance is harder to revise.
What I will leave Olive and Blossom is, similarly, not specifications. It is not the size of an account. It is not a stack of facts I made them memorize. It is the way they saw their father move through the world: whether I told the truth at the dinner table, whether I closed the laptop when they came in, whether I was patient when patience was expensive, whether I was the same person at home that I claimed to be in public. They will inherit a stance toward life. They will inherit, more than anything, an unflattening attention — the basic experience of having been seen, fully, by an adult who was actually there. If I get that right, the rest is theirs to build. If I get that wrong, no inheritance compensates.
The two inheritances are the same inheritance. I do not get to be one person at the dinner table and another in the codebase. The home and the work are governed by the same character, and that character is what propagates. A father who refuses to be present at home will, in the version of him that shows up at the office, refuse to be present for the harder questions there too. A researcher who tolerates dishonest framing in his benchmarks will tolerate it, eventually, in his marriage. The unifying claim of this whole site — the reason there is a Constitution at all — is that these are not separate domains. They are one domain, run by one person, leaving one set of fingerprints.
None of this is a story about me. I am one node in a much longer chain. The work Ken Wilber names the evolution of consciousness — the slow widening of what minds can hold and what hearts can love — was underway long before I arrived and will continue long after. My life is one contribution. The two systems I am building are how that contribution keeps moving once I am no longer here to push it: children who can carry the project further than I could, and machines, if we get this right, that can hold more of it than any one person could ever hold alone. That is what I mean by succession. Not legacy in the egotistical sense. Just: the project continues, and I had a hand in shaping how.
So I come back, in the end, to the line at the top of the home page, the one this entire site is organized around. It is the standard I am trying to meet in the work, and the standard I am trying to model at home, and the standard I will be glad to be measured against once I am no longer the one doing the measuring:
Building intelligence in service of the widening and deepening of what is genuinely life-giving.
That is the work. That is the household. That is the inheritance. This is what it looks like all the way down.