When AI Molts Your Identity
The hardest thing AI is replacing is not code. It is your identity.
When I wrote The Last Year of Localhost, someone left a comment on Hacker News that captures something important about the current moment in software engineering:
"The mass resistance to cloud dev environments over the past 5 years was never really about latency or performance. It was identity. Developers treat their local setup the way car guys treat their garage builds. It is theirs, they tuned it, and suggesting a managed alternative feels like an insult to their craft."
That same identity-defending reflex is now showing up around AI, only this time it is your entire craft, not a workflow preference. It is at the core of how people have learned to measure their own value.
This is an essay about that transition: what it costs to defend your identity, and what opens up when you stop.
What I had to let go of
I went through something similar to what many engineers are experiencing now, but in the opposite direction.
Years ago, I built part of my identity around earning respect from engineers. I know what it feels like when the ground under that identity starts to move.
I joined Sven as a co-founder when Gitpod was still a project, not yet a real company. I became CEO before I was thirty, and underneath that title was a quiet belief that I did not fully belong.
Most of the devtool founders I admired were deeply technical. Guillermo at Vercel had been shaping the ecosystem since he was a teenager. I studied computer science and I am technically astute, but I never worked as a software engineer. I wrote my first lines of code in my early twenties. I came from finance. In developer tooling, that background can make you look like a tourist.
So I did what ambitious tourists do. I learned the language and adopted the customs. I worked hard to understand the craft. The learning itself was real, but the motivation was off. On some level, I was learning so I would not be exposed.
The feeling underneath was simple, and old, and common enough that most ambitious people will recognize it: I am not good enough.
That feeling can be powerful fuel. Many high-powered people run on it. Sequoia famously looks for a chip on the shoulder. The chip works. It drives intensity. But it is also a cage. If you are always proving, you are never arriving. You can run for a long time on that engine before you notice it is consuming everything.
Joe Hudson, my coach and the person who has influenced my thinking more than anyone, introduced me to something he calls the Golden Algorithm: the emotion you are trying to avoid is often invited into your life by the very strategy you use to escape it.
I was trying to avoid feeling not technical enough. So I performed being technical. The performance made me tighter, more guarded, less in wonder. People could feel it, the way you can always feel when someone is performing, even when the performance is competent. They trusted my judgment less. Which confirmed my original feeling. The algorithm looping perfectly.
When I stopped running from the feeling and let myself experience it fully, something shifted. Not having worked as an engineer became an advantage. My identity was not tied to specific tools or languages. I did not have sacred cows. When AI began to change the rules of our industry, there was no internal position to defend.
Industries are often reshaped by people who were not fully formed inside them. To see what could be different, you need some distance from how things have always been done. You need enough naivety to ask why the rules exist in the first place.
There is a paradox here. The moment I stopped trying to be seen as technical, people began to experience me as more technical. I was asking real questions. I was listening. I was thinking clearly instead of scanning for status threats. When you stop avoiding the emotion, the problem tends to unwind on its own. The Golden Algorithm works in reverse, too.
That personal shift made a bigger shift possible. For five years, we built Gitpod around a clear identity: the cloud development environment company. When AI changed the landscape, we refounded as Ona, a team of AI software engineers you can rely on. That organizational molting only worked because I had already done a version of it myself.
I do not think companies can change their identity faster than the people leading them. The internal shift usually has to happen first. Once you loosen your grip personally, it becomes easier to let the organization evolve too.
But the pattern is the same. When you stop defending who you think you are, you create room for what you might become.
The identity trap in mastery
One of my favorite essays on mastery is The Playing Field by Graham Duncan. It is really about identity.
Duncan argues that most people plateau at the Expert level. They are very good at the game they were taught to play. They know the rules and execute well. The plateau does not happen because they lack talent. It happens because they fuse their identity to a specific way of operating, and once that fusion happens, movement gets harder.
When you define yourself as an expert in a specific domain or technique, you narrow your field of view. You become less able to see change in the very area where you are strongest. In engineering, this can look like turning your tools and style into a personality. Your vim config becomes sacred. Your code becomes proof that you matter. Pride of authorship quietly becomes a constraint.
Duncan noticed something else that is counterintuitive. The people who felt like impostors, especially next to more analytical or visibly technical peers, often became the strongest overall performers. Their advantage was not deep specialization. It was filtering, pattern recognition, synthesis. The ones who felt least like experts were sometimes closest to the next level. The insecurity created openness. They were not defending a fixed identity, so they could adapt.
Above the Expert is the Professional. The Professional stops imitating and develops a personal style. Many high-performing engineers live here. They are productive, original, and respected.
But mastery requires something different. The sociologist Daniel Chambliss, studying Olympic swimmers, called it a qualitative jump. Not doing the same thing with more intensity, but changing the environment entirely. Moving into a setting where the standards are different, the habits are different, and the definition of good has shifted. AI can be that new environment for software engineers, if they choose to treat it that way.
At the level of Mastery, the relationship to identity changes again. Duncan describes a Professional in conversation with a Master who held a competing view. The Professional treated the exchange as something to win. The Master treated it as something to learn from. The Professional missed the signal because he was busy defending his position. When your grip is tight, nothing new gets in.
At the highest level, a single outcome does not threaten who you are. The work becomes feedback on your growth, not a verdict on your worth.
The master changes the game because the game is an expression of them. The expert defends the game because the game is them. Same intelligence, same talent, but a different relationship to identity.
James P. Carse framed this as finite and infinite games. The finite player needs stable rules and clear winners. The infinite player lets the rules evolve so the game can continue. If your identity depends on the current rules, change feels like destruction. If your identity is lighter, change feels like the game unfolding.
AI is not ending software engineering. It is changing the rules again. Whether that feels like loss or like expansion depends less on the technology and more on how tightly you are holding on to who you think you are.
What you are actually defending
Joe defines identity as the ideas and emotional states, even the gut reactions, that we experience as who we are. A simple test is this: the more personally you take something, the more it is identity rather than preference.
I have heard engineers say very directly, "I just do not want to use AI. I just want to write code." I respect the clarity in that. But it is worth asking a harder question. Who are you if you are not writing code? If that question feels constricting, that is identity speaking. A preference feels light.
The goal is not to erase identity. You cannot function without one. The goal is to see it clearly enough that it becomes lighter. More transparent. When you step out of one identity, it feels like stepping into the unknown. But something else forms. It always does.
Given recent events, a lobster is a useful analogy. Its shell does not grow with it. To grow, it has to crack the old shell open and sit exposed until a new one hardens. That process repeats over and over. Growth is not additive, it is molting. And occasionally, the lobster gets acquihired.
The hard part is that identity does not just block you from seeing what is changing around you. It blocks you from seeing yourself. You cannot honestly ask, "What is actually going on here?" if the truthful answer might dissolve the version of you asking the question. The engineer tightly wrapped in their beliefs is not just misreading the market. They are unable to see their own blind spots. And you can only correct a bias you can perceive.
Right now, one possible new shell for a strong software engineer is becoming an orchestrator of AI systems. Someone who designs systems of agents, not just functions. What initially feels like loss can become growth. But only if you are willing to molt.
And yes, keep your API keys safe.
The cost
You miss signal.
When your identity is wrapped up in a specific role or skill, it quietly filters what you see. Identity does not just shape how you interpret data. It shapes which data even reaches you. And if important information never makes it into the room, the quality of your decisions drops fast.
You also miss the chance to become more human.
The first things AI commoditizes are the most legible, mechanical parts of knowledge work. Raw technical problem-solving is incredibly valuable, but it is also the easiest to scale with machines. What remains scarce are things like judgment, taste, context, independent thinking, real curiosity, and the ability to connect with other people. Ironically, those are the exact muscles that weaken when you focus on protecting your current identity instead of expanding it.
This shows up in relationships too. People can tell when you are defending a position instead of actually listening. One builds trust. The other builds distance.
You lose your sense of play.
If your skills are your identity, then any change feels existential. Curiosity turns into defensiveness. Flow turns into vigilance. You stop experimenting. You stop tinkering just to see what might happen. And the people who suffer most are often the ones who once loved the craft the deepest. What started as play becomes something to protect.
And yes, many jobs will disappear.
There will be more systems to build than ever. The demand for software and automation is enormous. But the ratio of people to output is changing. Teams that once needed twenty engineers might need three. The remaining demand splits into two groups. People who can fluently orchestrate AI systems. And people with deep architectural and procedural judgment that models cannot replicate.
The engineers most at risk are the ones who tied their identity tightly to a specific bundle of skills that just became widely available.
That is the cost.
The more interesting question is what you become when you stop defending who you were.
The freedom on the other side
When I stopped defending the technical founder identity, something unexpected happened.
What showed up was not exactly technical confidence. It was more like clarity. Not the feeling of discovering something new, but the feeling of a layer being removed. Like there had been a film over everything for years and it finally lifted.
I have never lacked energy. But I had been routing a surprising amount of it into performing, signaling, subtly proving I belonged. That energy became available for curiosity, for building, for being present in conversations. There is something deeply freeing about no longer needing the world to confirm who you are.
A lot of people assume that letting go of a professional identity makes you less effective. I have found the opposite. You get faster. You get sharper. You get more yourself. The quality of your work improves because you are no longer optimizing for self-confirmation. You are optimizing for solving the actual problem.
Your relationships change too. To be genuinely useful to someone, you have to understand them fully. You cannot do that while defending a position. People can tell the difference between someone who is performing and someone who is real. When the performance drops, trust forms quickly.
There is a framing Joe shared that I keep coming back to. Inner work and business results are not in tension. They are the same thing. Every PR review where you notice yourself getting defensive. Every standup where you feel triggered. Every conversation where you choose to listen instead of prove. That is the work.
Five years ago I would not have believed that. Now I have built our company's operating principles around it. I want every person at Ona to become more themselves. More empowered, more alive, more joyful. It turns out that is also how you build a great company.
Most identity transitions used to take decades. AI is compressing that timeline into months. It is removing the parts of the job that were never really about you. And the question of who you are without them shows up fast.
Counterintuitively, speed helps. When change is slow, you have time to rationalize. To slightly update the story. To defend a softer version of the same attachment. When the ground is moving quickly, you do not get that option. You experiment because you have to. You evolve because the alternative is irrelevance. The pace of this moment does not allow for slow identity defense.
So you let go.
And you discover the game was never zero sum. There is more space than you thought. More interesting problems. More versions of you than any single label could hold.
Mastery is not collecting fixed skills. It is staying flexible as the game changes. It is realizing that what you were protecting was never the source of your value.
The value was always underneath.