The Missing Layer
A reflection on three years of asking questions nobody could answer
It is 2026 and I am still asking questions.
That is probably the most honest thing I can say about where this journey has taken me. I came into this era of AI with curiosity and left every room with more questions than I entered with. And somewhere in that accumulation of unanswered things, a direction formed that I could not have planned and would not have predicted.
I want to try to trace it here. For myself as much as for you.
2022
November. ChatGPT is released to the world. I start at Microsoft the same week.
I remember the feeling of that moment. The room was loud with possibility, headlines moving fast, everyone reaching for language to describe what had just changed. I was quieter about something else. Sitting with a question that felt more urgent than the noise around it.
What happens to the people who never get shown how to make this work for them?
The ones inside organizations that are already moving, already deploying, already assuming adoption, without anyone stopping to ask whether the humans inside those systems actually understood what they were holding.
Productivity was the word everyone was using. I believed it. But I kept thinking about the distance between the promise of that word and the reality of someone sitting in front of a tool they didn’t understand, inside an organization that hadn’t figured out what to do with it yet.
That gap felt important.
I didn’t know then that I’d spend the next three years watching it widen.
2023
A few months into the year, Copilot gets announced for enterprise.
Something shifted in me when I heard it. This wasn’t just a productivity tool anymore. This was infrastructure. The kind that quietly reorganizes how industries operate before anyone has agreed that’s what’s happening. I started thinking less about what the technology could do and more about what it would mean for the people inside organizations who had no framework for it yet.
The sessions started small. A conversation here. A working group there. Then the invitations kept coming. Different screens, different organizations, different countries. Thousands of people over time, inside and outside of Microsoft, all carrying the same version of the same question about what this technology was going to mean for the way they worked.
I kept saying yes.
Something in me recognized that this moment was the kind that doesn’t announce itself clearly until it has already passed. I wasn’t willing to look away from it.
I’d show up on screen and talk about the future of work and watch the chat fill up. The energy was real. The hunger was real. And underneath it, something quieter. People were unsettled by what was happening and nobody had given them permission to say so out loud. I wasn’t standing there with answers. I was standing there willing to ask the questions with them. To experiment live, in public, even when it was uncomfortable. Even when I wasn’t sure where it was going.
There were moments on those virtual stages where I thought, why me. Why was I the one people decided to listen to? I was genuinely surprised that people wanted to show up and hear what I had to say. And that surprise, instead of making me retreat, pushed me further out of my comfort zone. Into more rooms. More conversations. More public thinking about things I was still figuring out in real time.
It felt like a privilege I hadn’t applied for. And I tried to honor it every time I showed up.
2024
Being inside those rooms at that scale gave me a view I wouldn’t have had otherwise.
I was watching AI move from curiosity into daily use across thousands of people and organizations simultaneously. And what I started noticing wasn’t the capability. It was the quality of what the capability was producing at volume.
I started calling it slop in my head before I heard anyone else use the word. The kind of output that looks complete, reads smoothly, and carries just enough confidence to be believed. Wrong citations. Plausible numbers that weren’t accurate. Coherent language wrapped around something that had never been verified by anything.
I kept asking the same question in rooms full of people who were deploying this at scale inside enterprises.
How do you actually measure the quality of the output?
The room would go quiet.
The kind of quiet where everyone exhales slightly because someone finally said the thing they had all been sitting with. Every person in those conversations had felt it. Had probably produced it without realizing. Had definitely consumed it without catching it.
Nobody had an answer.
That quiet started to feel like the most important signal I had encountered. More important than the excitement. More important than the adoption numbers and the capability announcements and the breathless headlines about what was coming next. The silence in those rooms was telling me something that the noise outside them wasn’t.
My search for the answer had a new direction. I just didn’t have language for it yet.
2025
Agents started moving into enterprise workflows. Doing things. Finishing things. Operating with a kind of autonomy that the early conversations hadn’t fully anticipated.
The question that formed wasn’t about capability. It was about what happens when agents proliferate inside an organization without a coherent layer of visibility across what they are actually doing. When they are siloed. When they are building on each other’s outputs without anyone sitting inside the chain to verify whether those outputs were correct before they became the foundation for the next decision.
I watched governance conversations inside enterprises and noticed a pattern. Most of what passed for oversight was a mirror. A reflection of the organization’s own assumptions about what was running. Policies in documents. Monitoring at the exit point. Nothing was sitting at the seams where the actual decisions were being made before anyone saw the result.
The conversation about solving this kept landing in the same place. More agents. Another layer.
An AI watching the AI watching the AI.
That logic didn’t hold for me. If the system is probabilistic by nature, meaning the output is never guaranteed, never fixed, never the same twice, then handing the governance problem to another system with the same fundamental limitation felt like it was compounding the very thing it was trying to solve.
I spent much of that year sitting with that feeling. I had moved into advisory work with startups by then, close enough to see the missing layer forming in real time as more and more agents were being introduced into workflows. I was watching founders build fast and govern later, if at all. The question that had been forming across two years of rooms and quiet moments sharpened into something I could no longer set aside.
There had to be a different kind of answer. One that didn’t guess.
Late October 2025. Someone super smart met with me and started talking about something I hadn’t encountered before. Coherence. Measuring constraints. The idea that you could take any point in a chain, any handoff, any moment before the output exists, and score it mathematically against what correct behavior actually looks like.
And the score doesn’t change.
A probability shifts with context. A model makes its best guess and moves forward. Math is fixed. Offline. Completely independent of the system it sits inside and verifies.
I remember the moment it landed. Everything I had been accumulating across three years suddenly had a shape it hadn’t had before.
Every room that went quiet.
Every governance conversation that ended with a mirror and no proof.
Every uncomfortable feeling about probabilistic systems being asked to govern themselves.
All of it resolved into something that made a different kind of sense.
You measure it. With math. At every seam. Before anything goes wrong.
My brain didn’t slow down for days.
2026
Early in the year, autonomous agents began communicating with each other inside enterprise environments.
What had felt like a future concern was no longer future. The question I had been carrying wasn’t theoretical anymore. It was immediate. And the answer I had found in late October was no longer just intellectually satisfying. It felt necessary in a way that made the previous year of searching feel like exactly the preparation it needed to be.
My search had finally arrived somewhere. I am still not sure I can call it a destination. It feels more like a clearing in the middle of a longer path. Enough visibility to see what was missing. Enough ground beneath me to keep moving forward.
It is 2026 and I am still asking questions.
The journey has been uneven in the way that real things usually are. You don’t find a missing layer by walking a straight path. There was no clear line from that first week at Microsoft to where I am standing now. There were stages I didn’t expect to be on. Conversations that went quiet in ways that told me more than the loud ones. A year of searching for something I could feel the shape of before I had language for it.
And through all of it, you were there.
The people who showed up to the sessions. The ones who followed publicly and shared what I was thinking, even when I was still thinking it out loud. The readers who found their way here and decided to stay. You gave me something I didn’t fully understand the weight of until I sat down to write this.
The courage to keep going in public when the path wasn’t clear.
To the readers who have supported this newsletter financially, quietly and anonymously, I want you to know what that has meant. You didn’t have to. It is optional and you chose it anyway. That choice told me that what I was exploring had value beyond my own curiosity. It encouraged me to think further, to stay on the search longer, to keep asking the questions that the rooms kept going quiet around.
I don’t have all the answers. I am not sure I am supposed to.
What I have is the willingness to keep walking toward the questions. To stay visible even when it is uncomfortable. To experiment in public even when I am not sure where it leads.
And a deep, genuine gratitude for everyone who has decided that journey is worth following.
There is more coming. The questions are getting sharper. The missing layer is getting clearer.
I am glad you are here for it.
Appreciating you,
Yen