A minimal digital timer showing 08:00, the length of the interview

Justin Bartak · AI Transformation · May 7, 2026 · 13 min read

The CTO Who Flew Me Out Couldn’t Spare Eight Minutes

One bad interview is data. The pattern behind it is the story.

TL;DR

After 20 years and 5+ C-suite roles, this was the worst leadership interaction I have ever seen. A CTO flew me in for a senior role leading an AI native transformation and could not spare eight minutes. The pattern is bigger than one company: mid-tier vendors claim to be AI native while building the opposite. A team is a reflection of its leader.

I’ve never called out a company in writing in my life. I never thought I would.

After twenty years, 5+ C-suite roles, and founding multiple companies, I’ve sat across from plenty of people who didn’t know what they were talking about. I’ve always kept it to myself.

Bad experiences happen. Mismatches happen. You move on.

This one I can’t move on from.

What happened this week at a WeWork office in Lehi, outside Salt Lake City, was so far outside the boundaries of basic professional respect that it had to be said. Not for me. For the next senior leader and builder who gets pulled into the same pattern, prepares for days, flies out at the company’s expense, and gets treated like garbage by people who couldn’t articulate AI native if they tried.

I’m not naming them. The story is bigger.

The pattern

Companies all over the market are running the same play right now. They have legacy software and a long-tenured team. They’ve heard “AI native” enough times to want it on a slide. So they recruit a senior AI leader and builder, fly him in, and try to figure out what to do with him.

Most of them don’t actually know what AI native means. AI native isn’t a chatbot in the sidebar or a co-pilot in the settings page. AI native means the platform was built from the ground up with AI as the core. Architecture, data model, workflows, agents, testing, failure handling, all of it designed around the assumption that AI is doing real work in real time inside the system. Most of the companies I’ve talked to in the last six months who say they’re AI native are bolting chatbots and features onto legacy software and calling it AI transformation. That’s not transformation. That’s marketing theater for users, the board, and prospective investors.

And almost without exception, these companies show up arrogant. Not Apple-arrogant or Stripe-arrogant or Tesla-arrogant. Those companies have earned a swagger. I’m talking about mid-tier B2B vendors who happen to have a Fortune 500 logo or two on their site, walking around like they’re untouchable. The disconnect between how they actually operate and how they carry themselves is staggering.

This week was the worst version of that I’ve ever seen.

What real leadership looks like

The contrast is the point. I could name many. These four set the bar.

Levi Morehouse

CEO, Taxa · President, Aiwyn

“What are you seeing? Let’s look at this together.” The default response when I told him something he didn’t expect. Curious.

Chris Furlong

CTO, Taxa / Aiwyn

Didn’t want yes-men. Wanted the harsh reality, then a calm conversation. Treated friction as fuel. Listened.

Mike Kaeding

CEO, Norhart

Day one, handed me a credit card and the keys to every building. Hire the right people, then give them the keys to the castle. Trust.

Steve Jobs

Apple

My partner and I pushed back on him. He paused. Thought. Said “I can fix this,” and gave us the dev team for a week. Action.

The common thread is trust and curiosity. The most demanding people I’ve worked with have also been the most curious. That isn’t a coincidence. It’s the thing.

All four hired people smarter than them and got out of the way. The CTO this week couldn’t even sit through someone trying to share what they knew. That’s the difference.

When you build your identity around the work instead of the title, smarter people become a resource, not a threat.

That is what leadership looks like.

What this week looked like

A recruiter reached out cold. No screening call, no vetting conversation. Just a calendar invite to a 30-minute call with a principal engineer. The first five minutes of that call, the principal engineer was fumbling because he couldn’t find my resume. Already past whatever screen most companies run, and the person they’d handed me to wasn’t even prepared.

The 30 minutes wasn’t really an interview. It was me teaching him.

The moment I mentioned I’d embedded AI native agents into my app, his posture changed. “How did you do that?” The rest of the call was me walking him through how I architected the agentic layer, how I structured the testing, my 31 dimensional tests, my Claude.md files, my 9,000 audits, the safeguards that keep the codebase clean as it grows, and my prompting techniques for getting solid results every time.

Then he asked, “How do you deal with bugs?” I’d just told him. So I told him again, and added that I work in Next.js because its training corpus is significantly larger. He said they were on Angular. I told him they were going to have issues; I’d seen it at other companies. The corpus is smaller and the model has less to reason against. He pushed back and said he hadn’t seen any issues with Angular.

Sit with that for a second. The question he’d just asked, immediately before pushing back on Angular, was how do you deal with bugs. He’s asking the AI native expert who has done this how to fix the bugs his team is hitting, then in the same conversation telling me Angular isn’t the problem. Either the bugs are fine and he doesn’t need help, or the bugs are real and he should listen. They can’t both be true.

The call ended abruptly. No next steps. No wrap-up. Just “I gotta go.”

After that, the company recruiter followed up. I liked her. Smart. Knew what was going on. The type of person I would hire. She asked if I got my questions answered. I said no. I didn’t know anything about the role or the company. He just wanted to talk about what I’d done in AI.

In fifteen minutes she gave me more useful information about the role, the team, and the structure than the principal engineer or the CTO would over the next several days combined. She told me it was a consortium of four to five AI experts reporting to the CEO. Heads of AI. AI transformation leads. Senior peers. A real seat at a real table. That was the deal I agreed to. That’s why I got on the plane.

A few days later, I was on a plane.

The assessment

I showed up at the WeWork 15 minutes early. Nobody came to get me. The front desk had no idea who was who. Someone finally came out five minutes after the start time and walked me back to a small office with two other engineers.

They started with a whiteboard exercise. I asked if we really needed the whiteboard. I could just explain how I’d build it. They said okay. I walked them through my approach in maybe ten minutes. They said okay again and dropped it. No follow-up. No probing. No challenge. They moved on.

The next exercise was to fix an issue with Claude Code. A single page with an upload button. Files with eight columns worked. Files with more columns threw an error.

I asked whether this was a bug they were actively trying to fix or one they couldn’t figure out. They were evasive.

Either way, the exercise was simple. With the environment ready and the input file in hand, I could have fixed it in a fraction of the time we sat there. With my own laptop, I could have built the whole thing from scratch in Next.js, with their bug fixed, in less time than they spent fighting their own setup.
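To show the scale of the task, here is a minimal sketch in TypeScript of the shape such a bug usually takes and the one-function fix. This is a hypothetical reconstruction, not their code: the `EXPECTED_COLUMNS` constant, the `parseCsv` name, and the assumption that the parser hard-coded eight columns are all mine.

```typescript
// Hypothetical reconstruction of the failure mode described: an upload
// parser that assumes exactly eight columns and errors on wider files.
const EXPECTED_COLUMNS = 8;

// Tolerant version: accept extra columns, reject only rows that are too
// short to contain the known fields. (A production parser would use a real
// CSV library to handle quoted commas; naive split is enough for a sketch.)
function parseCsv(text: string): string[][] {
  return text
    .trim()
    .split("\n")
    .map((line, i) => {
      const cells = line.split(",").map((c) => c.trim());
      if (cells.length < EXPECTED_COLUMNS) {
        throw new Error(
          `Row ${i + 1}: expected at least ${EXPECTED_COLUMNS} columns, got ${cells.length}`
        );
      }
      // Keep the eight known columns; ignore extras instead of throwing.
      return cells.slice(0, EXPECTED_COLUMNS);
    });
}
```

Under those assumptions, a nine-column row parses to its first eight cells instead of throwing, which is the whole exercise.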

But it was their code, their setup, their environment. They could not get it running, and they had to get it running for me to prompt it at all.

It went like this the entire session. I’d start to prompt. “Oh, we forgot this.” Patch. I’d prompt again. “Oh, it needs Node.” Patch. “Oh, the API key isn’t logged in here.” Patch. “Oh, we need NPM.” Patch. The cable wouldn’t reach. It went on and on.

They never got it ready. I never ran the demo. I never got the input file.

Halfway through one of my prompts, while they were still fixing the environment in front of me, I asked Claude on the screen they were watching whether the bug was even real or whether I was being gaslit. Their demo. Their code. They couldn’t get it running. Claude found 13 problems. I told it to fix all 13. I pointed out the gaslight comment to the room. They laughed nervously.

This was not an AI team. This was a team of legacy software engineers who got their hands on Claude Code a few months ago. Their working knowledge looked weeks deep (maybe days), not years.

There’s nothing wrong with being early. A lot of teams are there right now. The problem is they were sold to me as a consortium of AI experts. As the AI core of an organization claiming an AI native transformation. None of that was true.

While I waited between prompts for them to finish patching their environment, I started interviewing them. Are you running agents? Self-healing? Auto-refactoring? Dimensional tests on checkout flows? Personas? Evals? Running your own LLM? What’s your AI architecture? Have you tried React or other frameworks and benchmarked the token count and velocity differences?

The answers were always the same three flavors.

“How did you do that?”

“We haven’t done that yet.”

“We’re thinking about that.”

I heard those three responses, in some combination, to almost every question I asked.

In my head I wasn’t sure they had built anything in AI at all. They couldn’t even get their own demo running.

None of them pushed back on anything I said about AI architecture. None of them disagreed. None of them offered a counter-perspective. Real builders disagree. Real builders challenge. Real builders want to dig in. The room was full of nodders.

One exception. There was a third engineer in the room who actually listened, asked real questions about how I did things, and tracked the patterns I was describing. He was the only person in that office I’d consider working with again. Not because he had the answers. He didn’t. His answers were the same three: “we haven’t done that,” “we’re thinking about that,” “how did you do that?” But he was visibly trying to learn the right ones. That’s the trait you can build a real AI team around. The other two and the CTO didn’t have it.

I offered to walk them through the 368,000 lines of production code I’d shipped on this exact pattern. They didn’t want to look. I told them I’d written a research paper on the framework choice and published a companion piece on my blog, with data, no opinion. They didn’t want to read either. I’d built a sample app in Angular just to verify my thesis was current. They didn’t want to see it.

They wanted me to fix their broken upload demo, which they couldn’t run and couldn’t fix. As the CTO walked in to end the session, one of the engineers said, “we know what we need to know already from what you did.”

I hadn’t done anything. Just asked them a ton of questions while they tried to get their own code to work. Same responses again: “we haven’t done that,” “how did you do that,” “we’re thinking about that,” “looking into that.”

They debugged their own environment in front of me for an hour and never got it fully working. Wrong environment. Wrong keys. The list kept going. They never asked me a substantive question about my background. Nothing.

In my head I was thinking: why did they fly me out? We could have done this over Zoom. Or not done this at all. What were they actually searching for? A free consult at the cost of a plane ticket?

If that were me running an interview for a senior AI candidate I’d flown across the country, I’d be deeply embarrassed.

Then the CTO walked in

There’s something I’ve learned in twenty years of building teams. A team is a reflection of its leader. The team I’d just spent an hour with, the team that couldn’t run its own assessment, didn’t know the patterns it was supposed to evaluate, and treated a senior candidate as a tutorial, was the team this man had built.

Mid-test. Their environment still half-working. Me mid-prompt. The CTO walked in and announced he had ten minutes and we needed to talk now.

I pushed back, professionally. I haven’t gotten to ask any questions. I don’t know what the product is. I don’t know the scope of the role. I don’t know the values of the company. I don’t even know the title or the comp band. I didn’t get to finish what your team asked me to do. They’re still getting it set up and trying to get it to work. Can we actually have a conversation?

He brushed all of it off. “I have ten minutes. Come down the hall. Grab your stuff, you’re done here. You won’t be coming back to this office.”

Then he turned and walked off without me.

Didn’t wait. Didn’t slow down. Didn’t gesture to a room. Just left. Like the two minutes it would take me to grab my bag was time he didn’t have to spare. By the time I made it down the hall, he was already settled in the room.

Eight minutes

He gave me eight minutes. In reality it was less.

He asked what I thought. So I told him. Honestly. The way you’d want someone with my background to tell you, if you actually wanted to learn.

I told him their stack was going to be a problem if they were serious about going AI native, but we could make it work. I explained why, with data. The corpus is significantly smaller, which means more tokens per task, slower velocity, and a category of bugs the models can’t reason about cleanly. Their team’s inability to debug their own demo was a small live example of exactly that. (I kept that one to myself.) I told him I’d written a paper on the comparison and a companion blog post, no opinion, just data. I told him I’d built a sample app to verify the thesis before bringing it up. I told him if they wanted AI native, I could help. If they wanted bolt-on AI, fine, but here’s what that costs long-term.

I was polite. I wasn’t pushing. I was offering exactly the kind of strategic input you’d hope to get from someone you’d flown in.

He scowled. He said something to the effect of “who do you think you are to say that to me,” and “how would you like it if I said that to you,” in the snarkiest tone he could manage. I responded with something like, I would love to hear how we could do things better. I’d want to dig in and ask a ton of questions.

Remember: this CTO had just met me. He didn’t know what I’d built, what I knew, or anything about me. He didn’t know what questions I asked his team. He didn’t know what went on in that room before he walked in. He hadn’t asked.

Then he said “nice to meet you, Justin,” stood up, and walked me out through two locked doors. On the walk out, he asked me where I was from. I said the Minneapolis area. He told me he liked my Oakley bag. Then he shut the door behind me.

He had time for the parts that didn’t matter and no time for the parts that did. That tells you everything about what he actually values, and what he doesn’t.

The only questions he asked me were about my hometown and my Oakley bag, not about the AI transformation he was supposedly leading. The polite reflex of someone closing a door he’d already decided to close before I sat down.

The water bottle

I’d been rushed so hard at the end, the CTO telling me to get up and grab my stuff right then, that I left my water bottle, my jacket, and my glasses behind in the small office.

One of the engineers called me as I was downstairs waiting for an Uber. He told me I’d left some things. I said I was still in the building, I’d come back up. When I went up two minutes later, I expected to see one of them. Nope. They just left my stuff on the counter.

Nobody came down to hand them to me. They left my belongings at a desk like I was a forgotten Amazon return. That was the temperature of the entire organization at that moment. Don’t come back up here. We’re done with you.

What this actually was

This wasn’t a bad interview. This was a fundamentally broken organization, top to bottom. The recruiter sold me a real role. What showed up wasn’t it. The team couldn’t run their own assessment. The CTO interrupted his own team’s process to deliver an eight-minute performance about how busy he was. The question I keep coming back to: do the CEO, the Chief People Officer, or even the board know this is how their organization treats senior candidates?

What I’d say to the CEO and the Chief People Officer

Your CTO interrupted his own team’s interview to deliver a power play. Your “consortium of AI experts” looked weeks deep (maybe days) on Claude Code, not years. Your principal engineer couldn’t find the candidate’s resume on the screening call and couldn’t get the team’s own assessment to run on the interview day. Your team didn’t ask the senior candidate a single substantive question about his background. Your team spent the session debugging their own demo while the candidate answered their questions about his work. Your CTO scowled at honest stack feedback and walked the candidate out through two locked doors. Your team left his belongings at a front desk so they wouldn’t have to see him again.

Each of those is a data point. Together they’re a verdict on the technical leadership culture of this organization, and the verdict is not flattering.

If you’re serious about an AI native transformation, this is not the team that will get you there, and this is not the technical leadership that will attract anyone who can. You don’t need a better recruiter. You don’t need another ten candidates flown in. It might be time for a leadership change. Leadership comes from the top down, and the top is what I sat across from for eight minutes.

Staying the course with a culture where senior candidates are dismissed in eight-minute audiences after being flown in at company expense is not a viable path. The team will keep underperforming. The good candidates will keep walking. And the gap between what you’re claiming to be and what you actually are will keep growing until something breaks.

What I experienced was a toxic and incompetent environment. I’ve seen pockets of this before in my career. It can destroy a company. Remember: this is the team that, in two weeks, is going to lead the company’s AI-first initiative and build out their AI strategy. Let that sit for a moment.

I’m one person. This is one bad interview. But every bad interview is data, and every dismissed senior candidate is a signal you’re missing. If even a fraction of the candidates this year felt the way I felt, you have a problem that no amount of brand work or recruiter polish is going to solve. You have a technical leadership problem. And those don’t fix themselves.

The recruiter told me they needed to move fast. They’d been through a bunch of candidates who weren’t real AI builders. They wanted someone who’d been a founder, done zero-to-one, actually shipped. That was me. That’s my whole career.

I wonder about all those other candidates. Were they like me? Did they get the eight minutes?

Another perspective


What Larry Ellison Would Say

Orbyt Career Guides

“A CTO who can’t sit with a senior candidate for an hour is not a CTO. That’s a junior engineer with a title. The board should be embarrassed that the most expensive hire in the building can’t evaluate talent that flew in with 368K lines of code, 9,424 tests, and a research paper in its hand.”
“An Angular front end and an AI native pitch is a contradiction in engineering. The candidate said it once, calmly, and the CTO took it personally. The technically correct response is ‘tell me more.’ The technically wrong response is ‘who do you think you are.’ The company hired the wrong response.”
“The candidate did the company a favor by leaving. Now the candidate needs to do himself a favor by not joining a company where the talent ceiling is the CTO. The ceiling does not move.”

Voiced by Claude in the methodology of Larry Ellison’s public talks and Oracle earnings commentary. Not a transcribed quote.

Read the full piece on Orbyt Career Guides

What Steve Jobs Would Say

Orbyt Career Guides

“You don’t hire smart people to tell them what to do. You hire smart people so they can tell you what to do. The CTO in this room hired four engineers who can’t install dependencies on the day a senior candidate flies in. He hired exactly the team he could lead. He’ll keep hiring exactly that team until someone above him decides the team is the bottleneck.”
“The eight minutes is a kindness, by the way. He told you the truth. He showed you the team. He showed you what he values. You spent a flight learning what would have taken six months on the job. That’s a bargain.”
“The lesson is simple. Find leaders who get smaller egos when they meet sharper people, not bigger ones. The rest is detail.”

Voiced by Claude in the methodology of Steve Jobs’ public talks and the Isaacson biography. Not a transcribed quote.


What Elon Musk Would Say

Orbyt Career Guides

“If a CTO can’t spare twenty minutes after flying a senior person across the country, the CTO is the bottleneck. That’s the whole problem and the whole solution. Either he’s out, or the company is out, or you’re out. Pick one fast. Don’t do the slow thing.”
“AI native isn’t a slide. It’s an architecture. If the team can’t name three agents in production, three failure modes they’ve hit, and three things they had to rewrite from scratch, the company isn’t AI native. It’s using AI. There’s a difference and it’s not subtle.”
“Move on. The next interview is the only one that matters. Send the email. Don’t litigate the room. Litigation is a tax on momentum.”

Voiced by Claude in the methodology of Elon Musk’s public engineering talks and earnings calls. Not a transcribed quote.


What Sam Altman Would Say

Orbyt Career Guides

“Talent density is the only moat that compounds. A CTO who treats a senior candidate like an inconvenience has the wrong instinct about talent and probably about strategy. The best companies I’ve seen send the founder to coffee on a Saturday for someone they want. The eight minutes is a comp signal in disguise.”
“If a candidate hands you three hundred thousand lines of code and a research paper and you don’t look at any of it, you’ve disqualified yourself, not them. Curiosity is the trait. The absence of it is the disqualifier.”
“Take the data and run. The cost of one bad afternoon is one bad afternoon. The cost of joining a team that runs the way that interview ran is two years. The asymmetry is obvious. Move.”

Voiced by Claude in the methodology of Sam Altman’s public talks and Y Combinator essays. Not a transcribed quote.



Justin Bartak

4x founder and VP of AI. $383M+ in enterprise value delivered across regulated fintech, tax, proptech, and CRM platforms. Recognized by Apple. Built Orbyt solo in 32 days with Claude Code. Founder of Purecraft.