AI Didn’t Fix Anything. It Amplified Everything.
A few months ago, an engineering leader told me:
“We rolled out AI tools to the whole team. Productivity is up. But delivery…isn’t.”
That sentence captures where a lot of teams are right now.
The rollout usually looks the same: excitement, some pushback, experimentation, a few early wins.
Then reality sets in: merge frequency goes up. PRs get bigger. Review time increases. Incidents tick up slightly. Late-night and weekend commits become more common. Nothing catastrophic. Just friction.
Here’s what makes it disorienting: the metrics that were supposed to confirm success - adoption rates, lines of code, velocity - all look pretty good. But something feels off. Delivery isn’t improving the way leadership expected.
And that’s when the uncomfortable realization hits: AI didn’t “fix” anything. It amplified everything.

AI Is Leverage, Not Magic
There’s a quiet assumption floating around that AI tools automatically improve engineering productivity.
They don’t.
AI is leverage. And leverage multiplies whatever system you already have.
Strong engineering systems - clear architecture, disciplined reviews, good testing, fast feedback loops - tend to get stronger. Weak systems don’t get fixed. They get stressed.
That’s why some teams see measurable gains, while others see noise. The difference isn’t the tool. It’s the fundamentals.
Fundamental #1: Codebase Health
AI coding tools are excellent at generating code. They are not good at understanding the long-term intent of a system.
If your codebase is poorly documented, tightly coupled, or inconsistent, AI will confidently add more of the same - and faster.
One of the earliest signals is PR size. AI-generated code tends to be verbose. Larger PRs slow reviews and increase defects.
Leadership action: Set a clear expectation for small, reviewable PRs.
Not as a hard rule, but as a norm. What leaders accept becomes the standard.
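If you want visibility into that norm without turning it into a gate, a small report can help. Here’s a minimal sketch in Python, assuming the GitHub REST API and a token in GITHUB_TOKEN; the owner/repo names and the 400-line threshold are hypothetical placeholders, not a recommendation:

```python
# A minimal sketch: report recent merged PR sizes so the team can see
# the norm. "your-org"/"your-repo" and SOFT_LIMIT are placeholders.
import os
import requests

OWNER, REPO = "your-org", "your-repo"
SOFT_LIMIT = 400  # assumed norm for total changed lines
headers = {"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}

# List recently closed PRs (the list endpoint omits size fields)...
prs = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}/pulls",
    params={"state": "closed", "per_page": 30},
    headers=headers,
).json()

for pr in prs:
    if not pr.get("merged_at"):
        continue  # skip PRs that were closed without merging
    # ...then fetch each PR individually, which includes additions/deletions.
    detail = requests.get(pr["url"], headers=headers).json()
    size = detail["additions"] + detail["deletions"]
    flag = "  <-- over the norm" if size > SOFT_LIMIT else ""
    print(f"#{pr['number']}: {size} changed lines{flag}")
```

Sharing a report like this weekly keeps it a norm. Wiring the same check into CI as a hard failure would contradict the point above.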

Fundamental #2: Developer Experience
Some teams think AI will fix developer friction. In reality, it exposes it.
If your CI takes 20 minutes, AI doesn't help much.
If onboarding takes weeks, AI doesn't solve that.
If workflows are brittle, AI just adds more volume to the bottleneck.
Teams with fast pipelines and clear workflows see AI amplify momentum.
Leadership action: Improve pipeline feedback speed before expanding AI tooling. Speeding up CI/CD often unlocks more real productivity than adding another AI tool.
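If you don’t know how slow the pipeline actually is, measure it before you invest. A minimal sketch, assuming GitHub Actions and its REST API; the owner/repo names are hypothetical placeholders:

```python
# A minimal sketch: compute a rough median pipeline duration from
# recent completed workflow runs.
import os
from datetime import datetime

import requests

OWNER, REPO = "your-org", "your-repo"  # placeholders
headers = {"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}

runs = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}/actions/runs",
    params={"status": "completed", "per_page": 50},
    headers=headers,
).json()["workflow_runs"]

def minutes(run):
    # run_started_at -> updated_at approximates wall-clock pipeline time
    start = datetime.fromisoformat(run["run_started_at"].replace("Z", "+00:00"))
    end = datetime.fromisoformat(run["updated_at"].replace("Z", "+00:00"))
    return (end - start).total_seconds() / 60

durations = sorted(minutes(r) for r in runs)
median = durations[len(durations) // 2]
print(f"Median pipeline time over last {len(durations)} runs: {median:.1f} min")
```

A number like “22 minutes at the median” is a far better conversation starter with leadership than “CI feels slow.”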
Fundamental #3: Engineering Culture
When AI enters a team, culture determines how it’s used.
Do you have strong code review norms?
Are large PRs pushed back on?
Are engineers comfortable questioning AI output?
On one team, leaders made it clear that they wouldn’t review massive AI-generated PRs. Engineers adapted. They used AI to write tighter, cleaner changes.
On another team, large AI-generated PRs were accepted without much pushback. Review fatigue increased. Bugs crept in slowly over time.
Same tools. Different outcomes.
Leadership action: Set the expectation that engineers own their AI output, fully. Whatever ships is theirs. Engineers who use AI well move faster and ship cleaner work. That’s the bar.

Fundamental #4: Measurement Discipline
If you’re not measuring your system before AI, you won't understand it after.
Adoption metrics are relatively easy to find. Velocity metrics feel exciting. But pair them with quality metrics to get the complete picture. (I covered the exact metrics to track and how to use them in a previous issue.)
You don't want:
Faster code + higher change failure rate
More merges + longer MTTR
Higher throughput + rising burnout
Leadership action: Pick one metric per tier and track it consistently for 30 days:
Adoption → active weekly usage
Velocity → merge frequency or cycle time
Quality → MTTR or change failure rate
Clarity shows up faster than you expect.
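To make the 30-day tracking concrete, here’s a minimal sketch in Python. It assumes you can export merges as simple records with a date, an author, and an incident flag; the data and field names are hypothetical:

```python
# A minimal sketch: one metric per tier from a simple 30-day export.
from datetime import date

# Illustrative export: (merge_date, author, caused_incident)
merges = [
    (date(2024, 5, 1), "ana", False),
    (date(2024, 5, 2), "ben", True),
    (date(2024, 5, 6), "ana", False),
    (date(2024, 5, 9), "chi", False),
    # ...rest of the 30-day window
]

days = (max(m[0] for m in merges) - min(m[0] for m in merges)).days + 1

# Adoption: distinct contributors in the window (rough proxy for active usage)
active_contributors = len({author for _, author, _ in merges})
# Velocity: merges per week
merge_frequency = len(merges) / (days / 7)
# Quality: share of merges that caused an incident
change_failure_rate = sum(bad for *_, bad in merges) / len(merges)

print(f"Active contributors: {active_contributors}")
print(f"Merge frequency: {merge_frequency:.1f} merges/week")
print(f"Change failure rate: {change_failure_rate:.0%}")
```

The math is deliberately simple. The discipline is in collecting the same three numbers every week, not in the sophistication of the calculation.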
Fundamental #5: Product Clarity
AI accelerates building. But if product direction is fuzzy, priorities shift weekly, or engineers lack context, you just build the wrong thing faster - and AI makes that more expensive, not less.
High-performing teams pair AI with clear success criteria: a defined outcome, an owner, and a way to measure it. Without that, speed becomes noise.
With that clarity, AI becomes an execution accelerator.
Leadership action: Ensure every project has a clear definition of done. Not just tickets - outcomes. What success looks like. Who owns it. How it’s measured.
The Pattern That Keeps Repeating
The teams benefiting most from AI aren’t the ones with the newest tools. They’re the ones who had strong fundamentals before the tools arrived.
Clean codebases. Fast feedback loops. Strong review norms. Clear product direction. Disciplined measurement.
AI doesn't create these things. It reveals whether they exist.

Leadership Action Item of the Week
Before your next AI tooling decision, ask yourself: What would happen if this team moved twice as fast tomorrow?
If the answer is “we’d ship better outcomes” - great. Keep going.
If the answer is “we’d accumulate debt faster” or “we’d build the wrong things more quickly” - that’s your signal. Fix the system before amplifying it.
The teams that get this right aren’t waiting for better tools. They’re investing in the foundations that make tools worth having.
What’s Next?
Evolution of our Roles - EPD & EMs
Setting Clear Expectations in Growing Teams
How to Scale Without Burning Out Your ICs
Velocity vs Durability: Pick Both
Want something covered? Hit reply and tell me. I love hearing what you’re dealing with.
Work With Me
Resume Review
A detailed review of your resume with specific, actionable feedback to strengthen your story, highlight impact, and position you for Engineering IC or Leadership roles.
Mock Interviews
A practice session tailored to Engineering IC or Leadership roles. You’ll get structured feedback, real scenarios, and clarity on what interviewers actually look for.
1:1 Mentorship
A session focused on your career growth: navigating leadership challenges and building a roadmap toward your next role.
📬 Reply to this email to book a 30-minute session (free for subscribers!)
Meme of the Week

Senior engineers explaining pre-AI life 😃
That’s a wrap for this week’s issue of CodingBeenz! 👩💻
Keep investing in the fundamentals, keep asking the hard questions, and remember: the tools amplify everything. Make sure there’s something worth amplifying!
Until next time,
Sabeen
P.S.
The Cherny “software engineers are going away” 🤖 take is making the rounds this week. Change is coming, and the leaders who lean in, adapt, and bring their teams along will come out stronger 🚀. Next week we’re diving into how EPD roles are evolving and what that means for how we work together. 👀 ☕️


