
Nearly every engineering organization is experimenting with AI.
The 2025 DORA report on the AI Capabilities Model argues that this is no longer the interesting question.
The hard question is: does AI make your delivery system measurably better, or does it simply make you move faster in the wrong direction?
DORA’s answer is blunt: AI is an amplifier. It magnifies the strengths of well‑run engineering systems — and it magnifies the dysfunctions of weak ones. We found something very similar in our study on how DevEx aligns with AI-assisted development.
Let's summarize DORA’s core idea (the seven AI capabilities) and connect it to a practical operating rhythm from our DevEx guidebook: measure → act → learn, using both user and developer experience signals.
A lot of AI conversations still sound like procurement: which tool to buy and how many licenses to roll out.
DORA pushes leadership in a different direction.
When nearly everyone has access to similar models, competitive advantage comes from the system around the model: how your organization makes decisions, how work flows, how safely knowledge can be accessed, and how quickly you can validate outcomes.
That system is what DORA calls AI capabilities.
DORA introduces seven foundational capabilities that determine whether AI adoption turns into sustained performance.
Teams move faster when they have clarity on what is allowed and what is expected: a clear, communicated stance on where and how AI should be used.
AI quality depends on data quality. "Healthy" implies more than storage: data must also be accurate, accessible, and well governed.
Many organizations have internal data but can’t use it safely. DORA highlights the need for secure pathways that let people and systems retrieve the right context without creating compliance or security hazards.
AI increases code throughput, but it does not remove the need for disciplined version control: reliable review, rollback, and traceability still matter.
AI makes it cheap to generate change. Small batches are how you keep that cheapness from turning into risk: oversized reviews, tangled merges, and deployments that are hard to roll back.
How DevEx Surveys help (practically): if you want to improve batch size, you need to see where the system breaks — not just the symptom.
A simple starting point is a one-question DevEx pulse.
If that signal is weak, you can run a short deep dive that pinpoints the failure mode across the workflow.
That combination turns “small batches” from a slogan into an actionable improvement plan.
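As an illustration of what acting on a pulse signal can look like in practice, here is a minimal sketch. The 1–5 agreement scale, the favorable-score threshold, and the data shape are all illustrative assumptions, not the actual survey product's format.

```python
from statistics import mean

# Hypothetical pulse responses on a 1-5 agreement scale for a single
# question (e.g. about how easy it feels to ship a small change).
responses = [4, 5, 2, 3, 4, 1, 5, 4, 2, 3]

def pulse_summary(scores, favorable_threshold=4):
    """Return the favorable rate (share of scores >= threshold) and the mean score."""
    favorable = sum(1 for s in scores if s >= favorable_threshold)
    return favorable / len(scores), mean(scores)

rate, avg = pulse_summary(responses)
print(f"favorable: {rate:.0%}, mean: {avg:.1f}")
# A weak favorable rate is the trigger for the deeper workflow dive.
```

The point is not the arithmetic but the trigger: a cheap, repeatable signal decides when the more expensive deep dive is worth running.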
DORA warns that if teams only see throughput dashboards (velocity, deployment frequency), they can lose sight of whether they are improving user outcomes. AI can increase output. User‑centricity ensures you’re increasing value.
DORA emphasizes that internal platforms are a primary way to scale AI benefits from individual productivity into system-wide improvements.
Importantly, the report ties platform effectiveness to the holistic developer experience (DevEx) — not just platform features — and recommends a product mindset (including PM ownership focused on DevEx).
DORA calls out its delivery performance metrics (speed and stability). These are essential.
But AI changes what gets optimized.
If leadership pressure pushes teams to maximize output metrics, AI can become an “activity accelerator” rather than an “outcome accelerator.”
A more robust measurement approach pairs delivery performance signals (speed and stability) with user and developer experience signals.
This is also exactly how our DevEx blueprint frames effective delivery systems.
DORA gives leaders the what: build the capabilities that make AI net-positive.
Our playbooks focus on the how: operationalize improvement as a loop.
In The DevEx evolution: aligning delivery with direction, we frame software delivery as an experience-driven cycle.
Both are executed through a measure → act → learn cadence.
A key leadership insight from the ebook: focus on a small set of top metrics (e.g., five, not fifty) so teams can actually align and act.
Our Poland-focused report, Polish DevEx: data, processes, AI, strongly echoes DORA's findings.
It also frames DevEx as an early-warning system: surfacing issues before they show up in delivery outcomes.
If you want AI to improve outcomes (not just generate output), you don’t need a big-bang transformation.
You need to pick the right capability bottlenecks and install a learning loop.
Ambiguity is expensive; remove it.
Most teams stall because they cannot reliably retrieve the right context.
Invest in secure, governed pathways that let people and AI systems retrieve that context safely.
Small batches are the simplest guardrail that protects you from “fast wrong.”
Look for the points in the workflow where batch size quietly grows.
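One way to spot growing batches is a simple guardrail check over recent pull requests. The record shape (lines changed, hours from open to merge) and both thresholds below are illustrative assumptions; the right limits depend on your codebase and review culture.

```python
from statistics import median

# Hypothetical merged-PR records: (lines_changed, hours_from_open_to_merge).
prs = [(40, 5), (700, 96), (15, 2), (320, 48), (60, 8)]

MAX_LINES = 200   # example batch-size guardrail
MAX_HOURS = 24    # example review-latency guardrail

def batch_health(records):
    """Flag PRs exceeding either guardrail and summarize typical batch size."""
    oversized = [r for r in records if r[0] > MAX_LINES or r[1] > MAX_HOURS]
    return {
        "median_lines": median(r[0] for r in records),
        "oversized_share": len(oversized) / len(records),
    }

print(batch_health(prs))
```

A rising oversized share is the early symptom; the survey deep dive described earlier tells you which part of the workflow is causing it.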
The cadence is the control system.
AI puts pressure on the parts of the system that were previously “invisible”: coordination costs, clarity, bottlenecks, and friction.
We help leadership connect experience signals (what people struggle with) to delivery performance signals (what the system is doing), including the four DORA delivery performance metrics as we calculate them: deployment frequency, lead time for changes, change failure rate, and time to restore service.
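For concreteness, here is a minimal sketch of computing the four DORA delivery performance metrics from a deployment log. The field names, the fixed window, and the log shape are illustrative assumptions; real calculations depend on how your pipeline records events.

```python
from statistics import median

# Hypothetical deployment records over a two-week window.
deploys = [
    {"lead_time_h": 20, "failed": False},
    {"lead_time_h": 48, "failed": True, "restore_h": 3},
    {"lead_time_h": 10, "failed": False},
    {"lead_time_h": 30, "failed": False},
]

def dora_snapshot(log, window_days=14):
    """Summarize the four DORA delivery metrics over a fixed window."""
    failures = [d for d in log if d["failed"]]
    return {
        # deployment frequency, normalized to deploys per week
        "deploys_per_week": len(log) / (window_days / 7),
        # lead time for changes (commit to production), in hours
        "median_lead_time_h": median(d["lead_time_h"] for d in log),
        "change_failure_rate": len(failures) / len(log),
        # time to restore service after a failed deployment
        "mean_restore_h": sum(d["restore_h"] for d in failures) / len(failures)
                          if failures else 0.0,
    }

print(dora_snapshot(deploys))
```

Numbers like these only become useful when read next to the experience signals: the metrics say what the system is doing, the surveys say why.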
Two of our offerings are designed to make those constraints measurable and improvable.
A practical way to use them with DORA's model is to run the same measure → act → learn loop against each capability bottleneck.
The value of DORA 2025 is its realism.
AI does not eliminate the need for strong engineering foundations.
It stress-tests them.
Organizations that treat AI as a capability upgrade — data, platforms, small batches, user focus, and a disciplined learning cadence — will turn AI into sustainable advantage.
Organizations that treat AI as a tooling rollout will often get the opposite: more output, more noise, and faster drift.
If you want help establishing a baseline and prioritizing improvements, start with DevEx Surveys and build your first measure–act–learn cycle from real signals.