Business Acceptance Testing Analysts and Scrum Teams


Who actually owns BAT – and why getting that wrong is the most expensive mistake your team makes every sprint.

  • 68% of Agile failures trace back to unclear acceptance criteria
  • Catching a defect in BAT is cheaper than catching it in production
  • 41% of sprints over-run because BAT ownership is undefined
  • 2.5× faster release cycles when the BA actively joins sprint ceremonies


“When a BA disappears after writing requirements, every other role on the scrum team quietly starts guessing. And when everyone guesses, the thing that gets built is rarely the thing the business actually needed.”

Nobody Agrees Who Owns BAT – And That’s Burning Your Sprints

Business Acceptance Testing – BAT – sits at one of the messiest intersections in modern software delivery. It’s the final gate between what the team built and what the business signed off on. And in most scrum teams, the question of who actually owns that gate gets answered differently by every single stakeholder in the room.

The Product Owner thinks the BA handles it. The BA thinks QA runs the scripts. QA assumes the business users validate. And the developers? They shipped to “done” two days ago and have already moved on to the next sprint. Meanwhile, the acceptance testing phase sits in a no-man’s-land that quietly blows past every deadline.

This post fixes that. We’re going to break down what BAT actually means in a scrum context, define exactly what each role is responsible for, show you where the handoffs break down with real examples, and give you the frameworks – straight from BABOK and SAFe – to run this cleanly regardless of your team size or org structure.

If you’re a BA, a QA lead, a PO, or a senior dev who’s tired of the same release-week fire drills, this is for you.


What Business Acceptance Testing Actually Is

Let’s be precise. BAT is not the same as User Acceptance Testing (UAT), even though those terms get used interchangeably at half the companies in the industry.

UAT is end-user-driven testing performed by actual business users to validate that the solution works in their real-world context.

BAT is the broader validation that the delivered software meets the business requirements – the ones captured by the BA, prioritized by the PO, and accepted by the business stakeholders. It sits above functional QA testing and below full production release.

| Testing Phase | Who Drives It | What It Validates | When It Happens | Tool / Artifact |
|---|---|---|---|---|
| Unit Testing | Developer | Individual code components | During development | Jest, JUnit, NUnit |
| Integration Testing | Dev / QA | Component interactions | Post-dev, pre-QA | Postman, REST Assured |
| System Testing | QA Team | Full application behavior | QA sprint | Selenium, Cypress |
| Regression Testing | QA Team | No new breaks in existing features | Each sprint release | Test automation suites |
| Business Acceptance Testing (BAT) | BA + PO + Business | Business requirements met | Pre-production release | Acceptance criteria, BDD scenarios |
| UAT | Business Users | Real-world usability | Staging / pre-launch | User scripts, feedback sessions |

Notice where BAT sits: it’s the bridge between what QA verified technically and what the business actually approved conceptually. That’s what makes it different – and that’s what makes it the most relationship-intensive testing phase in the entire Software Testing Life Cycle (STLC).

In a SAFe context, BAT maps to the System Demo and Inspect & Adapt events. In a standard scrum framework, it’s the work that happens between the sprint review and the formal release sign-off. Either way, it requires a specific set of roles to be clear, accountable, and communicating well.


The Four Roles That Make or Break BAT

Here’s the infographic-style breakdown. Each of these roles has a specific lane in the BAT process – and when any one of them drifts out of their lane, the whole thing falls apart.

Business Analyst

  • Writes acceptance criteria
  • Authors BDD scenarios (Given/When/Then)
  • Bridges business and technical teams
  • Reviews test cases for business alignment
  • Participates in defect triage
  • Signs off that requirements are met

Product Owner

  • Owns and prioritizes the backlog
  • Defines “done” at story level
  • Accepts or rejects sprint output
  • Manages stakeholder expectations
  • Makes go/no-go release decisions
  • Coordinates with BA on business value

QA / Test Analyst

  • Creates & executes test plans
  • Automates regression coverage
  • Logs and tracks defects
  • Verifies acceptance criteria technically
  • Supports BA in BAT script review
  • Owns test environment health

Developer

  • Builds to acceptance criteria
  • Writes unit & integration tests
  • Fixes defects raised in BAT
  • Participates in 3 Amigos sessions
  • Documents technical context
  • Supports environment setup

For a deeper look at each role individually, see the full guides on what a Business Analyst does, the Product Owner role, and what QA actually covers in modern software delivery.


BA vs. PO vs. QA vs. Dev: Who Does What in BAT

The table below is the one you want to pin on your team’s Confluence page. It’s built from BABOK v3 role definitions cross-referenced with SAFe’s team-level guidance. Every column maps to a real BAT activity.

| BAT Activity | BA | Product Owner | QA Analyst | Developer |
|---|---|---|---|---|
| Write acceptance criteria | Leads | Reviews & approves | Consults | Consults |
| Define “done” for stories | Co-defines | Owns | Validates | Implements against |
| Create BAT test scripts | Authors | Reviews | Enhances technically | Consults on logic |
| Execute BAT scenarios | Leads or supports | Observes / participates | Supports execution | On-call for fixes |
| Log and triage defects | Business severity | Prioritizes backlog | Technical severity | Estimates & fixes |
| Stakeholder sign-off | Facilitates | Obtains formally | Provides evidence | — |
| Go/no-go release decision | Advises | Decides | Provides test results | Advises on risk |
| Regression sign-off | Approves scope | — | Owns | Supports |
| 3 Amigos / Story Refinement | Leads discussion | Business priority context | Testing perspective | Technical feasibility |
| Sprint Demo participation | Narrates business value | Accepts/rejects | Demonstrates coverage | Demonstrates feature |

The RACI trap: Most teams assign BA as “Responsible” for everything in BAT and forget to assign “Accountable.” In BABOK terms, the BA is responsible for eliciting and documenting requirements that feed acceptance criteria. The PO is accountable for the business decision on those criteria. Mixing those two is where most BAT ownership debates start.

The Acceptance Testing Analyst: More Than a Requirement Writer

73% of defects in BAT are rooted in ambiguous acceptance criteria.

Let’s be direct: in most scrum teams, the Business Analyst is doing the heaviest intellectual lift in the BAT phase, and getting the least formal credit for it. Here’s what that role actually looks like when it’s done right.

Before the Sprint Starts

A BA who’s doing their job in Agile doesn’t wait for the sprint to start before writing acceptance criteria. By the time a story hits sprint planning, the BA should already have facilitated at least one 3 Amigos session – the collaboration ritual where BA, QA, and dev sit down together to stress-test a story before a single line of code is written.

In that session, the BA walks through the business requirement, QA challenges the edge cases (“what happens if the user submits the form twice?”), and the developer flags technical constraints. What comes out of that meeting is a set of Given/When/Then scenarios (BDD-style) that become the backbone of the BAT test scripts later.

Live Example – Healthcare Claims Portal

A BA at a large health plan is working on a member portal story: “As a member, I want to view my EOB documents so I can track my claims.” During the 3 Amigos session, QA asks: what if the member has no EOBs on file? The dev mentions that the API returns a 204 (empty) vs. a 404 (not found) for those two scenarios. The BA writes two separate acceptance criteria that result in different UI states – both of which now have explicit BAT test cases. Without that session, both edge cases likely would have been caught in production by a frustrated member.
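Those two response codes translate directly into two BAT test cases. Here is a minimal Python sketch of the mapping; the function name and UI-state strings are hypothetical, since the article specifies only the 204-vs-404 split, not the portal's implementation:

```python
# Illustrative only: eob_ui_state and the state strings are invented for this
# sketch. The article's point is that 204 and 404 are distinct business cases.

def eob_ui_state(status_code: int) -> str:
    """Map the claims API response to the UI state the member should see."""
    if status_code == 200:
        return "list-state: render the member's EOB documents"
    if status_code == 204:
        return "empty-state: member exists but has no EOBs on file yet"
    if status_code == 404:
        return "error-state: member record not found; show support contact"
    raise ValueError(f"unexpected status code: {status_code}")

# Two separate acceptance criteria -> two distinct BAT test cases, not one:
assert eob_ui_state(204).startswith("empty-state")
assert eob_ui_state(404).startswith("error-state")
```

Without the 3 Amigos session, both codes would likely have collapsed into a single "no documents" behavior and the distinction would have surfaced only in production.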

During the Sprint

Mid-sprint, the BA’s BAT responsibilities shift to reviewing QA test cases for business alignment. It’s a common mistake to assume QA’s test cases are automatically aligned with business intent. They’re not – they’re aligned with technical behavior. Those are different things.

A QA analyst verifies that clicking “Submit” fires a POST request and returns a 200 status. A BA verifies that submitting that form triggers the downstream business process – the notification email, the record update in the CRM, the audit log entry – and that all of those things match what the business stakeholder agreed to six weeks ago in the requirements workshop.
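That difference is testable. A hedged Python sketch, using a stubbed submission result (all names here are hypothetical stand-ins for the real system) to contrast QA's technical check with the BA's business-effect checks:

```python
# All names are hypothetical stubs; the point is the shape of the checks,
# not the implementation of the system under test.
from dataclasses import dataclass, field

@dataclass
class SubmissionResult:
    status_code: int
    emails_sent: list = field(default_factory=list)
    crm_updates: list = field(default_factory=list)
    audit_entries: list = field(default_factory=list)

def submit_form(member_id: str) -> SubmissionResult:
    # Stub: in real BAT this would drive the actual staging environment.
    return SubmissionResult(
        status_code=200,
        emails_sent=[f"confirmation:{member_id}"],
        crm_updates=[f"record-updated:{member_id}"],
        audit_entries=[f"form-submitted:{member_id}"],
    )

result = submit_form("M-1001")

# QA's technical check: the POST succeeded.
assert result.status_code == 200

# The BA's BAT checks: every downstream business effect actually happened.
assert result.emails_sent,   "notification email was not sent"
assert result.crm_updates,   "CRM record was not updated"
assert result.audit_entries, "audit log entry is missing"
```

A story can pass the first assertion and fail all three of the others, which is exactly the gap BAT exists to close.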

At Sprint End – The BAT Execution Phase

This is where things get operationally messy. The BA should be actively involved in BAT execution – not as a passive observer, but as a co-pilot with QA. The BA brings business context to each test case. When a defect surfaces, it’s the BA who determines business impact. Is this a P1 that blocks the release? Or is it a cosmetic issue the business can live with for one sprint?

The BA also owns the conversation with business stakeholders during BAT – walking them through what was tested, what passed, what failed, and what the recommended go/no-go position is. That’s not a QA job. That’s not a PO job. That belongs to the BA.


The 7 Most Common BAT Failure Patterns in Scrum Teams

These aren’t theoretical edge cases. These are the patterns you’ve probably lived through at least twice in the last year.

| # | Failure Pattern | Root Cause | Who Feels the Pain | Fix |
|---|---|---|---|---|
| 1 | BA exits after requirements handoff | Waterfall habits in Agile clothing | QA, PO, Dev | BA embedded in sprints end-to-end |
| 2 | Acceptance criteria written post-development | No story refinement discipline | Everyone | Mandate 3 Amigos before sprint entry |
| 3 | PO accepts story without BAT evidence | Sprint pressure + trust assumption | Business stakeholders | Definition of Done includes BAT sign-off |
| 4 | QA and BA have separate defect definitions | No shared severity model | Dev, PM | Joint defect triage sessions |
| 5 | Business users run BAT without BA guidance | Siloed UAT / BAT process | BA, PO | BA facilitates all BAT sessions |
| 6 | No formal BAT environment | Infrastructure gaps | QA, BA | Dedicated staging environment in DoD |
| 7 | Requirements changed mid-sprint without BA update | Scope creep / informal PO-dev side channels | BA, QA | Change request protocol in team norms |

Failure pattern #1 is the most structurally damaging. When a BA operates in a handoff model – write requirements, hand to dev, disappear – the scrum team loses the one person who can translate business intent into testing decisions in real time. The result is that QA makes judgment calls on business edge cases they’re not qualified to make, and defects that should have been caught in BAT surface in production.


The BAT Process Flow Inside a Scrum Sprint

BAT Execution Flow Within a 2-Week Sprint

1. Sprint Planning: BA reviews stories, confirms acceptance criteria are complete and testable
2. 3 Amigos: BA + QA + Dev align on edge cases, write BDD scenarios, flag risks
3. Development: Dev builds to acceptance criteria; BA available for clarification
4. QA Testing: QA executes functional & regression tests; BA reviews test case coverage
5. BAT Execution: BA leads BAT, runs business scenarios, logs business-severity defects
6. Defect Triage: BA + PO determine go/no-go based on open defects; critical defects loop back to dev
7. Sprint Demo: BA presents business outcomes; PO accepts stories; stakeholders review
8. Release / Retro: PO signs off release; BA captures lessons for next sprint refinement

This flow aligns with both the BABOK v3 Agile Extension guidance on continuous requirements and SAFe’s team-level iteration execution model. The key structural point: BAT is not a phase that happens after QA. It runs in parallel with QA’s technical validation and is informed by it.


How Each Role Participates in Scrum Ceremonies

One of the clearest ways to see role misalignment is to look at who’s actually showing up to which ceremonies – and what they’re contributing when they do. Here’s the breakdown for teams running standard Scrum framework sprints.

| Ceremony | BA Contribution | PO Contribution | QA Contribution | Dev Contribution |
|---|---|---|---|---|
| Sprint Planning | Clarifies requirements; confirms ACs are testable | Presents priority; sets sprint goal | Flags testing concerns; estimates QA effort | Commits to stories; estimates complexity |
| Daily Standup | Flags requirement blockers; answers clarification questions | Monitors progress; unblocks business decisions | Reports test status; flags environment issues | Reports dev progress; raises technical blockers |
| Backlog Refinement | Writes/refines ACs; facilitates 3 Amigos | Prioritizes; sets business value | Reviews testability of stories | Reviews technical feasibility |
| Sprint Review / Demo | Narrates business value of each feature | Formally accepts/rejects stories | Reports test coverage and quality metrics | Demonstrates built functionality |
| Sprint Retrospective | Reflects on requirements quality; AC gaps | Reflects on backlog hygiene | Reflects on test coverage; defect trends | Reflects on technical debt; dev practices |

Live Example – Insurance Claims Automation

At an insurance carrier running 2-week sprints, the BA noticed that QA was logging defects against acceptance criteria that had already been superseded by mid-sprint stakeholder conversations – conversations the BA wasn’t in because she’d been cut from the daily standups as a “non-developer.” The team reinstated her attendance, and within two sprints, the defect rework rate in BAT dropped by 34% because blockers were being caught in standup before they became QA defects.


Writing Acceptance Criteria That Actually Support BAT

This is where the rubber meets the road. Most BAs write acceptance criteria that are either too vague to test or too technical to make business sense. Here’s what good looks like versus what you’re probably living with.

| Quality Dimension | Bad AC Example | Good AC Example |
|---|---|---|
| Specificity | “The system should display an error when login fails.” | “Given a user enters an incorrect password 3 consecutive times, the account is locked and an email is sent to the registered address within 2 minutes.” |
| Business alignment | “The API should return a 400 status for bad requests.” | “Given a member submits a claim with a missing NPI code, the system rejects the claim and displays: ‘Provider NPI is required. Please contact your provider.’” |
| Edge case coverage | “Users can upload documents.” | “Given a user uploads a PDF over 10MB, the system displays an inline error. Given a user uploads an accepted file type under 10MB, the upload succeeds and a confirmation is shown.” |
| Testability | “The page should load fast.” | “The dashboard loads within 3 seconds on a standard broadband connection. Load time is measured from initial request to full page render.” |

In BABOK v3 terms, good acceptance criteria meet the SMART standard: Specific, Measurable, Achievable, Relevant, and Time-bound. In BDD terms, every criterion maps to at least one Given/When/Then scenario. If you can’t write a scenario for it, your AC isn’t ready to enter a sprint.
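To make the "every criterion maps to a scenario" rule concrete, here is a sketch of the account-lockout AC from the table above expressed as a Given/When/Then-style Python check. The LoginService stub is hypothetical, and the 2-minute email SLA is reduced to asserting that the notification is queued:

```python
# Hypothetical stub of the system under test; class and attribute names are
# illustrative. AC: "Given a user enters an incorrect password 3 consecutive
# times, the account is locked and an email is sent to the registered address
# within 2 minutes." (Timing is out of scope for this stub.)

class LoginService:
    MAX_FAILED_ATTEMPTS = 3

    def __init__(self):
        self.failed_attempts = 0
        self.locked = False
        self.outbound_emails = []

    def login(self, password_correct: bool) -> bool:
        if self.locked:
            return False          # locked accounts reject all logins
        if password_correct:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.MAX_FAILED_ATTEMPTS:
            self.locked = True
            self.outbound_emails.append("account-locked-notification")
        return False

# Given a user enters an incorrect password 3 consecutive times
svc = LoginService()
for _ in range(3):
    svc.login(password_correct=False)

# Then the account is locked and a lockout email is queued
assert svc.locked
assert svc.outbound_emails == ["account-locked-notification"]
```

If you cannot write this kind of scenario for a criterion, that is the signal the AC is not sprint-ready.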


Where BAT Fits in the Bigger SDLC Picture

BAT doesn’t exist in isolation. It’s one stage in the Software Development Life Cycle (SDLC), and understanding how it connects to upstream and downstream activities is what separates a mid-level BA from a senior one.

The quality of BAT is almost entirely determined by decisions made earlier in the SDLC. If requirements are ambiguous, BAT surfaces defects that are actually requirement gaps. If stories aren’t refined before sprint entry, BAT becomes a re-discovery session instead of a validation session. If the PO hasn’t aligned stakeholders on acceptance standards before the sprint, BAT becomes a negotiation instead of a confirmation.

This is the systemic view: BAT is a lagging indicator of requirements quality and story refinement discipline. Teams with strong upfront BA practices – thorough elicitation, well-structured 3 Amigos sessions, BDD-style acceptance criteria – consistently have shorter, cleaner BAT cycles.

It also connects directly to your overall testing strategy. BAT doesn’t replace any of the other test types – it completes them. Functional QA testing verifies the system works. BAT verifies the system works for the business.


BAT in SAFe: How It Scales Beyond a Single Scrum Team

For organizations running SAFe (Scaled Agile Framework), BAT gets more complex – but also more structured. At the team level, BAT works as described above. But at the Program Increment (PI) level, there are additional BAT considerations that most implementations miss.

| SAFe Level | BAT Activity | Who Owns It | Cadence |
|---|---|---|---|
| Team Level | Story-level BAT, sprint review acceptance | BA + PO | Each sprint |
| Program Level | System Demo, feature-level acceptance | Product Management + BAs | Each PI iteration |
| Solution Level | Solution Demo, cross-team integration testing | Solution Architect + BAs | PI boundary |
| Portfolio Level | Epic acceptance, strategic outcome validation | Business Owners + Product Management | Quarterly / PI |

In SAFe, the BA role often maps to the System Team or sits within the Agile Release Train (ART). In either case, the BA’s BAT responsibilities scale up: they’re no longer just validating individual stories – they’re ensuring that assembled features at the program level still meet the original business epics they were derived from.


The BAT Starter Kit: What Your Team Needs to Run This Right

Here’s the minimum viable BAT toolkit for a scrum team of 6-10 people. These aren’t nice-to-haves – they’re the infrastructure that separates teams that ship clean from teams that ship and pray.

1. Acceptance Criteria Template

Story-Level BAT Acceptance Criteria Template

Use this structure for every user story that enters sprint planning.

Story: [User story statement]
Business Context: [Why this matters to the business / which process it supports]

Acceptance Criteria:
AC-01: Given [context], When [action], Then [expected outcome]
AC-02: Given [edge case], When [action], Then [expected outcome]
AC-03: Given [negative path], When [action], Then [expected outcome]

Out of Scope: [What is explicitly NOT being validated in this story]
Dependencies: [Upstream systems, APIs, or stories that must be complete]
BA Sign-off Required: Yes / No
Stakeholder Sign-off Required: Yes / No

2. BAT Execution Checklist

  • All acceptance criteria reviewed and approved before sprint entry
  • 3 Amigos session completed for all complex stories
  • BAT environment set up and accessible before sprint midpoint
  • Test data prepared and validated by BA prior to BAT execution
  • QA test cases reviewed by BA for business coverage completeness
  • Defect severity matrix agreed upon by BA + PO + QA lead
  • BAT execution completed at least 2 days before sprint end
  • Critical defects triaged and resolved before sprint demo
  • BA sign-off documented and attached to each story
  • Go/no-go decision made by PO with BA recommendation

3. Defect Severity Matrix for BAT

| Severity | Definition | BAT Impact | Decision |
|---|---|---|---|
| P1 – Critical | Core business process completely broken | Blocks release | Fix before demo; no exceptions |
| P2 – High | Major business function impaired; workaround exists | Blocks story acceptance | Fix in current sprint or downgrade story |
| P3 – Medium | Business function works; edge case fails | Accepted with conditions | Log as tech debt; schedule fix next sprint |
| P4 – Low | Cosmetic; no business impact | Does not block | Log in backlog; address in future sprint |
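The matrix above can be encoded so triage decisions are consistent rather than re-debated per defect. A minimal Python sketch; the dictionary layout and helper names are my own, not something prescribed by the article:

```python
# Encodes the BAT defect severity matrix above as data plus two helpers.
SEVERITY_MATRIX = {
    "P1": ("Blocks release",           "Fix before demo; no exceptions"),
    "P2": ("Blocks story acceptance",  "Fix in current sprint or downgrade story"),
    "P3": ("Accepted with conditions", "Log as tech debt; schedule fix next sprint"),
    "P4": ("Does not block",           "Log in backlog; address in future sprint"),
}

def triage(severity: str) -> str:
    """Return the agreed BAT impact and decision for a defect severity."""
    impact, decision = SEVERITY_MATRIX[severity]
    return f"{severity}: {impact} -> {decision}"

def release_blocked(open_defect_severities: list[str]) -> bool:
    """Go/no-go helper: any open P1 blocks the release, per the matrix."""
    return any(sev == "P1" for sev in open_defect_severities)

assert release_blocked(["P3", "P1"])      # one open critical blocks release
assert not release_blocked(["P2", "P4"])  # lower severities do not, by themselves
```

Agreeing on this table once, in a joint BA + PO + QA session, is what prevents the "is this really a P1?" argument during release week.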

How to Measure BAT Health on Your Team

You can’t improve what you’re not measuring. These are the six BAT metrics that give you a real picture of process health – not vanity metrics, but the ones that actually predict release quality.

| Metric | What It Measures | Healthy Benchmark | Warning Signal |
|---|---|---|---|
| AC Completeness Rate | % of stories with full ACs at sprint planning | > 90% | < 75% = refinement problem |
| BAT Defect Escape Rate | Defects found post-BAT in UAT or production | < 5% | > 15% = BAT coverage gap |
| BAT Cycle Time | Days from BAT start to sign-off | < 30% of sprint duration | > 50% = execution or environment problem |
| Defect Rework Rate | % of defects reopened after first fix | < 10% | > 25% = AC or dev handoff problem |
| BA Participation Rate | % of BAT sessions with active BA involvement | 100% | < 80% = structural role gap |
| Go/No-Go Confidence Score | Stakeholder confidence in PO’s release decision (1-5) | > 4.0 | < 3.0 = trust/communication breakdown |

Track these sprint-over-sprint. If you’re seeing degradation in BAT defect escape rate and BA participation rate dropping simultaneously, you almost certainly have a structural issue – the BA is being pulled out of sprint ceremonies, and defects are slipping through as a direct result.
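Three of these metrics are simple ratios, so they are cheap to compute sprint-over-sprint. A hedged Python sketch with illustrative argument names, applying the warning thresholds from the table above:

```python
# Sketch only: argument names are illustrative, no standard schema is implied.
def bat_health(stories_total: int, stories_with_full_acs: int,
               defects_found_in_bat: int, defects_escaped_post_bat: int,
               bat_days: float, sprint_days: float):
    """Return (AC completeness, defect escape rate, cycle ratio, warnings)."""
    ac_completeness = stories_with_full_acs / stories_total
    total_defects = defects_found_in_bat + defects_escaped_post_bat
    escape_rate = (defects_escaped_post_bat / total_defects) if total_defects else 0.0
    cycle_ratio = bat_days / sprint_days

    warnings = []  # thresholds taken from the metrics table
    if ac_completeness < 0.75:
        warnings.append("refinement problem")
    if escape_rate > 0.15:
        warnings.append("BAT coverage gap")
    if cycle_ratio > 0.50:
        warnings.append("execution or environment problem")
    return ac_completeness, escape_rate, cycle_ratio, warnings

# A healthy sprint: 9/10 stories refined, 1 of 20 defects escaped, 2 of 10 days in BAT
ac, escape, cycle, warnings = bat_health(10, 9, 19, 1, 2, 10)
assert warnings == []
```

The other three metrics (rework rate, BA participation, confidence score) follow the same pattern once you track reopened defects, session attendance, and stakeholder survey scores.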


What Separates Mid-Level BAs from Senior BAT Practitioners

For the senior practitioners reading this: the mechanics above are table stakes. Where senior BAs actually differentiate themselves is in the following areas that most teams don’t have explicit processes for.

Anticipatory Requirements Analysis

A senior BA doesn’t just write acceptance criteria for what was asked. They write criteria for what the business will eventually ask – the edge cases that haven’t surfaced yet, the regulatory requirements that are two quarters away, the integration dependencies that haven’t been formally scoped. That proactive coverage is what makes BAT faster: the tests are already written for scenarios that mid-level BAs would discover mid-sprint.

Defect Advocacy

Senior BAs advocate for defect prioritization with business impact data, not just technical severity. “This is a P2 because it affects the month-end close process that 200 finance users run on the last business day of every month” is a completely different conversation than “this is a P2 because the error message is missing.” The former gets a dev assigned immediately. The latter gets debated.

Stakeholder Expectation Management

Senior BAs use the BAT process as a stakeholder management tool. They bring key business users into BAT sessions not just to validate the system but to pre-build acceptance before the formal sign-off. When stakeholders have seen a feature tested thoroughly against their own requirements, the go/no-go conversation in the sprint demo is a confirmation – not a surprise.

The senior BA litmus test: Can you walk into a stakeholder meeting, present the BAT results for a sprint, and make the release recommendation with enough business context to be challenged on it – and hold your ground? If yes, you’re operating at senior level. If you’re still presenting test pass/fail percentages and calling it done, there’s a gap.

The Bottom Line: Own the Gate

Business Acceptance Testing is not a bureaucratic checkpoint. It’s the moment where everything the BA elicited, the PO prioritized, the dev built, and the QA verified either proves it was worth the sprint – or exposes that somewhere in the chain, the signal got lost.

The teams that run BAT well have one thing in common: they’ve been honest about who owns what. The BA owns requirements quality and BAT facilitation. The PO owns the release decision. QA owns technical test coverage. Developers own code quality and fast defect resolution. When those lanes are clear, BAT moves fast, stakeholders trust the output, and the sprint demo feels like a showcase instead of a negotiation.

The teams that struggle with BAT have usually built a system where everyone defers to someone else when the hard questions come up. The defect sits untriaged because the BA isn’t sure if it’s her call. The story gets accepted without BAT evidence because the PO is under sprint pressure. The stakeholder signs off on something she didn’t fully review because the BA didn’t bring her into the room early enough.

Fix the roles. Fix the process. Own the gate.

For the full context on each role, tool, and framework referenced here, explore the guides on Business Analysis, Product Ownership, Quality Assurance, and the full Software Testing Life Cycle breakdown at TechFitFlow.
