Acceptance Criteria in Jira: Free-Form Format for Stories and Defects Across Real Scenarios
The Given/When/Then format is well-documented. What isn’t is what to do when it doesn’t fit – when your Jira ticket is a defect fix, a system-level constraint, a data validation rule, or a compliance requirement that no BDD template handles cleanly. This article shows how to write acceptance criteria in Jira using a free-form, rule-based approach that works for stories and defects across different project types, team compositions, and compliance environments – with examples you can adapt directly into your own tickets.
What Acceptance Criteria in Jira Actually Need to Do
Acceptance criteria (AC) define the conditions a Jira issue must meet before the team marks it done. They are not a story summary. They are not a wish list. They are pass/fail statements that a QA analyst can execute against, a developer can build to, and a Product Owner can sign off on – independently, without a meeting. If your acceptance criteria require a verbal explanation to be understood, they haven’t done their job.
BABOK v3 frames acceptance criteria as part of solution evaluation – conditions the solution must satisfy to be accepted by stakeholders. ISTQB goes further, requiring that test cases be derived directly from acceptance criteria. Both frameworks treat AC as the boundary between “built” and “done.” That boundary is where most sprint failures originate.
In Jira, acceptance criteria live in different places depending on how the project is configured: the story description field, a dedicated custom paragraph field, a checklist plugin (Smart Checklist, Issue Checklist for Jira), or an embedded table in a Confluence-linked document. None of these locations matters more than the content. Criteria that are clear, testable, and complete work in any field. Criteria that are vague fail regardless of where they live.
Why Given/When/Then Is Not Always the Right Tool
The Given/When/Then format – inherited from Behavior Driven Development (BDD) and Gherkin – works well for user-initiated workflows with a clear precondition, action, and expected outcome. A login flow, a search function, a form submission – these map naturally to GWT.
It breaks down in several common situations. System-level behaviors without a user action don’t fit the “When [user does X]” pattern. Defect fix verification criteria aren’t behavioral scenarios – they’re specific confirmations that a previously broken thing now works correctly. Non-functional requirements (performance, security, data retention) don’t map to GWT at all. Data validation rules with multiple conditions become GWT chains so long they’re unreadable.
AltexSoft’s analysis of acceptance criteria formats identifies the rule-oriented format as the appropriate alternative when GWT scenarios don’t fit. Rule-oriented AC states what the system must do – not as a scenario, but as a testable condition. It’s direct, readable, and adaptable to any issue type.
| Format | Best For | Not Suitable For | Automation-Ready |
|---|---|---|---|
| Given/When/Then (BDD) | User-initiated workflows, UI interactions, login/checkout flows | Defect fixes, system rules, non-functional requirements, data constraints | Yes – Cucumber/Gherkin |
| Free-Form / Rule-Based | Defects, data rules, system constraints, compliance requirements, API behavior | BDD automation pipelines that require Gherkin parsing | No – requires test case translation |
| Checklist (pass/fail items) | Definition of Done, multi-condition stories, QA sign-off tracking in Jira | Complex conditional behavior with multiple interdependencies | Partial – via Jira automation on checklist completion |
| Hybrid | Mixed stories with both user-flow and system-rule criteria | Teams that require strict format consistency for tooling | Partial |
The Anatomy of Free-Form Acceptance Criteria in Jira
Free-form acceptance criteria don’t follow a rigid template. They follow a discipline. Each criterion is a complete, testable statement. It names the subject (the system, the field, the user, the API), describes the expected behavior, and specifies any relevant condition or constraint. It passes or fails. There’s no ambiguity about what “done” looks like.
The structure that works in practice is a short header (the criterion in one line) followed by an optional context note or constraint. Each criterion covers one thing. If you find yourself using “and” to join two outcomes in a single criterion, split it. Two conditions in one criterion make it impossible to fail just one of them cleanly – which means your QA result won’t be precise.
Karl Wiegers in Software Requirements, 3rd Edition identifies atomicity as a core quality attribute of good requirements: each statement should express one idea. That principle applies directly to acceptance criteria. An atomic criterion can be tested, traced, and closed independently.
[AC-n] One-line criterion stating the subject, the expected behavior, and the relevant condition.
Context note (optional): constraint, data type, range, source system, or edge case clarification.
For example:
[AC-3] The claims submission API returns HTTP 422 with error code ERR_ICD10_MISSING when the ICD-10 diagnosis code field is empty.
Context: Applies to all POST /claims/submit requests. Validated in the QA environment against the payer integration test harness.
Numbering criteria ([AC-1], [AC-2]) is not mandatory but is strongly recommended in Jira. Numbered criteria let QA reference specific failures in defect reports without copying text. “Failed on AC-4” is a precise handoff. “The third bullet didn’t work” is not.
Acceptance Criteria in Jira for User Stories: Free-Form Examples Across Scenarios
The following examples cover different story types that appear on IT programs. Each shows what poor criteria look like versus what testable, free-form criteria look like – without using Given/When/Then.
Story Type 1: User Access and Role-Based Permissions
Role-based access control stories appear in virtually every enterprise implementation. They’re system-level rules, not user workflows. GWT is clumsy here because there is no single user action – there are multiple role conditions that must be enforced system-wide.
Weak: “Users should only see what they are allowed to see based on their role.”
Testable, free-form rewrite:
[AC-1] A user with the Nurse role can view patient demographic records but cannot edit or delete them.
[AC-2] A user with the Billing Specialist role can access the Claims Submission module. Users without this role receive an “Access Denied” message and are redirected to the dashboard.
[AC-3] A user with the Physician role can create, edit, and sign clinical notes. The system requires an electronic signature before a note can be finalized.
[AC-4] Role changes applied in the admin console take effect within one login session refresh. No restart is required.
[AC-5] Any attempt to access a restricted module by an unauthorized role is logged in the audit trail with timestamp, user ID, and the module attempted.
Note that [AC-5] directly addresses a HIPAA audit requirement. The HIPAA Security Rule (45 CFR §164.312(b)) requires audit controls that record access to ePHI. That criterion belongs in the story’s AC, not in a separate compliance ticket that nobody links back to the feature.
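The role rules and the audit requirement in [AC-1] through [AC-5] can be sketched as a permission matrix with a denial log. This is an illustrative sketch only – the role names and module names come from the criteria above, while the matrix structure, function names, and audit-record shape are assumptions, not a real access-control implementation.

```python
from datetime import datetime, timezone

# Assumed permission matrix built from AC-1..AC-3; a real system would
# load this from an admin console, not hard-code it.
PERMISSIONS = {
    "Nurse": {"patient_demographics": {"view"}},
    "Billing Specialist": {"claims_submission": {"view", "edit"}},
    "Physician": {"clinical_notes": {"create", "edit", "sign"}},
}

audit_trail = []  # [AC-5] every denied attempt is recorded with user and module

def access(role, module, action, user_id):
    """Return True if the role may perform the action; log denials."""
    allowed = action in PERMISSIONS.get(role, {}).get(module, set())
    if not allowed:
        audit_trail.append({
            "user_id": user_id,
            "module": module,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
    return allowed
```

A QA analyst can read each AC and map it to one call: `access("Nurse", "patient_demographics", "edit", "u1")` must return False and leave an audit entry behind.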
Story Type 2: Data Validation and Field-Level Rules
Data validation stories are about rules, not flows. A form field that must accept only specific formats, a date range that cannot be in the future, a code that must match a reference table – none of these fit naturally into GWT. Free-form rule-based AC is exactly right here.
Weak: “The ICD-10 code field should be validated properly.”
Testable, free-form rewrite:
[AC-1] The ICD-10 code field accepts values matching the pattern [A-Z][0-9][0-9](\.[0-9]{1,4})? – a letter, two digits, and an optional decimal extension. Invalid formats are rejected at the field level before form submission.
[AC-2] The system validates the submitted ICD-10 code against the current CMS ICD-10-CM code table. Codes not present in the table trigger an inline error: “Invalid ICD-10 code. Please verify and resubmit.”
[AC-3] The ICD-10 field is mandatory for all claim types except claim type 14 (Adjustment). Submitting without a code for any other claim type blocks submission and displays: “ICD-10 code is required for this claim type.”
[AC-4] The field accepts up to 8 characters including the decimal point. Entries exceeding 8 characters are truncated at the field level and a tooltip displays the character limit.
[AC-5] The validated ICD-10 code is stored in the database as a VARCHAR(10) field with no trailing spaces.
[AC-5] is often skipped because it looks like a developer concern, not a business requirement. It isn’t. Trailing spaces in stored ICD-10 codes cause downstream matching failures in payer systems – a defect that appears weeks after go-live, during reconciliation, not during QA. Writing it into the AC prevents it.
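The field-level rules in [AC-1], [AC-3], and [AC-4] translate directly into a validation function. A minimal sketch, with the decimal extension in the regex escaped and made optional so dot-less codes such as J45 pass; the claim-type exemption and error messages mirror the criteria, and all function and constant names are illustrative assumptions.

```python
import re

# Pattern from [AC-1]: letter, two digits, optional decimal extension.
ICD10_PATTERN = re.compile(r"^[A-Z][0-9][0-9](\.[0-9]{1,4})?$")
MAX_LEN = 8             # [AC-4] up to 8 characters including the decimal point
EXEMPT_CLAIM_TYPE = 14  # [AC-3] Adjustment claims may omit the code

def validate_icd10(code, claim_type):
    """Return (ok, error_message) for a submitted ICD-10 code."""
    if not code:
        if claim_type == EXEMPT_CLAIM_TYPE:
            return True, None
        return False, "ICD-10 code is required for this claim type."
    if len(code) > MAX_LEN:
        return False, "ICD-10 code exceeds the 8-character limit."
    if not ICD10_PATTERN.match(code):
        return False, "Invalid ICD-10 code. Please verify and resubmit."
    return True, None
```

Each branch corresponds to exactly one criterion, which is what makes a QA failure report like “Failed on AC-3” unambiguous.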
Story Type 3: API Integration and System-to-System Behavior
API integration stories are almost never about a user action. They describe what one system sends, what another system does with it, and what both systems do when things go wrong. GWT requires a “user” as the subject of “When” – but when the actor is a message queue or an upstream API call, that structure breaks.
Weak: “The system should handle HL7 messages correctly.”
Testable, free-form rewrite:
[AC-1] The inbound HL7 FHIR R4 DiagnosticReport resource is parsed and stored in the EHR’s lab results module within 5 seconds of receipt from the LIS interface engine.
[AC-2] If the DiagnosticReport resource is missing the subject reference (patient ID), the interface engine rejects the message and writes an error entry to the HL7 error log with message ID, timestamp, and error code HL7-ERR-204.
[AC-3] Successfully processed DiagnosticReport messages return HTTP 200 to the sending system. Failed messages return HTTP 422 with a structured error body containing the rejection reason code.
[AC-4] The system handles a message volume of up to 500 DiagnosticReport transactions per hour without degradation in response time beyond the 5-second SLA.
[AC-5] Duplicate message detection is applied using the Bundle.identifier field. Duplicate submissions within a 24-hour window are rejected with error code HL7-ERR-409 and logged without reprocessing.
These criteria are directly executable as API test cases in Postman or REST-assured. A QA engineer reading this story knows exactly what to test, what payloads to construct, and what HTTP responses to assert against. That’s the goal.
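The rejection logic in [AC-2], [AC-3], and [AC-5] can be sketched as a single message-processing function. This is a hedged sketch: a plain dict stands in for a FHIR DiagnosticReport, the in-memory dedup store stands in for a real persistence layer, and only the error codes and the 24-hour window come from the criteria.

```python
from datetime import datetime, timedelta

seen = {}  # Bundle.identifier -> first-seen timestamp ([AC-5] dedup window)

def process_report(message, now):
    """Return (http_status, error_code) for an inbound DiagnosticReport."""
    # [AC-2] missing patient reference -> reject with HL7-ERR-204
    if not message.get("subject", {}).get("reference"):
        return 422, "HL7-ERR-204"
    # [AC-5] duplicate identifier within 24 hours -> HL7-ERR-409
    ident = message["identifier"]
    first = seen.get(ident)
    if first is not None and now - first < timedelta(hours=24):
        return 422, "HL7-ERR-409"
    seen[ident] = now
    return 200, None  # [AC-3] success returns HTTP 200
```

The same inputs and expected status codes become the assertions in the Postman or REST-assured suite.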
Story Type 4: Reporting and Data Output
Report and data export stories involve output format, data accuracy, filtering, and access – none of which map cleanly to GWT scenarios. What the report contains, what it excludes, how it’s sorted, and what format it generates in are the AC.
Weak: “The claims report should show accurate data for the billing team.”
Testable, free-form rewrite:
[AC-1] The Monthly Claims Summary report displays all claims submitted within the selected calendar month, inclusive of the start and end dates.
[AC-2] The report includes the following columns in this order: Claim ID, Patient Name, Date of Service, ICD-10 Code, Billed Amount, Payer Name, Status, and Paid Amount.
[AC-3] Claims with Status = “Denied” are displayed with a red indicator in the Status column. Claims with Status = “Pending” display an amber indicator.
[AC-4] The report is exportable to CSV and PDF. The CSV preserves all column headers and includes no merged cells. The PDF renders without truncating values in the Claim ID or ICD-10 columns.
[AC-5] The report is accessible only to users with the Billing Manager or Finance Admin role. Unauthorized users see no report option in their navigation menu.
[AC-6] Report generation for a single-month dataset of up to 50,000 records completes within 10 seconds. A loading indicator displays during generation.
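[AC-1] and [AC-2] are precise enough to verify mechanically: an inclusive month window and a fixed column order. A minimal sketch, assuming claims arrive as dicts keyed by the column names; the function and variable names are illustrative.

```python
from datetime import date

# [AC-2] fixed column order for the Monthly Claims Summary report
COLUMNS = ["Claim ID", "Patient Name", "Date of Service", "ICD-10 Code",
           "Billed Amount", "Payer Name", "Status", "Paid Amount"]

def monthly_claims(claims, year, month):
    """Rows for claims whose Date of Service falls in the month, inclusive."""
    start = date(year, month, 1)
    # First day of the next month; month == 12 rolls the year forward.
    end = date(year + (month == 12), month % 12 + 1, 1)
    rows = [c for c in claims if start <= c["Date of Service"] < end]
    return [[c.get(col) for col in COLUMNS] for c in rows]
```

A test dataset with one claim on the first of the month and one on the first of the next month checks the inclusive boundary in [AC-1] directly.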
Story Type 5: CI/CD Pipeline and DevOps Stories
Infrastructure, deployment, and pipeline stories rarely make it into Jira with any acceptance criteria at all. When they do, they’re usually written as tasks with no testable outcome. Treating them as stories with real AC forces the team to define what “working” means before they build it.
Weak (task-style, no testable outcome): “Set up automated deployment pipeline for the claims service.”
Testable, free-form rewrite:
[AC-1] A merge to the main branch triggers an automated build in Jenkins within 2 minutes. Failed builds send a Slack alert to the #claims-devops channel with the job name and failure reason.
[AC-2] The pipeline runs the unit test suite. If any unit test fails, the deployment to QA is blocked and the pipeline exits with code 1.
[AC-3] The pipeline runs the OWASP Dependency Check scan before deployment. Critical-severity vulnerabilities block the deployment. High-severity findings generate a warning notification without blocking.
[AC-4] A successful deployment to the QA environment completes within 8 minutes from merge trigger to deployment confirmation.
[AC-5] The deployment logs are retained in Jenkins for 30 days. Each log entry includes the build number, triggering commit hash, and deployment status.
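The gate in [AC-3] – critical findings block, high findings warn – is a small decision function. A sketch under stated assumptions: the finding format is invented for illustration, and only the severity-to-outcome mapping comes from the criterion.

```python
# [AC-3] Critical-severity vulnerabilities block the deployment;
# high-severity findings generate a warning without blocking.
def evaluate_scan(findings):
    """Return ('block' | 'warn' | 'pass', finding_ids) for a scan result."""
    criticals = [f["id"] for f in findings if f["severity"] == "CRITICAL"]
    highs = [f["id"] for f in findings if f["severity"] == "HIGH"]
    if criticals:
        return "block", criticals
    if highs:
        return "warn", highs
    return "pass", []
```

In the real pipeline this decision would be wired into the Jenkins job after the OWASP Dependency Check step, with "block" translating to a non-zero exit code as in [AC-2].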
Acceptance Criteria in Jira for Defects: A Different Standard
Defect tickets in Jira serve a different purpose than story tickets. The defect describes what went wrong. The acceptance criteria for a defect describe what “fixed” means. These are not the same as the original story’s AC – they are verification criteria specific to the broken behavior.
Most teams don’t write acceptance criteria on defect tickets at all. They log the steps to reproduce and the expected result, and assume that returning to expected behavior constitutes acceptance. That assumption fails when: the expected behavior isn’t clearly documented, the fix introduces a regression in a related area, the fix changes behavior in one environment but not another, or the Product Owner has a different definition of “fixed” than the developer.
Defect AC should appear in the “Acceptance Criteria” field (or the description section clearly labeled) and answer three questions: What specific behavior must now work correctly? What must not change as a result of this fix? How will the fix be verified across environments?
Defect Scenario 1: Data Truncation in Healthcare Claims Processing
[VC-1] ICD-10 codes of 8 characters (including decimal point) are transmitted through the XML transformation without truncation. Validated using a test claim containing code S52.501A (8 chars).
[VC-2] The stored value in the diagnosis_code column matches the submitted ICD-10 code exactly, including the decimal point. Verify via direct SQL query against the claims_staging table.
[VC-3] Claims previously submitted with truncated codes are not retroactively modified. The fix applies only to new submissions.
[VC-4] No regression introduced to ICD-10 codes of 4 characters or fewer. Validated using test codes J45, E11, and I25.10.
[VC-5] Fix verified in DEV, promoted to QA, and regression tested against the full claims submission test suite before UAT promotion. QA sign-off required from QA Lead before the fix enters UAT.
[VC-3] is the kind of criterion that prevents a well-intentioned backfill from creating a data incident. The developer who fixes the transformation bug might reasonably wonder whether to reprocess historical records. Explicit VC prevents that decision from being made without a conversation.
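The [VC-2] check – stored value matches the submitted code exactly, with no trailing spaces – can be scripted. A minimal sketch using SQLite as a stand-in for the real claims database; the table and column names follow the criteria, and the function shape is an assumption.

```python
import sqlite3

def store_and_verify(code):
    """Store a code and confirm exact round-trip with no trailing spaces."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE claims_staging (diagnosis_code VARCHAR(10))")
    con.execute("INSERT INTO claims_staging VALUES (?)", (code,))
    stored, = con.execute(
        "SELECT diagnosis_code FROM claims_staging").fetchone()
    # [VC-2] exact match, including the decimal point, no trailing spaces
    return stored == code and stored == stored.rstrip()
```

Running it against an 8-character code and against a deliberately padded code gives a pass/fail answer with no interpretation needed.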
Defect Scenario 2: API Timeout in Financial Transaction Processing
[VC-1] The API responds to a POST /transactions/submit payload containing 200 line items within 10 seconds. Measured end-to-end from request initiation to HTTP response receipt.
[VC-2] The API responds to payloads containing up to 500 line items within 15 seconds. This extended threshold applies to batch transactions only, as documented in the integration spec v2.3.
[VC-3] No HTTP 504 response is returned for any transaction payload within the documented size limits. Any timeout beyond the SLA must return HTTP 503 with a structured error body and a retry-after header.
[VC-4] Performance verified under load using a test harness that simulates 50 concurrent requests with 200-item payloads. No requests exceed the 10-second SLA under this load condition.
[VC-5] The fix does not degrade response time for transaction payloads of 50 items or fewer. Baseline response time for small payloads remains under 3 seconds.
[VC-2] acknowledges a real edge case: the original SLA may not have contemplated 500-item payloads. Rather than applying one blanket SLA, the criteria differentiate by transaction size. This prevents the developer from over-optimizing for the common case while breaking the edge case – or vice versa.
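The tiered thresholds in [VC-1] and [VC-2] reduce to a small lookup that QA can assert against. The limits come from the criteria; the function names and the out-of-limits behavior are illustrative assumptions.

```python
# [VC-1] 10 s for payloads up to 200 line items;
# [VC-2] 15 s for batch payloads up to 500 line items.
def sla_seconds(line_items):
    if line_items <= 200:
        return 10
    if line_items <= 500:
        return 15
    raise ValueError("payload exceeds documented size limits")

def within_sla(line_items, elapsed_seconds):
    return elapsed_seconds <= sla_seconds(line_items)
```

Making the tiers explicit in code mirrors the point of [VC-2]: one blanket SLA would force the developer to either miss the batch case or over-optimize the common one.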
Defect Scenario 3: UI Bug in EHR Patient Demographics Screen
[VC-1] The Date of Birth field rejects any date that is later than today’s date. An inline error message displays: “Date of Birth cannot be in the future.”
[VC-2] The field also rejects dates earlier than January 1, 1900. Error message: “Please enter a valid Date of Birth.”
[VC-3] A valid date of birth (e.g., 1985-07-15) is saved and displayed correctly with no modification. The validation does not affect valid entries.
[VC-4] The validation is enforced at both the UI level (before submission) and the API level (POST /patients/demographics). A direct API call bypassing the UI is rejected with HTTP 400 and error code DOB_FUTURE_DATE.
[VC-5] Existing patient records with incorrect future DOBs are not automatically corrected by this fix. A separate data remediation ticket is required. This fix applies to new and edited submissions only.
[VC-4] catches a common gap: UI validation that isn’t backed by API-level enforcement. A malicious or misconfigured API client can still submit invalid data if the server doesn’t reject it. In a healthcare system, that’s a data integrity risk with HIPAA implications. Good defect AC doesn’t just describe what the screen does – it describes what the system enforces.
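The rule enforced at both layers in [VC-1], [VC-2], and [VC-4] is one shared validation function. A sketch only: the DOB_FUTURE_DATE code comes from the criteria, while the second error code and the function shape are assumptions.

```python
from datetime import date

MIN_DOB = date(1900, 1, 1)  # [VC-2] lower bound for a valid Date of Birth

def validate_dob(dob, today):
    """Return (http_status, error_code); used by both UI and API layers."""
    if dob > today:
        return 400, "DOB_FUTURE_DATE"   # [VC-1] / [VC-4]
    if dob < MIN_DOB:
        return 400, "DOB_INVALID"       # assumed code for [VC-2]
    return 200, None
```

Sharing one function between the form handler and the POST /patients/demographics endpoint is one way to guarantee the UI and API can never drift apart, which is exactly the gap [VC-4] exists to close.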
Defect Scenario 4: SQL Report Returning Incorrect Aggregation
[VC-1] Each encounter is counted exactly once in the claim count metric, regardless of the number of diagnosis codes attached to that encounter.
[VC-2] The report query is validated against the reference SQL in the Data Dictionary v4.2. The COUNT(DISTINCT encounter_id) grouping is used, not COUNT(diagnosis_code_id).
[VC-3] The fix is validated against a known dataset: Patient A with 3 visits on 2025-10-01, each with 2 diagnosis codes. The corrected report shows 3 claims, not 6.
[VC-4] The total billed amount in the corrected report matches the figure in the billing system’s transaction ledger for the same period, within a tolerance of $0.01 (rounding).
[VC-5] The fix is applied to the stored procedure sp_monthly_revenue_summary and version-controlled. The previous version is documented in the change log with the defect ticket number.
[VC-3] is the most important criterion here. It gives the developer a concrete test case with a known input and expected output. This is a directly executable SQL validation – not a conceptual description of the correct behavior. The developer runs the query. The result is either 3 or 6. There is no interpretation.
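The [VC-3] dataset can be reproduced end to end in a few lines, using SQLite as a stand-in for the reporting database: three encounters, each with two diagnosis codes, where the naive count returns 6 and the corrected count returns 3.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE diagnoses (encounter_id INT, diagnosis_code TEXT)")
# Patient A: 3 encounters, each with 2 diagnosis codes (the VC-3 dataset)
rows = [(e, c) for e in (1, 2, 3) for c in ("J45", "E11")]
con.executemany("INSERT INTO diagnoses VALUES (?, ?)", rows)

# The defect: counting joined rows inflates the claim count to 6.
wrong, = con.execute("SELECT COUNT(*) FROM diagnoses").fetchone()
# The fix per [VC-2]: count distinct encounters -> 3.
fixed, = con.execute(
    "SELECT COUNT(DISTINCT encounter_id) FROM diagnoses").fetchone()
```

The result is either 3 or 6; there is nothing to interpret, which is what makes [VC-3] executable rather than conceptual.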
Acceptance Criteria in Jira for Non-Functional and Compliance Requirements
Non-functional requirements – performance, security, accessibility, data retention – generate acceptance criteria that no template handles well. They’re about thresholds, constraints, and audit-visible behaviors. Free-form AC is the only format that works cleanly.
Performance Acceptance Criteria
[AC-1] The patient search function returns results in under 2 seconds for 95% of queries under a concurrent load of 200 users.
[AC-2] Search queries returning zero results complete within the same 2-second threshold. No special handling is applied to empty result sets that would inflate the SLA metric.
[AC-3] Performance is validated using JMeter with a 200-user concurrency profile against the QA environment. The test script and results artifact are attached to this ticket before QA sign-off.
[AC-4] Response time degradation beyond the 2-second threshold triggers an alert in the monitoring dashboard. Alert configuration is included in the deployment checklist for this story.
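The [AC-1] threshold – 95% of queries under 2 seconds – is a percentile check over a sample of response times. A sketch under stated assumptions: a real run would take its samples from the JMeter results artifact named in [AC-3], and the function name is illustrative.

```python
# [AC-1] at least 95% of sampled response times fall under the threshold.
def meets_sla(samples, threshold=2.0, percentile=0.95):
    within = sum(1 for s in samples if s < threshold)
    return within / len(samples) >= percentile
```

Phrasing the criterion as a percentile rather than an average matters: a mean of 1.9 seconds can hide a long tail of 10-second outliers that a p95 check catches.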
Security and HIPAA Compliance Acceptance Criteria
[AC-1] All GET, POST, PUT, and DELETE requests to /api/v2/patients/* require a valid JWT token in the Authorization header. Requests without a token return HTTP 401.
[AC-2] Expired tokens return HTTP 401 with error code TOKEN_EXPIRED. Tokens from a revoked session return HTTP 403 with error code SESSION_REVOKED.
[AC-3] Every API call to /api/v2/patients/* is logged in the audit table with: user ID, action type, resource ID accessed, timestamp (UTC), and source IP. Logging occurs regardless of whether the request succeeds or fails.
[AC-4] Audit logs are retained for a minimum of 6 years per HIPAA retention requirements (45 CFR §164.530(j)). Deletion of audit log entries is not permitted by any application role.
[AC-5] Penetration testing of the /api/v2/patients/* endpoints is performed by the security team before UAT promotion. Results are attached to this story. Any critical findings block promotion.
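The status-code and error-code mapping in [AC-1] and [AC-2] can be sketched as a single authorization gate. Token states are modeled as plain flags for illustration; a real implementation would verify a JWT signature and consult a session store, and only the codes themselves come from the criteria.

```python
def authorize(token):
    """Return (http_status, error_code) for a request's token state."""
    if token is None:
        return 401, None                 # [AC-1] no token in Authorization header
    if token.get("expired"):
        return 401, "TOKEN_EXPIRED"      # [AC-2]
    if token.get("revoked"):
        return 403, "SESSION_REVOKED"    # [AC-2]
    return 200, None
```

Note the deliberate distinction the criteria draw: an expired token is a 401 (re-authenticate), a revoked session is a 403 (forbidden), and conflating the two would lose audit-relevant information.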
Data Retention and Archiving Acceptance Criteria
[AC-1] Claims records with a Date of Service older than 7 years are moved to the claims_archive table during the nightly batch job. Records are removed from claims_active after successful archive verification.
[AC-2] The archive job verifies record count and checksum match between source and destination before deletion. Any mismatch aborts the deletion and generates an alert to the DBA on-call.
[AC-3] Archived records are retrievable within 5 business days upon authorized request. Retrieval is performed via the Admin Data Access tool, accessible only to users with the Data Steward role.
[AC-4] The archive process does not affect records with an open audit flag. Records flagged for compliance review remain in claims_active regardless of age until the flag is cleared.
[AC-5] The nightly batch job produces a log entry in batch_execution_log with: job name, run timestamp, records archived, records deleted, and status (Success / Partial / Failed).
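The [AC-2] safeguard – verify count and checksum before deleting from the source – is the criterion most worth scripting. A minimal sketch, assuming records can be serialized and order-independently hashed; the record shape and hashing scheme are illustrative, and only the abort-on-mismatch rule comes from the criterion.

```python
import hashlib

def checksum(records):
    """Order-independent checksum over a batch of records."""
    h = hashlib.sha256()
    for r in sorted(records):
        h.update(repr(r).encode())
    return h.hexdigest()

# [AC-2] delete from claims_active only when count and checksum both match.
def safe_to_delete(source, archived):
    return (len(source) == len(archived)
            and checksum(source) == checksum(archived))
```

The point of the criterion is the ordering: verification gates deletion, so a partial archive write can never silently destroy the only copy of a claim.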
Where to Put Acceptance Criteria in Jira and How to Structure the Field
Jira doesn’t have a native “Acceptance Criteria” field in its default configuration. Teams work around this in three main ways, each with real trade-offs.
Option 1: Acceptance Criteria Section in the Description Field
The simplest approach is a clearly labeled “Acceptance Criteria” section at the bottom of the story’s Description field, separated from the user story narrative by a heading. This requires no admin configuration and works in any Jira project type.
The downside is visibility. When the description field is long, the AC section scrolls off the visible area. QA analysts and developers miss it. Adding a Jira issue template that pre-populates the Description with a structured format (Story Description / Context / Acceptance Criteria / Notes) addresses this.
h2. Story Description
As a [role], I want [goal] so that [reason].
h2. Context / Background
[Relevant system context, dependencies, or regulatory notes]
h2. Acceptance Criteria
[AC-1] …
[AC-2] …
[AC-3] …
h2. Out of Scope
[What this story does NOT include]
h2. Links / References
[Related stories, requirements documents, data specs]
Option 2: Custom Paragraph Field for Acceptance Criteria
A Jira admin can add a custom paragraph-type field named “Acceptance Criteria” to the story screen. This separates AC from the story description, makes it easier to find, and allows Jira JQL filters to check whether the field is empty – flagging stories that haven’t been refined with AC before sprint planning.
JQL example: issuetype = Story AND "Acceptance Criteria" is EMPTY AND sprint in openSprints(). Run this before sprint planning and you’ll see exactly which stories need AC written before the team can commit to them.
The limitation: this field is still unstructured text. It doesn’t support progress tracking or automation triggers based on individual criterion completion.
Option 3: Checklist Plugin (Smart Checklist or Issue Checklist for Jira)
Checklist plugins add structured, checkable AC items to Jira issues. Each criterion becomes an actionable checklist item that can be marked Developed or Tested individually. Jira automation can then trigger workflow transitions – such as moving a story to “Ready for Release” – only when all mandatory AC items are checked.
This is the most powerful option for teams that want AC to drive workflow, not just document requirements. The trade-off is setup cost and the need to maintain plugin licenses. For teams with 20+ stories per sprint, the workflow automation benefit is worth it. For small teams, the Description field approach works fine.
| Option | Setup Required | Progress Tracking | Automation-Ready | Best For |
|---|---|---|---|---|
| Description Section | None – use template | None | No | Small teams, quick setup, simple stories |
| Custom Paragraph Field | Jira Admin required | None | Partial (empty-field JQL) | Mid-size teams wanting AC separated from description |
| Checklist Plugin | Plugin install + config | Per-criterion status | Yes – workflow triggers | Large programs, compliance-heavy environments, audit trails |
Who Writes Acceptance Criteria in Jira and When
The ownership question creates friction on almost every team. The Product Owner is accountable for the backlog and its content – which means they’re accountable for AC quality. But POs often lack the technical specificity to write API validation rules, SQL constraints, or security criteria. The Business Analyst fills that gap in most programs. In practice, the BA writes AC during or after requirements analysis and the PO reviews and accepts them before the story enters refinement.
The timing matters. AC written during sprint planning are written under time pressure, often by consensus, and rarely testable. AC written during backlog refinement – one or two sprints ahead of delivery – are written with enough lead time for QA to review them and flag gaps before the build starts. That’s the sequence that works: BA drafts, PO approves, QA reviews during refinement, team estimates.
In SAFe environments, AC are part of the story at PI planning. They’re expected to be present before teams commit stories to a Program Increment. The SAFe guidance on Story requirements states that AC should be defined as part of the story before it enters a sprint – not during it.
Common Acceptance Criteria Mistakes in Jira and How to Fix Them
The mistakes that appear most often on real programs fall into predictable patterns. Each one has a specific fix.
Mistake 1: AC That Describe Intent, Not Behavior
Statements like “the report should show accurate data” or “users should only see what they are allowed to see” express intent, not a verifiable outcome. Rewrite each as a pass/fail statement that names the subject, the behavior, and the condition – the pattern used throughout the examples above.
Mistake 2: AC Written as Implementation Instructions
The AC describes what the system does. The developer decides how. AC that prescribe implementation constrain the solution unnecessarily and blur the line between requirements and technical design.
Mistake 3: AC That Combine Multiple Conditions in One Statement
A combined criterion such as “The date field rejects future dates, rejects dates before 1900, and is mandatory” cannot fail on just one condition, so the QA result is never precise. Split it:
[AC-1] The date field rejects dates later than today’s date. Error: “Date cannot be in the future.”
[AC-2] The date field rejects dates earlier than January 1, 1900. Error: “Please enter a valid date.”
[AC-3] The date field is mandatory. Submitting without a date blocks submission and displays: “Date is required.”
Mistake 4: No Negative Test Cases
Most teams write AC for the happy path only. They describe what happens when the user does everything correctly and omit what happens when they don’t. Every story has a failure mode. Every defect fix has a regression risk. Both need AC.
For every positive criterion (“The system accepts valid ICD-10 codes”), add the corresponding negative (“The system rejects invalid ICD-10 codes and displays the appropriate error message”). The QA team will test both. The AC should reflect that.
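One way to keep positive and negative pairs honest is to make the pairing explicit in the test data itself, labeled by AC number so failures report precisely. A sketch only: `validate()` is a stand-in for the real field validator, and the cases loosely mirror the ICD-10 examples earlier in the article.

```python
# Stand-in validator for illustration: non-empty, starts with a letter,
# at most 8 characters. The real system's rules would be richer.
def validate(code):
    return bool(code) and code[0].isalpha() and len(code) <= 8

# Each criterion gets a positive case and its corresponding negative.
CASES = [
    ("AC-1 positive", "J45", True),
    ("AC-1 negative", "", False),
    ("AC-4 negative", "A01.12345", False),
]

def run_cases():
    """Return the names of failed criteria; empty list means all pass."""
    return [name for name, code, expected in CASES
            if validate(code) != expected]
```

A failing run reports “AC-4 negative” rather than “the validation test failed,” which is the same precision argument made for numbered AC throughout this article.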
Mistake 5: AC Written After Development Starts
If a developer has already built the feature before AC are written, the AC will document what was built – not what was needed. That’s not acceptance criteria. That’s a feature description. Writing AC after development is the single most reliable way to guarantee that UAT fails, because the business sees what was built and discovers it doesn’t match what they expected.
The fix is procedural: no story enters a sprint without complete, approved AC. The Jira custom field approach and the Definition of Ready both enforce this. Teams that struggle with this consistently have a refinement cadence problem, not an AC writing problem.
Acceptance Criteria and the Testing Lifecycle in Jira
Acceptance criteria are the input to the Software Testing Life Cycle. Every test case derived from AC should link back to the criterion it validates. In Jira with Xray or Zephyr Scale, this linkage is explicit – test cases link to the parent story, and test execution results show which AC passed and which failed.
Without that traceability, QA reports say “Story PROJ-123 FAILED” without explaining which criterion failed. The developer fixes what they guess is the problem. The QA analyst retests. Often, it’s the wrong thing. Numbered AC – [AC-1] through [AC-N] – combined with test case labels that reference those numbers eliminate this ambiguity entirely.
For defect tickets specifically, the verification criteria (VC) serve as the retest checklist. A QA analyst who retests a defect without checking each VC individually may miss a regression introduced by the fix. The VC aren’t formality – they’re the structured guide to confirming the fix is complete and didn’t break anything adjacent.
ISTQB Advanced Level Test Manager guidance treats acceptance criteria as the basis for acceptance testing – particularly User Acceptance Testing (UAT). When UAT testers encounter a feature without clear AC, they test based on memory, assumption, or what “feels right.” The failure rate in UAT is directly proportional to the quality of AC written before development started. This isn’t correlation – it’s causation.
Edge Cases: When Standard AC Approaches Break Down
Ideal AC writing assumes stable requirements, available BAs, and time for refinement. Real programs rarely deliver all three simultaneously.
Legacy systems with undocumented behavior. When integrating with a legacy system that has no documentation, the “expected behavior” in the AC may not be known until the integration is tested. In this case, write what’s known as the AC, explicitly label the assumptions (“Based on the legacy API documentation from 2019”), and add a “Verify with legacy system owner before sprint start” note. Undocumented behavior is a risk item, not a reason to skip AC.
Rapidly changing requirements mid-sprint. When a requirement changes after AC are written and the sprint has started, the AC must be updated in Jira before the developer proceeds. A story with outdated AC is a story being built to the wrong spec. The update requires PO approval and a note in the AC history. This is a scope change and should be treated as one.
Stories that are too large to have complete AC at planning. Some stories are pulled into sprints before full requirements clarity exists. The honest answer is that these stories shouldn’t be in the sprint. The pragmatic answer is: write AC for the known scope, explicitly list what is out of scope, and add a “Risk: AC is incomplete pending [specific clarification]” note. At least then everyone knows the risk going in.
Defects where the root cause isn’t clear yet. Logging a defect before the root cause is identified is common. In this case, the verification criteria can’t be complete until the investigation is done. Write preliminary VC based on the observed failure, add “[PRELIMINARY – update after root cause analysis]” to the VC section, and update it once the investigation concludes. Incomplete VC are better than no VC – they at least communicate what is known.
Acceptance Criteria in Jira: AC vs. Definition of Done
Teams frequently confuse acceptance criteria with the Definition of Done. They serve different functions. AC are specific to one issue. The Definition of Done is a shared quality gate applied to every issue of a given type.
| Dimension | Acceptance Criteria (AC) | Definition of Done (DoD) |
|---|---|---|
| Scope | One story or defect | All stories in the project |
| Written By | BA / PO per story | Team agreement at project start |
| Content | Specific, testable functional outcomes | Code review, unit tests, documentation, PO approval |
| Changes Per Story | Yes – unique per issue | No – consistent across sprint |
| Jira Location | Description field or custom AC field | Team wiki, Confluence, or sprint template |
| Validation By | QA + Product Owner | Scrum Master + team in sprint review |
A story can meet every AC and still not satisfy the DoD if, for example, no peer code review was done. Conversely, a story that passes the DoD checklist but fails AC-3 isn’t done – it just met the quality process requirements without meeting the functional ones. Both must pass. They’re not interchangeable.
Connecting Acceptance Criteria to the Broader SDLC
Acceptance criteria aren’t just a Jira field – they’re a connective layer across the Software Development Life Cycle. They start at requirements (the BA writes them from business rules). They inform development (the developer builds to them). They drive testing (QA derives test cases from them). They control deployment (the PO signs off when all are met). They create the audit trail (compliance teams trace implemented features back to their verified criteria).
In a regulated environment – healthcare, financial services, government – that audit trail is not optional. FDA 21 CFR Part 11 for medical device software, HIPAA for patient data systems, and SOX for financial reporting all require documented evidence that the system was built and verified against defined requirements. Jira acceptance criteria – when written clearly and verified explicitly – are that evidence.
Teams that treat AC as a checkbox item (“write something in the field so the ticket can go to Sprint Ready”) produce audit exposure without the protection. Teams that treat AC as the contract between requirements and delivery produce systems that pass UAT, satisfy compliance reviews, and generate fewer production defects per release cycle. The investment is in the writing. The return is in the testing and the audit.
Take one active sprint story right now and read its acceptance criteria. Ask three questions: Can a QA analyst execute a test against each criterion without asking for clarification? Does each criterion cover exactly one testable condition? Is there at least one negative test case per functional area? If any answer is no, fix the AC before the sprint continues. A 20-minute rewrite in refinement prevents a two-day rework cycle in UAT.
References:
1. BABOK v3 – Business Analysis Body of Knowledge, IIBA (iiba.org)
2. Acceptance Criteria: Definition, Examples & Tips – Atlassian (atlassian.com)
