The Scrum Framework: A QA Perspective

Scrum is the most widely adopted agile framework, used by an estimated 66% of agile teams worldwide. As a QA engineer, you will almost certainly work in a Scrum environment at some point in your career. Understanding how testing fits into Scrum is not optional — it is essential.

This lesson covers Scrum from a tester’s perspective: not just what the framework is, but how you actively contribute to every ceremony and artifact.

Scrum Roles and QA

Scrum defines three roles. Understanding each one helps you know who to collaborate with and when.

Product Owner (PO)

The Product Owner manages the product backlog and prioritizes work based on business value. As a QA engineer, you work closely with the PO to:

  • Clarify acceptance criteria for user stories
  • Identify edge cases the PO may not have considered
  • Provide risk assessments that influence prioritization
  • Demo tested features during Sprint Review

Scrum Master (SM)

The Scrum Master facilitates the process and removes impediments. From a QA perspective, the SM helps when:

  • Test environments are unavailable or unstable
  • Dependencies between teams block testing
  • The team needs to improve testing practices (raised in retrospectives)

Development Team

In Scrum, the Development Team is cross-functional. This means QA engineers are full members of the Development Team, not a separate group. You share responsibility for delivering a done increment every sprint.

Key insight: In Scrum, there is no separate “QA team.” Testers are embedded within the Development Team. The whole team owns quality.

Scrum Events and Testing Activities

Scrum defines five events. QA engineers participate in all of them.

```mermaid
graph LR
    subgraph Sprint
        SP[Sprint Planning] --> DS[Daily Standup]
        DS --> DEV[Development & Testing]
        DEV --> DS
        DEV --> SR[Sprint Review]
        SR --> RT[Sprint Retrospective]
    end
    RT --> SP
    style SP fill:#4CAF50,color:#fff
    style DS fill:#2196F3,color:#fff
    style DEV fill:#FF9800,color:#fff
    style SR fill:#9C27B0,color:#fff
    style RT fill:#F44336,color:#fff
```

Sprint Planning

Duration: Up to 8 hours for a 4-week sprint (proportionally less for shorter sprints).

QA activities during Sprint Planning:

  • Review user stories and acceptance criteria
  • Estimate testing effort for each story
  • Identify test dependencies (data, environments, third-party services)
  • Flag stories that carry high risk and may need extra testing
  • Help the team determine how much work they can commit to

QA tip: If acceptance criteria are vague, ask questions during planning — not two days before the sprint ends.

Daily Standup

Duration: 15 minutes maximum.

QA contribution:

  • Report which stories you are currently testing
  • Raise blockers (environment down, missing test data, unclear requirements)
  • Communicate which stories passed or failed testing
  • Coordinate with developers on bug fixes

Anti-pattern to avoid: The standup is not a status report to the Scrum Master. It is a synchronization meeting for the team. Speak to your teammates, not to the SM.

Sprint Review

Duration: Up to 4 hours for a 4-week sprint.

QA activities:

  • Demo tested features to stakeholders
  • Present test coverage metrics and quality indicators
  • Highlight risks in features that were not fully tested
  • Gather feedback that may generate new test scenarios

Sprint Retrospective

Duration: Up to 3 hours for a 4-week sprint.

QA topics to raise:

  • Testing bottlenecks that slowed the sprint
  • Quality improvements (automation opportunities, environment stability)
  • Collaboration issues between developers and testers
  • Process improvements for the next sprint

Testing Throughout the Sprint

A common misconception is that testing happens only after development is complete. In Scrum, testing is a continuous activity:

```mermaid
gantt
    title Sprint Timeline with Testing Activities
    dateFormat X
    axisFormat %s
    section Day 1-2
    Refine acceptance criteria :a1, 0, 2
    Write test cases :a2, 0, 2
    section Day 3-7
    Test completed stories :a3, 2, 7
    Exploratory testing :a4, 3, 7
    section Day 8-9
    Regression testing :a5, 7, 9
    Bug verification :a6, 7, 9
    section Day 10
    Final validation :a7, 9, 10
```

Days 1-2: Write test cases and prepare test data while developers start coding. Review acceptance criteria with the PO.

Days 3-7: Test stories as they are completed. Do not wait for all stories to be done. Test each one as developers mark it ready.

Days 8-9: Run regression tests. Verify bug fixes. Perform exploratory testing on integrated features.

Day 10: Final validation, update test documentation, prepare for Sprint Review.

The Definition of Done (DoD)

The Definition of Done is a checklist that every product backlog item must satisfy before it is considered complete. It is agreed upon by the entire team and applies to every story.

A Strong Definition of Done Includes Quality Criteria

Here is an example DoD with quality-focused items highlighted:

| Criterion | Category |
|---|---|
| Code written and peer-reviewed | Development |
| Unit tests written and passing (≥80% coverage) | Quality |
| All acceptance criteria verified by QA | Quality |
| No critical or high-severity bugs open | Quality |
| Exploratory testing performed | Quality |
| Regression tests passing | Quality |
| Feature deployed to staging environment | Deployment |
| Documentation updated | Documentation |
| Product Owner accepted the story | Acceptance |

Why the DoD Matters for QA

Without a clear DoD, “done” means different things to different people. A developer might consider a story done when the code compiles. A PO might consider it done when it looks right in a demo. The DoD eliminates this ambiguity.

Your responsibility as QA: Advocate for quality criteria in the DoD. If the team’s DoD does not include testing, it is incomplete.

User Stories and Acceptance Criteria

In Scrum, requirements are expressed as user stories with acceptance criteria. These are the foundation of your test cases.

User story format:

As a [user role],
I want to [action],
So that [benefit].

Acceptance criteria example:

Given I am on the login page
When I enter valid credentials
Then I should be redirected to the dashboard
And I should see a welcome message with my name

QA enhancement: For every acceptance criterion, think about the negative cases too:

  • What happens with invalid credentials?
  • What happens with empty fields?
  • What happens with special characters in the username?

These negative cases should become additional test cases.
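The Given/When/Then criterion above, plus its negative cases, maps directly to automated checks. Here is a minimal Python sketch; the `login` function is a hypothetical stand-in for the application under test, not a real API:

```python
# Hypothetical login validator, standing in for the system under test.
# In a real suite, each assertion below would be a separate test case.
def login(username: str, password: str) -> dict:
    """Return a result dict resembling the login page's behaviour."""
    if not username or not password:
        return {"ok": False, "error": "Fields must not be empty"}
    if username == "alice" and password == "s3cret":  # known test account
        return {"ok": True, "redirect": "/dashboard",
                "message": f"Welcome, alice!"}
    return {"ok": False, "error": "Invalid credentials"}

# Positive case: mirrors the Given/When/Then acceptance criterion.
result = login("alice", "s3cret")
assert result["ok"] and result["redirect"] == "/dashboard"
assert "alice" in result["message"]

# Negative cases: the extra scenarios QA adds around the criterion.
assert not login("alice", "wrong-password")["ok"]  # invalid credentials
assert not login("", "")["ok"]                     # empty fields
assert not login("al<script>ce", "s3cret")["ok"]   # special characters
```

Notice that one acceptance criterion produced four test cases: one positive path and three negatives.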

Scrum Artifacts and QA

Product Backlog

The prioritized list of everything that might be needed in the product. QA contributes by:

  • Adding testing-related stories (e.g., “Set up performance testing infrastructure”)
  • Flagging technical debt related to testability
  • Reviewing upcoming stories to plan testing approach

Sprint Backlog

The set of items selected for the current sprint plus the plan for delivering them. QA tasks (write test cases, execute tests, automate scenarios) should be visible in the sprint backlog.

Product Increment

The sum of all completed product backlog items at the end of a sprint. This increment must meet the Definition of Done and be potentially releasable.

Exercise: Create a Testing Plan for a Sprint

You are a QA engineer on a Scrum team. Your team works in 2-week sprints. The Product Owner has selected the following stories for the upcoming sprint:

User Stories:

  1. US-101: As a user, I want to reset my password via email so that I can regain access to my account (8 story points)
  2. US-102: As an admin, I want to export user data as CSV so that I can analyze usage patterns (5 story points)
  3. US-103: As a user, I want to receive push notifications for new messages so that I stay informed (13 story points)
  4. US-104: As a user, I want to filter search results by date range so that I find relevant content faster (3 story points)

Your task:

Create a sprint testing plan that includes:

  1. Testing approach for each story (what types of testing you will perform)
  2. Test dependencies and prerequisites
  3. A day-by-day testing schedule across the 10-day sprint
  4. Risk assessment: which stories carry the highest risk and why
  5. Your contribution to the Definition of Done

Hint

Consider these factors for your plan:

  • Story points indicate relative complexity — US-103 (13 points) is the most complex
  • Password reset involves security testing (not just functional)
  • CSV export needs data validation testing
  • Push notifications require testing across multiple platforms
  • Date filter needs boundary value testing

Start by identifying the riskiest story and allocate more testing time to it.

Sample Solution

Sprint Testing Plan

1. Testing Approach by Story:

| Story | Testing Types |
|---|---|
| US-101 (Password Reset) | Functional, security, email integration, negative testing, cross-browser |
| US-102 (CSV Export) | Functional, data validation, performance (large datasets), format verification |
| US-103 (Push Notifications) | Functional, cross-platform (iOS/Android/Web), timing, offline scenarios |
| US-104 (Date Filter) | Functional, boundary value, UI/UX, performance with large result sets |
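The boundary value testing planned for US-104 can be sketched in Python. The `filter_by_date` helper below is a hypothetical, inclusive-on-both-ends stand-in for the feature's backend:

```python
from datetime import date

# Hypothetical inclusive date-range filter standing in for US-104.
def filter_by_date(records, start: date, end: date):
    return [r for r in records if start <= r["created"] <= end]

records = [
    {"id": 1, "created": date(2024, 1, 1)},
    {"id": 2, "created": date(2024, 1, 15)},
    {"id": 3, "created": date(2024, 1, 31)},
]

# Boundary values: records exactly on the start and end dates must match.
hits = filter_by_date(records, date(2024, 1, 1), date(2024, 1, 31))
assert [r["id"] for r in hits] == [1, 2, 3]

# Just outside the boundaries: nothing should match.
assert filter_by_date(records, date(2024, 2, 1), date(2024, 2, 28)) == []

# Inverted range (end before start) is a classic negative case.
assert filter_by_date(records, date(2024, 1, 31), date(2024, 1, 1)) == []
```

Whether the boundaries are inclusive is exactly the kind of ambiguity to clarify with the PO before testing starts.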

2. Dependencies:

  • US-101: Test email server configuration, test accounts with known credentials
  • US-102: Sample dataset with 10K+ records for performance testing
  • US-103: Test devices (iOS and Android), push notification service credentials
  • US-104: Database with records spanning various date ranges

3. Day-by-Day Schedule:

| Day | Activity |
|---|---|
| 1 | Write test cases for US-104 (simplest, likely done first) and US-101 |
| 2 | Write test cases for US-102 and US-103. Set up test data. |
| 3-4 | Test US-104 (expected to be dev-complete by day 3). Begin testing US-101. |
| 5-6 | Complete US-101 testing. Begin US-102 testing. |
| 7-8 | Test US-103 (complex, expected dev-complete by day 7). Cross-platform testing. |
| 9 | Regression testing. Bug verification. Exploratory testing on integrated features. |
| 10 | Final validation. Update test documentation. Prepare Sprint Review demo. |

4. Risk Assessment:

| Story | Risk Level | Reason |
|---|---|---|
| US-103 | HIGH | Most complex (13 pts), cross-platform, external service dependency |
| US-101 | HIGH | Security-sensitive, email delivery reliability |
| US-102 | MEDIUM | Data accuracy concerns with large datasets |
| US-104 | LOW | Simple feature, well-defined scope |

5. Definition of Done Contribution:

  • All acceptance criteria verified with passing test cases
  • Negative/edge case testing completed for each story
  • No open critical or high-severity bugs
  • Cross-browser testing performed (US-101)
  • Cross-platform testing performed (US-103)
  • Regression suite passing
  • Test cases documented and linked to stories

Sprint Testing Anti-Patterns

Knowing what not to do is as important as knowing what to do. Here are the most common sprint testing anti-patterns:

1. The Mini-Waterfall

Problem: All development happens in the first half of the sprint, all testing in the second half. This creates a testing bottleneck and guarantees bugs are found late.

Solution: Developers and testers work in parallel. Test stories as they are completed, not at the end.

2. The Skipped Regression

Problem: The team skips regression testing because “we only changed one thing.” That one change breaks three other features.

Solution: Always run regression tests, even for small changes. Automate regression tests to make this painless.

3. The Invisible Tester

Problem: QA does not participate in Sprint Planning or Retrospective. Testing is treated as an afterthought.

Solution: QA must attend and actively contribute to all Scrum events. If you are excluded, raise it as an impediment.

4. The “Done” That Isn’t Done

Problem: Stories are marked as “done” when code is written, skipping testing. QA finds critical bugs in supposedly done stories at the end of the sprint.

Solution: Enforce the Definition of Done. A story is not done until all DoD criteria, including testing, are met.

5. The Bug Sprint

Problem: The team spends an entire sprint fixing bugs from the previous sprint instead of delivering new value.

Solution: Fix bugs as they are found, within the same sprint. Include bug-fixing time in sprint capacity planning.

Pro Tips for QA in Scrum

  1. Pair with developers. When a developer finishes a feature, sit together for 10 minutes. They demo it to you, you ask questions. This catches obvious issues before formal testing.

  2. Automate the Definition of Done. If your DoD requires passing unit tests and code coverage checks, automate these checks in the CI pipeline. The DoD should be verifiable, not aspirational.
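As a sketch of making the DoD verifiable, here is a small Python gate that a CI pipeline could run after the coverage step. The 80% figure matches the example DoD above; the report format is an assumption (a Cobertura-style `coverage.xml` with a `line-rate` attribute on the root element, as emitted by tools like coverage.py):

```python
import xml.etree.ElementTree as ET

THRESHOLD = 0.80  # matches the example DoD's ">=80% coverage" criterion

def coverage_meets_dod(report_xml: str, threshold: float = THRESHOLD) -> bool:
    """Check a Cobertura-style report's overall line-rate against the DoD."""
    root = ET.fromstring(report_xml)
    return float(root.get("line-rate", "0")) >= threshold

# Minimal stand-in for the coverage.xml a coverage tool would generate.
sample_report = '<coverage line-rate="0.84"></coverage>'
assert coverage_meets_dod(sample_report)            # 84% passes the gate
assert not coverage_meets_dod(sample_report, 0.90)  # but not a 90% bar
```

In CI, the same check would read the generated report from disk and exit non-zero so the pipeline fails the build when the DoD criterion is not met.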

  3. Track the “escaped defect” metric. Count how many bugs are found in production versus in the sprint. A decreasing trend means your sprint testing is improving.
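A minimal sketch of that metric, using hypothetical per-sprint bug counts:

```python
def escaped_defect_rate(found_in_sprint: int, found_in_production: int) -> float:
    """Share of a sprint's defects that escaped to production."""
    total = found_in_sprint + found_in_production
    return found_in_production / total if total else 0.0

# Hypothetical counts for three consecutive sprints:
# (bugs caught during the sprint, bugs found later in production).
sprints = [(20, 5), (22, 3), (25, 2)]
rates = [escaped_defect_rate(caught, escaped) for caught, escaped in sprints]

# A decreasing rate sprint over sprint suggests sprint testing is improving.
assert rates[0] > rates[1] > rates[2]
```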

  4. Refine stories before the sprint. Participate in backlog refinement sessions. The more you clarify requirements before Sprint Planning, the fewer surprises during testing.

  5. Build a sprint testing checklist. Create a reusable checklist for each sprint: test environment ready, test data prepared, regression suite updated. Consistency prevents oversights.