Lessons learnt integrating test into the agile lifecycle
Fran O'Hara, Inspire Quality Services
Fran.ohara@inspireqs.ie | www.inspireqs.ie
© 2015 Inspire Quality Services
Scrum Framework
Roles: Product Owner; ScrumMaster; Development Team
Events: Sprint planning; Sprint review/demo; Sprint retrospective; Daily scrum meeting; (Backlog refinement/grooming)
Artifacts: Product backlog; Sprint backlog; Burndown charts; Definition of Done
Rules: Each component within the framework serves a specific purpose and is essential to Scrum's success and usage. (Scrum Guide @ Scrum.org)
Beware the Scrum 'Developer'! [Diagram: the 'Developer' role spans Analysis, Development and Testing; the Tester sits within it]
Test Managers?
[Diagram from Lisa Crispin, 2011]
Scrum events across the sprint [timeline diagram showing which roles attend each event]
Legend: PO: Product Owner; SM: ScrumMaster; TM: Development Team; CU: Customer
Each event is timeboxed. Times shown are the maximum times from the Scrum Guide at scrum.org, based on a one-month sprint. Each event is an opportunity to inspect and adapt.
Quality & Test
Quality is not equal to test. "Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other. Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved."
From 'How Google Tests Software', James Whittaker et al.
Is testing fully integrated?
[Diagram: three scenarios, A, B and C, showing how 'Code & Bug Fix' and 'Test' activities are distributed across Sprint 1 and Sprint 2. In A and B testing lags behind coding, spilling into a later phase or sprint; in C testing is interleaved with coding throughout each sprint.]
Achieving Scenario C
Prerequisites:
- Test driven at 'acceptance' level (story level)
- Small stories (roughly ½ to 6 person-days of effort as a guide)
- Prioritised and implemented in sequence (e.g. 2-3 at a time), including test execution
Sprint backlog: no 'verified' column!
User Story Example: Hotel Reservation Cancellation
Card: As a user, I want to cancel a reservation so that I avoid being charged the full rate.
Conversation:
- What if I am a premium member: do I have charges?
- When is a non-premium member charged, and how much?
- How do these vary depending on when the cancellation occurs?
- Do we need to send the user confirmation by email?
- When does the hotel need to be notified?
- What if the user has paid a deposit?
Confirmation:
- Verify a premium member can cancel the same day without a fee
- Verify a non-premium member is charged 10% for a same-day cancellation but is otherwise not charged
- Verify an email confirmation is sent to the user with appropriate information
- Verify that the hotel is notified within 10 minutes of a cancellation
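The confirmation items above are exactly the kind of checks that become automated story-level acceptance tests. A minimal sketch of the fee rules in Python; the `cancellation_fee` function and the rates used are illustrative assumptions drawn only from the story's confirmation criteria, not a real reservation system API:

```python
from decimal import Decimal

def cancellation_fee(room_rate, is_premium, same_day):
    """Fee charged when a reservation is cancelled.

    Hypothetical rule from the story's confirmation criteria:
    premium members cancel same-day for free; non-premium members
    pay 10% of the rate for a same-day cancellation, else nothing.
    """
    if same_day and not is_premium:
        return room_rate * Decimal("0.10")
    return Decimal("0")

# Story-level acceptance checks: each has a binary pass/fail result
assert cancellation_fee(Decimal("200"), is_premium=True, same_day=True) == Decimal("0")
assert cancellation_fee(Decimal("200"), is_premium=False, same_day=True) == Decimal("20")
assert cancellation_fee(Decimal("200"), is_premium=False, same_day=False) == Decimal("0")
```

Writing the checks before (or alongside) the code is what 'test driven at acceptance level' means in practice: the story is not done until all its confirmation items pass.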
Release/feature planning level: a testing perspective
Add value in release (re-)planning by:
- Supporting the Product Owner in writing user stories/epics and making sure they are testable
- Participating in the high-level risk analysis of those user stories/epics
- Ensuring estimation includes the testing perspective
- Planning the testing for the release/feature level, i.e. creating a test strategy/approach (resources, tools, test levels, static testing, test environments, test automation targets) based on the scope and risks identified for that release/feature, and on an evolving product backlog
- Playing a key role in defining the Definition of Done for the release and, later on, for the iteration/sprint
Adapted from the ISTQB Agile Tester Extension Syllabus
'Acceptance' Testing in Agile
An acceptance test is a formal description of the behaviour of a software product, generally expressed as an example or a usage scenario.
- In many cases the aim is that it should be possible to automate the execution of such tests by a software tool, either ad hoc to the development team or off the shelf.
- Like a unit test, an acceptance test is generally understood to have a binary result, pass or fail.
- For many Agile teams, acceptance tests are the main form of functional specification; sometimes the only formal expression of business requirements.
Also known as: the terms "functional test", "acceptance test" and "customer test" are used more or less interchangeably. A more specific term, "story test", referring to user stories, is also used, as in the phrase "story test driven development". (Agile Alliance)
'Acceptance' Testing: is it enough?
- May not be: a context/risk/strategy issue
  - Expand to fuller 'system' tests: functional testing; non-functional testing (performance, usability, etc.)
  - May still need more user story interaction tests, epic/feature-level testing, workflows, end-to-end business-scenario-focused User Acceptance Testing, etc.
  - System integration testing issues
  - Etc.
- A strategy and scheduling issue: risk-driven, adaptive
Maintaining Context [Diagram: priority versus granularity]
Agile Testing Quadrants – Risk! [Diagram]
Is testing fully integrated?
[Diagram: Sprints 1-3, each containing interleaved 'Code & Bug Fix' and 'Test', each producing a potentially releasable increment; the actual release (MMF) follows Sprint 3. Inputs: initial backlog; release and test planning.]
Functional: unit, component integration, story acceptance, story interaction, exploratory, etc., plus feature/system/system integration...
And non-functional: ......
Definition of 'Done'
- An agreement between the PO and the Team, evolving over time to increase quality and 'doneness'
- Used to guide the team in estimating and doing
- 'Done' may apply to a Product Backlog Item (PBI) and to an Increment
- Used by the PO to increase predictability and to accept Done PBIs
- A single DoD may apply across an organisation or a product; multiple teams on a product share the DoD
DoD Example
Story level:
- Unit tests passed
- Unit tests achieving 80% decision coverage
- Integration tests passed
- Acceptance tests passed, with traceability to story acceptance criteria
- Code and unit tests reviewed
- Static analysis has no important warnings
- Coding standard compliant
- Published to Dev server
Sprint level:
- Reviewed and accepted by PO
- End-to-end functional and feature tests passed
- All regression tests passing
- Exploratory testing completed
- Performance profiling/benchmarking complete
- Bugs committed in sprint resolved
- Deployment/release docs updated and reviewed
- User manual updated
Release level:
- Released to Stage server
- Deployment tests passed
- Deployment/release docs delivered
- Large-scale integration performance/stress testing passed
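A DoD like this only increases predictability if it is checked mechanically rather than by memory. A minimal sketch of treating the story-level list as a gate; the checklist entries and the coverage figure here are illustrative, not taken from any particular team's tooling:

```python
def dod_unmet(checks):
    """Return the DoD items that are not yet satisfied.

    `checks` maps each DoD item name to a boolean result, e.g. gathered
    from CI: test runner exit status, coverage report, review state.
    """
    return [name for name, passed in checks.items() if not passed]

# Illustrative story-level gate (coverage figure is made up)
story_dod = {
    "unit tests passed": True,
    "decision coverage >= 80%": 83.0 >= 80.0,
    "acceptance tests passed": True,
    "code and unit tests reviewed": False,
}
assert dod_unmet(story_dod) == ["code and unit tests reviewed"]
```

In practice each boolean would come from a CI step (e.g. a coverage tool's threshold check), so a story cannot be called 'done' while any item is still False.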
Conclusions on lessons learnt
- Prevention as well as detection
- Avoiding the mini-waterfall
- Activity versus artefact
- Test competence in the team
- Role of (test) management
Fran O'Hara, InspireQS | www.inspireqs.ie | fran.ohara@inspireqs.ie
Technical Debt
Symptoms of technical debt:
- Bugs found in production
- Incomprehensible, unmaintainable code
- Insufficient or unmaintainable automated tests
- Lack of CI
- Poor internal quality
- Etc.