Testing Your Testing Program
Presenters:
Claire Schmitt, Sr. Director, Optimization Consulting, Brooks Bell
David DeFranza, Director, Optimization Consulting, Brooks Bell
Hudson Arnold, Strategic Partnerships, Optimizely
Today’s Speakers
Claire Schmitt, Senior Director, Optimization Consulting, Brooks Bell
David DeFranza, Director, Optimization Consulting, Brooks Bell
Housekeeping
• We are recording
• Slides will be available after the webinar is complete
• There will be time to submit questions at the end of the presentation
Agenda
• Overview
• The Brooks Bell Maturity Model
• Key questions to ask about your own program
• Testing program problems + solutions
• Q&A
Who we are
Brooks Bell is a premier experimentation consultancy offering world-class brands a flexible, consulting-based operational model. Our team of testing experts creates custom, data-driven strategies that help companies solve the problems facing their business.
Services: Program Advisement • Experimentation Training • Full-Service Testing • Resource Augmentation
Building a testing program is hard Testing requires a lot of time and attention. As a result, testing managers tend to focus on daily tasks and not overall program performance.
We get tests out the door, but...
• Are we generating impact consistently?
• Does each test contribute to a growing body of knowledge?
• Is our process as efficient as it could be?
• Are we consistently increasing the skills of our team?
• Has participation across the organization increased?
• Does data drive our decision-making?
The Brooks Bell Maturity Model
Digital Experimentation Maturity Model
6 Key Program Pillars: Culture, Team, Process, Tech & Tools, Strategy, Performance
Culture Categories: Demand • Buy-in • Motivation • Testing resources/support • Levels of engagement • Training/education
Team Categories: Testing organization structure • Testing roles/responsibilities • Skillset • Resource availability • Executive sponsorship
Process Categories: Experiment design • Execution process • Documentation • Communication • Reporting + sharing • Standardization • Test coordination • Queue prioritization
Tech & Tools Categories: Tool stack • Data quality • Integration with existing systems • Tool feature/functionality • Tool utilization
Strategy Categories: Methodology • Data usage • Testing locations • Ideation maturity • Customer centricity • Journey • Iterative testing
Performance Categories: Goals • % capacity testing • Program measurement • Impact measurement • Results/Insights tracking
Digital Experimentation Maturity Landscape
LEVEL 1: Testing program in infancy • Goals may not be defined • Testing org structure not yet established • No processes or standardization • Strategy not informed by data • Testing and analytics tools in place
LEVEL 2: Testing is generally accepted • Goals and metrics established • Org structure in place, testing skills inconsistent • Basic processes in place • Strategy inconsistently informed by data • Testing and analytics tools integrated
LEVEL 3: People want to test and a community is forming • Program focused on maximizing opportunities • Resources are dedicated to testing • Standard, comprehensively managed process • All strategies informed by data • Some tool features are being maximized
LEVEL 4: Testing informs the majority of site decisions • Testing focused on impact • Cross-functional, experienced resources • Processes adopted and optimized for efficiency • All strategies informed by qualitative and quantitative data • Utilization of tool features maximized
LEVEL 5: Testing and data are innate • Testing impact and opportunities maximized • All individuals are versed in all aspects of testing • Processes are flexible for rapid deployment • Comprehensive consumer behavior strategy • Advanced features within the tool stack utilized
On a scale of Level 0 through 5, getting to Level 1 is a true sign of success.
Evaluating your own testing program
We need to stop and ask ourselves...
STRATEGY
• Is there a standard process for generating test ideas?
• Are test ideas typically informed by data?
• Is the ideation process widely adopted and consistently applied?
TEAM
• Is there basic knowledge of and experience with testing?
• Are resources at least partially dedicated to testing?
• Is an experienced, multi-disciplinary testing team in place?
PROCESS
• Has a test execution process been defined?
• Is the test execution process comprehensively managed?
• Is a standard process documented thoroughly and widely distributed?
TECH & TOOLS
• Has a testing tool been implemented?
• Is the testing tool fully integrated with an analytics platform?
• Are features of the experimentation stack maximized?
CULTURE
• Is there acceptance of testing as a decision-making tool?
• Is there a demand for testing and data?
• Is testing innate and embraced at every level of the organization?
PERFORMANCE
• Are some tests being launched?
• Are goals and success metrics for the program established?
• Is test impact consistently measured?
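To make these questions actionable, here is a minimal self-scoring sketch in Python. It assumes the three questions per pillar are ordered from basic to advanced; the example answers are illustrative assumptions, not a Brooks Bell scoring tool.

```python
# Minimal maturity self-assessment sketch. Answers are yes/no responses to
# the three questions per pillar above, ordered from basic to advanced.
# The example answers below are hypothetical.
PILLAR_ANSWERS = {
    "Strategy":     [True, True, False],
    "Team":         [True, False, False],
    "Process":      [True, True, False],
    "Tech & Tools": [True, True, True],
    "Culture":      [True, False, False],
    "Performance":  [True, True, False],
}

def pillar_level(answers):
    """Count consecutive 'yes' answers, starting from the most basic question."""
    level = 0
    for answer in answers:
        if not answer:
            break
        level += 1
    return level

for pillar, answers in PILLAR_ANSWERS.items():
    print(f"{pillar}: level {pillar_level(answers)} of 3")
```

The lowest-scoring pillars are where the "what to do first" recommendations that follow will pay off soonest.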
Increasing your testing maturity
As you build and execute your testing program, focus on:
• Accelerating testing output
• Maximizing KPI impact
• Improving customer experiences
• Increasing data-informed decisions
• Scaling your program
There are some common problems we see with each of the 6 pillars.
Problems with Culture
• Teams do not understand what testing is or why they should test
• Lack of motivation or excitement to test
• No training or foundation in testing
• Lack of community resources to support testing efforts
• Limited or no engagement by senior management
What to do first: Hold a Testing Summit
Bring stakeholders together for a full- or half-day event that includes case study reviews, training sessions, ideation roundtables, and guest speakers. Though it can take a lot of planning and coordination, such an event offers a valuable opportunity to kick-start a stuck culture.
Problems with Team
• Organizations do not have a Center of Excellence or the right testing program structure
• Roles/responsibilities are not clear
• Not the right resources, or not enough of them
• Lack of skills
What to do first: Institute a Training Program
It would be great to supercharge an inexperienced team with fresh talent, but doing so is rarely practical. Instead, institute a regular training program, which could be as simple as weekly "brown bag" lunches or a regular reading group, to build skills among the team already available.
Problems with Process
• Inefficient execution process and lack of standardization
• Communication challenges
• Lack of clear and consistent documentation
• Experiment design lacks rigor
• Insufficient prioritization of tests
• Results aren't being shared
What to do first: Document the Process
If your process feels stuck or broken, it's time to capture it and create a formal outline, or conduct a thorough review if such an outline already exists. The outcome should be a detailed RACI matrix (Responsible, Accountable, Consulted, Informed) that defines roles and responsibilities from initial ideation to the final implementation of winning treatments.
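As a concrete illustration of that outcome, here is a hypothetical RACI sketch in Python; the process steps and role assignments below are invented placeholders, not a recommended structure.

```python
# Hypothetical RACI sketch. R = Responsible, A = Accountable,
# C = Consulted, I = Informed. Steps and roles are placeholders.
raci = {
    "Ideation":         {"R": "Analyst",     "A": "Program Manager", "C": "UX",      "I": "Executives"},
    "Design & build":   {"R": "Developer",   "A": "Program Manager", "C": "Analyst", "I": "Stakeholders"},
    "QA & launch":      {"R": "QA Engineer", "A": "Program Manager", "C": "IT",      "I": "Stakeholders"},
    "Analysis":         {"R": "Analyst",     "A": "Program Manager", "C": "Data",    "I": "Executives"},
    "Implement winner": {"R": "Developer",   "A": "Product Owner",   "C": "Analyst", "I": "Executives"},
}

# Print the matrix one step per row so gaps in ownership are easy to spot.
for step, roles in raci.items():
    assignments = ", ".join(f"{k}: {v}" for k, v in roles.items())
    print(f"{step:18} {assignments}")
```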
Problems with Tech & Tools
• Testing tools and analytics tools aren't the right fit for the organization's needs
• Tools are not integrated
• Lack of tool capabilities
• Individuals do not know how to use the tools
• Utilization of tools is low and limited
What to do first: Audit the Tech Stack
Though tech challenges can be the most costly and difficult to fix, there are usually simple improvements available. Begin by outlining the stack and its components, looking for errors in integration, discrepancies in data, and missing metrics of interest. Once this information is compiled, a roadmap of fixes can be prioritized.
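One small, concrete piece of such an audit is checking for data discrepancies between tools. A minimal sketch, assuming hypothetical experiment IDs, visitor counts, and a 5% tolerance:

```python
# Minimal data-quality check: compare visitor counts reported by the testing
# tool against the analytics platform and flag large gaps. All figures and
# the 5% threshold are hypothetical.
testing_tool_counts = {"exp_101": 48_200, "exp_102": 12_900}
analytics_counts    = {"exp_101": 47_950, "exp_102": 15_400}
TOLERANCE = 0.05  # flag anything more than 5% apart

for exp_id, tool_n in testing_tool_counts.items():
    analytics_n = analytics_counts.get(exp_id)
    if analytics_n is None:
        print(f"{exp_id}: missing from analytics")
        continue
    gap = abs(tool_n - analytics_n) / analytics_n
    status = "FLAG" if gap > TOLERANCE else "ok"
    print(f"{exp_id}: {gap:.1%} discrepancy ({status})")
```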
Problems with Strategy
• Strategy maturity is low
• No process for developing testing ideas
• Lack of, or limited use of, data to support ideation
• Teams lean heavily on either qualitative or quantitative data, often not both
• No roadmaps, or roadmaps not prioritized based on key considerations
What to do first: Adopt an Ideation Methodology
Weak test strategies typically stem from a lack of formal process. Instituting a methodology ensures data is used to inform ideas, reduces the number of opinion-based tests, and facilitates prioritization of test queues.
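For example, ICE scoring (Impact, Confidence, Ease) is one common prioritization methodology; a minimal sketch follows, where the test ideas and 1-10 ratings are hypothetical.

```python
# Minimal queue-prioritization sketch using ICE scoring. Ideas and ratings
# (1-10 scales) are hypothetical.
test_queue = [
    {"idea": "Simplify checkout form",   "impact": 8, "confidence": 6, "ease": 4},
    {"idea": "New homepage hero copy",   "impact": 5, "confidence": 7, "ease": 9},
    {"idea": "Reorder category filters", "impact": 6, "confidence": 5, "ease": 7},
]

# Score each idea, then sort the queue so the highest-value tests run first.
for test in test_queue:
    test["ice"] = (test["impact"] + test["confidence"] + test["ease"]) / 3

for test in sorted(test_queue, key=lambda t: t["ice"], reverse=True):
    print(f'{test["ice"]:.1f}  {test["idea"]}')
```

Any consistent scoring scheme works; the point is that every idea enters the queue with data behind it and a defensible rank.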
Problems with Performance
• No testing goals for the organization or self-service teams
• Program performance across the organization is not being captured or monitored
• Testing impact not measured, or not measured consistently
• Testing capacity not being maximized
• Testing results/insights not being surfaced or used
What to do first: Define Goals for Testing
Formal goals for the testing program help to inform operational strategies, drive improvement and, perhaps most importantly, communicate the role and mission of the testing team across the organization. If your testing program is floundering, define a set of specific goals for the program right away.
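A minimal sketch of what measuring against such goals might look like, assuming a hypothetical results log with invented test names and impact figures:

```python
# Minimal program-measurement sketch: roll a test-results log up into the
# kinds of program goals described above (tests launched, win rate,
# measured impact). Field names and figures are hypothetical.
results_log = [
    {"test": "PDP image gallery",     "status": "win",  "annualized_impact": 120_000},
    {"test": "Cart upsell module",    "status": "loss", "annualized_impact": 0},
    {"test": "Checkout trust badges", "status": "win",  "annualized_impact": 45_000},
    {"test": "Nav label rewrite",     "status": "flat", "annualized_impact": 0},
]

tests_launched = len(results_log)
wins = sum(1 for r in results_log if r["status"] == "win")
win_rate = wins / tests_launched
total_impact = sum(r["annualized_impact"] for r in results_log)

print(f"Tests launched: {tests_launched}")
print(f"Win rate: {win_rate:.0%}")
print(f"Measured annualized impact: ${total_impact:,}")
```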
Action Plan
1. Hold a testing summit (Culture)
2. Institute a training program (Team)
3. Document the process (Process)
4. Audit the tech stack (Tech & Tools)
5. Adopt an ideation methodology (Strategy)
6. Define goals for testing (Performance)
Key Takeaways
1. We need to consistently take a 10,000-foot view of our testing programs
2. Tracking our growth across six pillars helps focus the effort
3. Identifying challenges and weaknesses within each pillar allows us to create effective growth strategies
Questions? Please submit your questions via the text box on your screen.
Wynn Las Vegas, October 17-19
1,200+ Attendees • 20+ Sessions • 100+ Giveaways • 30+ Speakers
Registration:
• Early Early Bird: $550 (expires 07/01/2017)
• Early Bird: $750 (expires 09/01/2017)
• Regular: $850 (expires 10/16/2017)
Thank you
