The reality for many enterprises trying to pivot quickly is that they are dealing with significant legacy systems that cannot be replaced easily. They must continue to surface and manipulate rich customer data and transaction history in order to compete for and win new business.
This is a story of how a company’s investment in Test Automation has allowed it to shift the dial on speed to outcome. There were even some unintended consequences that continue to deliver great benefits for the organisation and its people three years on.
The Problems
At the outset, the problem statements were clear:
Rapidly changing member expectations
Increasing intensity of competition
Ever-growing products and services
Long-tail test cycles delaying product releases
Defect detection late in the delivery cycle, delaying customer feedback
Quality issues impacting the customer journey
The Opportunities
The challenge was to help the enterprise deliver the right features to our customers sooner. Our experience of earlier implementations led us to establish three key enablers, which became our strategy for success over time.
The Gap
There were a large number of manual tests across core business applications which took many weeks to run and delayed releases of business features to customers. The test cycle was costly, and additional manual testers needed to be onboarded if we wanted to decrease our time to value. The ROI of new or enhanced products was impacted by the rising cost of quality.
Focus on the Business Value
Obtaining funding for business-enabling initiatives can be difficult. Building a business case around the Automation Framework itself is not the answer. The business case should solve real business problems and should be articulated in terms of business goals, outcomes, benefits and metrics to ensure we have a clear value proposition. I have found Melissa Perri’s Product Strategy canvas very useful for defining our value proposition.
An overall goal might be:
Vision: "In 12 months Company X will be able to release new and enhanced products to the market quickly and safely"
To realise value sooner, the work can be broken down into shorter-term objectives, for example:
Challenge: In order to reach our vision, we need to decrease the feedback cycle of regression test results from more than five days to less than 20 minutes by X date.
Short-term objective: Regression testing of critical business process X takes 5 minutes and can be automatically triggered at any time of day.
Current state: 50 manual tests take one person three business days to execute.
Working with your business stakeholders to select a critical business process to focus on ensures you are prioritising the protection of the highest-value customer journeys.
Create a valuable cross-functional team
From the outset we had a goal to build sustainable feature tests that enabled the organisation to quickly adjust to feature changes or additions over time. To achieve this, we needed the correct mix of skills. We designed tests which automated the highest-value parts of the customer journey, and coupled that with strong developer experience focussed on building and treating tests as production code that could be easily and cost-effectively maintained over time. Fortunately we were able to co-locate a team of business subject matter experts (SMEs), testers and developers to make this happen.
This exercise had additional benefits that we didn’t anticipate. Business SMEs learnt a whole lot more about the process of designing and building tests. Testers learnt that code isn’t as daunting as they had thought, and that they could in fact read well-designed code; this encouraged them to collaborate much more with developers on functional tests. Developers learnt that there is a real art to page modelling and to predicting how to model based on current and future feature changes. They also improved their ability to design valuable functional tests, further growing their T-shaped skills.
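To make page modelling concrete, below is a minimal sketch of the style we aimed for. It assumes Python and Selenium purely for illustration (this article does not prescribe the tooling), and the screens, locators and field names are hypothetical.

```python
# A minimal page-model sketch (illustrative only). Page classes hide locators
# and navigation details so that a feature change touches one place, and the
# tests read like the business process they protect.
from selenium.webdriver.common.by import By


class MemberSearchPage:
    """Models a hypothetical member search screen."""

    def __init__(self, driver):
        self.driver = driver

    def search_by_member_number(self, member_number: str) -> "MemberDetailsPage":
        self.driver.find_element(By.ID, "member-number").send_keys(member_number)
        self.driver.find_element(By.ID, "search-button").click()
        return MemberDetailsPage(self.driver)


class MemberDetailsPage:
    """Models the member details screen returned by a search."""

    def __init__(self, driver):
        self.driver = driver

    def policy_numbers(self) -> list[str]:
        rows = self.driver.find_elements(By.CSS_SELECTOR, ".policy-row .policy-number")
        return [row.text for row in rows]
```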
Close collaboration ensured the team did not make assumptions about which fields were most valuable to capture in the automated tests, and that the skills to unblock issues and progress test creation quickly were readily available, reducing the overall investment required to deliver the test assets for the organisation. The business subject matter experts were elated that they no longer needed to be available for late-night and weekend regression testing or production verification testing (PVT), and the product managers gained the confidence to test and learn on new features at any time, knowing they had their quality safety net in place.
Create a small suite of high value tests first
Following an MVP (minimum viable product) style approach to learn and reinforce value was essential to ensure we were able to obtain further initiative funding. In this context, that meant we created an FVT (functional verification test) suite of 15 cross-cutting tests that quickly validated critical customer journeys across multiple systems (a sketch of one such test follows the list below). This provided many insights:
Proved to our business that we could automatically validate a process end to end, including creating the customer data and linking it to the associated policies, in a timely way;
Enabled the team to commence page modelling without needing to build out the full design up front. They modelled just enough to be valuable quickly;
Provided an immediate application uptime health check that could be scheduled to run at any time of the day. This proved extremely useful for the wider Environment team, who were often working blind when other feature teams reported environment downtime;
Served as a set of valuable environment shakedown tests which went further than endpoint testing to ensure core application functionality was working as expected when new test environments were commissioned in the cloud;
Captured feedback on the difficulty of testing the applications at the highest level (fragility) and exposed the real environmental and architectural issues which had been passed on to end users unknowingly until that point. We now had conclusive, repeatable metrics around areas of concern.
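To give a flavour of the suite, one of those cross-cutting tests might look something like the sketch below. It assumes pytest, Chrome via Selenium, and the page model from the earlier sketch saved as a module; the URL and the seeded member and policy identifiers are purely illustrative.

```python
# Sketch of a cross-cutting FVT check (illustrative names and data): confirm a
# member record and its linked policy are reachable end to end. Scheduled runs
# of checks like this doubled as the environment uptime health check.
import pytest
from selenium import webdriver

from pages.member_search import MemberSearchPage  # page model from the earlier sketch (assumed module path)

SEARCH_URL = "https://test-env.example.com/member-search"  # hypothetical test environment URL


@pytest.fixture
def driver():
    d = webdriver.Chrome()
    d.implicitly_wait(10)
    yield d
    d.quit()


def test_member_and_linked_policy_visible_end_to_end(driver):
    driver.get(SEARCH_URL)
    details = MemberSearchPage(driver).search_by_member_number("M-000123")  # seeded test member
    assert "P-000456" in details.policy_numbers()                           # policy linked to that member
```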
Address Scale and Speed
With success comes demand, and we quickly needed the ability to execute hundreds of tests to support fast feedback. We were able to solve this with a combination of the Cloud and a pool of virtual build machines, which enabled us to scale out the execution of tests on demand. This proved to be a very cost-effective scaling solution: we could automatically commission a new virtual build machine when demand was high, meaning we only paid for what we needed when we needed it.
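As a simplified, local illustration of the fan-out idea, the sketch below splits a suite into batches and runs them concurrently. In our setup each batch ran on an on-demand cloud build machine rather than a local process pool, and the pool size and paths here are illustrative.

```python
# Simplified illustration of scaling out test execution: split the suite into
# batches and run them in parallel. Pool size, paths and the use of local
# processes (instead of on-demand cloud build machines) are illustrative.
import glob
import subprocess
from concurrent.futures import ThreadPoolExecutor

POOL_SIZE = 8  # in the real setup this was the number of virtual build machines


def run_batch(test_files: list[str]) -> int:
    """Run one batch of test files in its own pytest process."""
    return subprocess.run(["pytest", *test_files]).returncode


def main() -> None:
    files = sorted(glob.glob("tests/test_*.py"))
    batches = [files[i::POOL_SIZE] for i in range(POOL_SIZE) if files[i::POOL_SIZE]]
    with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
        results = list(pool.map(run_batch, batches))
    raise SystemExit(max(results, default=0))


if __name__ == "__main__":
    main()
```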
Not only did we need to scale the technology but we also needed to scale our ability to develop tests. We needed to grow our development team. We adopted a mentoring model using recent or previous IT graduates who were prepared to collaborate, learn and grow. Our senior development lead paired with graduates to introduce them to sustainable development practices and how that applied to the code base and models we had in place. With this approach we had new team members up and running in hours and writing sustainable tests in days.
Safety
Under a manual testing model, testers found it difficult to reproduce defects. The repeatability and consistency inherent in the automation model allowed a much richer suite of tests, which explored many of the non-functional scenarios that had previously been ignored because they could not be recreated quickly enough for feedback. We learnt a great deal about the applications under test under higher-performance conditions. The increased test coverage further reassured the business areas that they could make changes at speed, knowing they were safe to do so.
The Unexpected Benefits
Below is a list of the additional benefits we discovered on the journey to speed up idea to value.
Automated Test data generation: When we completed the current-state value stream mapping for the existing test process, we discovered that 70% of the time spent on tests went into data setup. In tightly coupled legacy systems we needed to be able to generate a unique member in one system, take out a policy in another and then process a claim in yet another. In some cases the step we had designed a test for sat near the end of this chain, and the manual testers could take up to a day to get the scenario set up so they could execute their tests. We quickly created a test data tool where testers could choose the number and type of members, policies and claims they required, and the lifecycle step of a policy or claim at which they needed them.
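The sketch below shows the kind of request interface the tool exposed; all names, lifecycle stages and record shapes are illustrative rather than the actual implementation.

```python
# Sketch of the test data tool's request interface (illustrative only):
# testers describe how many members, policies and claims they need and the
# lifecycle stage, and the tool creates them across the underlying systems.
from dataclasses import dataclass
from enum import Enum


class ClaimStage(Enum):
    LODGED = "lodged"
    ASSESSED = "assessed"
    PAID = "paid"


@dataclass
class TestDataRequest:
    members: int = 1
    policies_per_member: int = 1
    claims_per_policy: int = 0
    claim_stage: ClaimStage = ClaimStage.LODGED


def generate(request: TestDataRequest) -> list[dict]:
    """Create the requested members, policies and claims.

    In the real tool this step drove each legacy system (reusing the automation
    page models and APIs); here it simply returns placeholder records.
    """
    records = []
    for m in range(request.members):
        member = {"member_id": f"M-{m:06d}", "policies": []}
        for p in range(request.policies_per_member):
            member["policies"].append({
                "policy_id": f"P-{m:06d}-{p}",
                "claims": [{"stage": request.claim_stage.value}
                           for _ in range(request.claims_per_policy)],
            })
        records.append(member)
    return records


# Example: one member with a policy and a claim already at the "assessed" stage.
data = generate(TestDataRequest(claims_per_policy=1, claim_stage=ClaimStage.ASSESSED))
```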
Process Automation bootstrapping: While introducing test automation we also saw the rise of tools to support automating business processing. Rather than invest in another suite of tools and people, we were able to reuse many of the existing page models developed for test automation to automate these manual processes in production. We could deliver the process automation required in days rather than weeks. An excellent repurposing of our initial test automation investment that delivered many efficiency improvements for day-to-day, labour-intensive business operations processes.
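As an illustration of that repurposing, the same hypothetical page model from earlier could drive a routine back-office task, such as exporting each member's policy numbers to a report instead of a person searching screen by screen; the URL and module path are assumptions.

```python
# Illustration of reusing a test-automation page model for a day-to-day
# business process (names, URL and module path are hypothetical).
import csv

from selenium import webdriver

from pages.member_search import MemberSearchPage  # the same page model the tests use

SEARCH_URL = "https://backoffice.example.com/member-search"  # hypothetical back-office URL


def export_policy_numbers(member_numbers: list[str], out_path: str = "policy_report.csv") -> None:
    """Look up each member's screen and export their policy numbers to a CSV report."""
    driver = webdriver.Chrome()
    try:
        with open(out_path, "w", newline="") as report:
            writer = csv.writer(report)
            writer.writerow(["member_number", "policy_numbers"])
            for number in member_numbers:
                driver.get(SEARCH_URL)  # return to the search screen before each lookup
                details = MemberSearchPage(driver).search_by_member_number(number)
                writer.writerow([number, ";".join(details.policy_numbers())])
    finally:
        driver.quit()
```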
Environment Masking: My favourite improvement was removing the need to mask production data. We used to spend weeks copying production data back into test environments so testers could perform some of their exploratory tests with production-like data, and the copied data then needed to be masked to protect the privacy of individuals. Once we had the automatic data creation tool working across systems, we could challenge the need to spend weeks copying data back and then going through the tedious masking process. What if we didn’t need to copy it back at all and could generate all the data that testers needed across all the environments they needed it in? We did this by working with the development teams to separate the application from the data it needed, scripting the installation and configuration of the application, and then automatically loading the required data into the systems. Success. We trialled this approach side by side for a few releases to validate the hypothesis that we could automate the creation of the environment, the application and the data it needed, and then quietly retired the need for masking.
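A rough sketch of that provision-then-seed flow is below; the install script, environment name and data volumes are placeholders, and the seeding stub stands in for the test data tool described earlier.

```python
# Sketch of the provision-then-seed flow that replaced masked production
# copies (commands, names and volumes are placeholders, not the real pipeline).
import subprocess


def generate_seed_data(members: int) -> list[dict]:
    # Stand-in for the test data tool sketched earlier: synthetic, privacy-safe
    # records rather than masked copies of production data.
    return [{"member_id": f"M-{m:06d}"} for m in range(members)]


def build_environment(env_name: str) -> None:
    # Scripted installation and configuration of the application (placeholder command).
    subprocess.run(["./install_app.sh", env_name], check=True)

    # Load freshly generated data into the systems instead of copying production back.
    records = generate_seed_data(members=200)
    print(f"{env_name}: loading {len(records)} synthetic member records")


if __name__ == "__main__":
    build_environment("uat-2")
```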
Conclusion
As with many endeavours, our key to success was starting small and iterating to learn and improve. By retaining the linkage of speed, safety and value as our guide rails, we were able to add value at each increment and keep our business stakeholders, and ultimately our sponsors, engaged throughout.