Saturday, 7 October 2017

Service virtualization and manual testing in Agile teams: are they still necessary?

This is a story about a time I consulted for a department at a large multinational enterprise. Around 80 people worked there, delivering software products for internal use.

The management had implemented Extreme Programming, a type of agile software development, and had used it for more than ten years. They had more than one hundred middleware applications (microservices) exposing APIs to other departments. Each application was tested by anywhere from hundreds to thousands of automated tests of different types after every code commit.

Despite those high levels of automation and 83-98% automated test code coverage (depending on the application), every team still performed manual exploratory testing of their software, with the help of service virtualization. Let us explore how the management justified the value of manual exploratory testing and service virtualization in that type of environment, and how it was implemented.

Their software delivery lifecycle

Developers would pick up a new story to work on and start by writing acceptance tests. They would discuss the contents of those acceptance tests with the business and with the testers working in the team. They would then proceed to write unit tests and production code.

Once the user story had been implemented and was working according to the developers' expectations, the testers would test the application manually. Quite often they would find behaviour that the developers had not anticipated. They would then discuss with the developers whether immediate code changes were required or a new story needed to be raised to tackle the issue soon after.

They would also find a serious bug once every two to four user stories. The cost of such bugs leaking to production would have been so high that it easily justified having an extra person on every team: a tester performing manual exploratory testing.

Why is service virtualization useful?

While manually exploring how the application behaved, a tester would simulate different types of hypothetical responses from third-party and backend applications. It was often those hypothetical scenarios that uncovered the bugs. The more backend or third-party systems were involved in a single transaction, the more likely it was that some permutation of non-typical responses would result in unexpected system behaviour.
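As a concrete illustration, here is a minimal sketch of what simulating one such non-typical backend response could look like using WireMock, the open source stubbing tool the teams used (introduced later in this post). The port, endpoint and payload are hypothetical.

```java
import com.github.tomakehurst.wiremock.WireMockServer;

import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig;

public class HypotheticalBackendResponse {
    public static void main(String[] args) {
        // Stand in for a backend dependency on a local port.
        WireMockServer backend = new WireMockServer(wireMockConfig().port(8089));
        backend.start();

        // Simulate a non-typical response: the backend answers with a 503,
        // and only after a 5 second delay. Exploring how the application
        // under test copes with responses like this is where many bugs surface.
        backend.stubFor(get(urlEqualTo("/accounts/12345"))
                .willReturn(aResponse()
                        .withStatus(503)
                        .withFixedDelay(5000)
                        .withBody("Service temporarily unavailable")));

        // Point the application under test at http://localhost:8089 instead of
        // the real backend, then explore its behaviour manually.
        // backend.stop();
    }
}
```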

Often, those test scenarios also included valid responses that had not been covered by the developers' acceptance tests. Developers are people too, so they sometimes missed obvious scenarios or made simple mistakes.

So, even when the developers thought they had implemented everything well enough and the new functionality could go to production as is, testers would make another judgement call on what else was likely to happen in production environments. That resulted in new stories or bug fixes being raised.

After testing in isolation, the tester always performed manual integration testing against the real backend and third-party services.

Service virtualization tools used

None of this would have been possible without a service virtualization tool well suited to manual exploratory testing in agile environments. Since there were no appropriate tools available on the market at that time, this particular organisation decided to create one in-house. They spent 14 days over a period of nine months developing and perfecting a tool for that team. The tool was specific to the applications they were developing.

The developers used an open source tool called WireMock in their acceptance tests. They built a GUI on top of WireMock, with several extensions, to make it more usable for manual exploratory testing.

Because the developers were already using WireMock in their acceptance tests, it was natural for the testers to import the same virtual services (or stubs, as they called them) into their GUI tool. Using the same base technology proved to be very efficient.
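As a rough sketch of what such reuse can look like in practice: WireMock can load stub definitions from JSON mapping files in a directory, so a shared folder of stubs can back both the automated acceptance tests and a GUI tool. The directory name and port below are hypothetical, not the ones this team used.

```java
import com.github.tomakehurst.wiremock.WireMockServer;

import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig;

public class SharedStubsServer {
    public static void main(String[] args) {
        // Start WireMock against a shared directory of stub definitions.
        // WireMock reads JSON stub mappings from the "mappings" subdirectory
        // (and response bodies from "__files"), so the same stub files used by
        // the developers' acceptance tests can be loaded here for manual
        // exploratory testing.
        WireMockServer server = new WireMockServer(
                wireMockConfig().port(8080).usingFilesUnderDirectory("shared-stubs"));
        server.start();

        // The application under test can now be pointed at http://localhost:8080.
        // server.stop();
    }
}
```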

Lessons learned on custom-built tools

The technical leads noticed that five teams had developed very similar tools to perform similar service virtualization and stubbing tasks. That added up to around 80 man-days of development effort, which was an inefficient use of resources.

An improvement on that strategy would be to use an off-the-shelf tool, provided a suitable one exists, instead of creating new tools in-house for every new project. That would help teams hit the ground running, save time and costs, and reduce the risk of bugs in custom-built tooling.

Benefits of a lightweight tool

At Traffic Parrot, we have taken all of those experiences into account while developing Traffic Parrot, a new service virtualization and API mocking tool for Agile teams. It provides powerful service virtualization and API mocking capabilities, yet stays flexible and lightweight enough to be used in highly Agile teams by both developers and testers. Thanks to its many advanced features, it can also be used in less Agile environments while transitioning to a new delivery process, where it can be very cost effective.

Summary

We have learned that manual testing can be valuable even in an environment with a lot of automation. We had a look at why service virtualization is key to effective manual exploratory testing in highly Agile environments. We also explored how important it is to use off-the-shelf open source or commercial tools suited to Agile teams, and listed two tools worth trying out (WireMock and Traffic Parrot).

Next steps

If you have a lot of automated tests but still experience costly production bugs every other release, investigate whether manual exploratory testing could help address that problem. You do not have to hire people to run this small experiment. You can start by having existing team members who did not work on a given piece of new functionality put on a “testing hat”. They can then test the application as a tester would, pretending they do not know how the code works.

If you are not yet using stubbing, API mocking or service virtualization in your testing but would like to reduce the number of bugs and the risks, give Traffic Parrot a try. Get one of your tech leads to look at the free Traffic Parrot community version, available for download at https://trafficparrot.com.
