Tuesday 26 July 2022

Interview: Traffic Parrot API mocking helped an Agile team with automated testing

An interview with an Agile Java Developer working for a global media company that has used Traffic Parrot for a number of years now.

1. What caused you to look for a service virtualization and API mocking solution?

I was working as an Agile Java Developer on a cross-functional team. The team was used to high levels of automation and testing, using techniques such as TDD daily. It was common practice at the company to use both in-memory and over-the-wire mocks to facilitate automated and manual testing.

2. How was mocking handled previously? How would you define the specific challenges / limitations with this approach?

Before adopting Traffic Parrot technology, the mocks we used on this project were all in memory mocks using libraries such as Mockito. This was sufficient to write unit-level tests but did not allow us to write over-the-wire integration/acceptance tests or enable our manual tester to test microservices in isolation.

3. Was there any urgency to find a solution? What was driving that urgency?

Yes, in the sense that this was part of the existing approach the team took to testing. It became urgent at the point that we needed to write integration/acceptance tests to help demonstrate functionality to the business analysts on the team.

4. How did you go about trying to solve the problem prior to implementing a solution?

We lived with the problem: some tests simply weren't written, and some manual testing with mocks didn't happen.

5. Which resources were most useful in creating a vendor list? Least useful?

Google searches and word of mouth were enough; other resources were not explored.

6. How did you discover Traffic Parrot/Mountebank/Wiremock/…?

Google/GitHub/word of mouth referral from other team members.

7. What was compelling enough about Traffic Parrot that led you to evaluate?

The ability to use the same mocks (via shared configuration files) for both automated and manual testing, as well as the presence of a simple UI to configure/enable the mocks as required.

8. Which vendors did you consider? Why?

WireMock was also considered, but it lacked a UI so fell short of being useful for our manual tester on the team.

9. Can you briefly describe the team's decision making process?

The fastest route to a working solution wins. Something that satisfies each role on the team: developers, testers, business analysts.

10. What role did you play in the project?

Agile Java Developer, writing production/test/infrastructure code.

11. What were your most important evaluation criteria?

Programmatically configurable APIs, request matching, request/response monitoring.

12. Who was the ultimate decision maker?

All roles on the team needed to be satisfied with the decision.

13. What was the single most important reason for the win/loss?

Time to solution.

14. How would you articulate Traffic Parrot's value proposition?

Test in isolation, infrastructure as code, share mocks between development/testing/across teams.

15. What do you feel were the key strengths and weaknesses of Traffic Parrot?

Strengths: customer support is excellent, and the range of configuration options, from UI-driven to code-driven, is very helpful.

Weaknesses: lacking an extensive library of templates and code examples to get started faster.

16. Which vendor did you like the most and why?

Traffic Parrot, because it matched our expectations and we didn't hit any blockers when implementing a solution.

17. Which vendor did you like the least and why?

Other open-source vendors without Java language bindings, because we would have had to write our own Java bindings to use their mocks in our tests.

18. Can you tell me about pricing? How big a factor did pricing play in your decision?

Pricing was handled two layers above my role, so I did not consider pricing in my own decision.

19. What did you like / dislike about our sales and/or customer success or support efforts? How would you compare your evaluation experience with us compared to other vendors?

The support we received was much more timely and helpful than I have experienced with other vendors, including open source. The feedback we provided made its way into the product within days!

20. What could Traffic Parrot have done better?

More brand visibility; we only found out about Traffic Parrot by word of mouth.

21.  How likely is it that you would recommend Traffic Parrot to a friend or colleague on a scale of 0-10 (0=not likely at all, 10=extremely likely)?

10 - when you need a UI, programmatic APIs, infrastructure as code and shared mocks.

22.  What’s the one word that pops into your mind when you think of Traffic Parrot?


23.  How many people use Traffic Parrot at your company?


24.  What are your favourite features or capabilities with Traffic Parrot?

Infrastructure as code, ability to version control mock configuration.

25.  What’s the business value you get from Traffic Parrot?

Supports cross-functional teams' ability to move fast.

26. Were there any unplanned wins? (e.g. capabilities or use cases that you didn't envision during the initial purchase but later realized would be valuable to you?)

We also realised we could use the same technology to provide mocks to other departments, to decouple our development schedules and interoperate via mock API specifications.

27.  What does the future look like? How do you plan to evolve and grow with Traffic Parrot?

We are trying to share our experience with other teams and encourage wider adoption. However, the company culture is such that each team tends to choose its own tooling and is often resistant to recommendations from other teams.

Monday 11 July 2022

Service virtualization in ephemeral containers

Product teams that move fast would like to have more control over the infrastructure of the tools they run. They would also like to run service virtualization the same way they run their microservices and other tools: in ephemeral containers that they spin up on demand.

This is a high-level comparison of the two options available: the traditional approach of service virtualization managed by a central team, and the new approach that is in line with current industry trends.

Traffic Parrot supports the new model and teams that want to migrate from the traditional model to the new model.

Centrally managed service virtualization (i.e. a Center Of Excellence) vs self-service ephemeral service virtualization (also called API simulation):

Management of the SV tool servers
- Centrally managed: typically a central team of administrators (the COE)
- Self-service ephemeral: the product team that needs the virtual services

Purchasing of SV tool licenses
- Centrally managed: typically a central team of administrators (the COE)
- Self-service ephemeral: the product team that needs the virtual services

Creation of virtual services
- Centrally managed: the product team that needs the virtual services
- Self-service ephemeral: the product team that needs the virtual services

Instance lifecycle
- Centrally managed: typically long-lived, 24/7 running instances
- Self-service ephemeral: typically ephemeral instances spun up on demand

Development technology
- Centrally managed: typically VMs
- Self-service ephemeral: typically ephemeral containers

Deployment architecture
- Centrally managed: typically a few shared instances of service virtualization environments
- Self-service ephemeral: many ephemeral instances running on local laptops, inside CI/CD pipelines, in OpenShift, etc.
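As a sketch of what the self-service ephemeral model can look like in practice, a product team might declare its mock instance alongside the service under test in its own deployment configuration. The image name, port, and volume path below are hypothetical placeholders, not documented Traffic Parrot values:

```yaml
# Hypothetical docker-compose fragment: the team spins up its own
# short-lived API-simulation container next to the service under test.
services:
  api-mocks:
    image: example/trafficparrot:latest      # hypothetical image name
    ports:
      - "18081:18081"                        # assumed HTTP port for the mocked APIs
    volumes:
      - ./mappings:/opt/trafficparrot/mappings  # version-controlled mock definitions
```

Because the container is declared next to the application, it is created and destroyed with each pipeline run or local development session rather than maintained centrally.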

Wednesday 6 July 2022

Service Virtualization As Code (API Mocking As Code)

Traffic Parrot has first-class support for the service virtualization as code pattern (also called API mocking as code).

If you have any state or configuration that you manage in your IT infrastructure, the best solution in most cases is to version control it in a source control system like Git. For example, if you are running on Kubernetes and Docker, your whole infrastructure might be defined in a source control repository as Dockerfiles and Terraform Kubernetes configuration files. This is called infrastructure as code.

It is advised to do the same with your API mocks and virtual services: store all of them in a version control system such as Git. In the case of Traffic Parrot, this is possible because all request-to-response mapping files are stored on the filesystem as JSON files. Alternatively, you can use the JUnit TrafficParrotRule directly in your JUnit tests. This way you take your automation to the next level and have a fully automated, reproducible build pipeline.
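For illustration, a request-to-response mapping file checked into Git might look like the following sketch. The URL path and response body are made up, and the field layout assumes the widely used WireMock-style stub format; consult the Traffic Parrot documentation for the exact schema:

```json
{
  "request": {
    "method": "GET",
    "urlPath": "/api/accounts/123"
  },
  "response": {
    "status": 200,
    "headers": { "Content-Type": "application/json" },
    "body": "{\"id\": \"123\", \"status\": \"ACTIVE\"}"
  }
}
```

Because the mapping is a plain JSON file, it can be code reviewed, diffed, and versioned like any other source file in the repository.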

Because of the issues that can arise with manual processes, avoid having API mocks and virtual services that are updated manually and never stored in Git or a similar source control system. Store all your API mocks and virtual services in a source control system.

The business justification (quoting Wikipedia): "Infrastructure automation enables speed through faster execution when configuring your infrastructure and aims at providing visibility to help other teams across the enterprise work quickly and more efficiently. Automation removes the risk associated with human error, like manual misconfiguration; removing this can decrease downtime and increase reliability. These outcomes and attributes help the enterprise move towards implementing a culture of DevOps".