An interview with an Agile Java Developer working for a global media company that has used Traffic Parrot for a number of years now.
1. What caused you to look for a service virtualization and API mocking solution?
I was working as an Agile Java Developer on a cross-functional team. The team was used to high levels of automation and testing, using techniques such as TDD daily. It was common practice at the company to use both in-memory and over-the-wire mocks to facilitate automated and manual testing.
2. How was mocking handled previously? How would you define the specific challenges / limitations with this approach?
Before adopting Traffic Parrot technology, the mocks we used on this project were all in-memory mocks using libraries such as Mockito. This was sufficient for writing unit-level tests, but it did not allow us to write over-the-wire integration/acceptance tests or enable our manual tester to test microservices in isolation.
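To illustrate what I mean by in-memory mocking, here is a minimal sketch in the style we used at the time; the PriceClient and PriceService names are made up for this example, not from our actual codebase. The key point is that the mocked call never leaves the JVM, so nothing about HTTP serialization, routing, or timeouts gets exercised.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

// Hypothetical collaborator: a client for a downstream pricing microservice.
interface PriceClient {
    double fetchPrice(String productId);
}

// Hypothetical class under test, depending on that collaborator.
class PriceService {
    private final PriceClient client;

    PriceService(PriceClient client) {
        this.client = client;
    }

    double priceWithVat(String productId) {
        return client.fetchPrice(productId) * 1.2;
    }
}

class PriceServiceTest {
    @Test
    void appliesVatToFetchedPrice() {
        // In-memory mock: the downstream call is replaced inside the JVM,
        // so this can only ever be a unit-level test.
        PriceClient client = mock(PriceClient.class);
        when(client.fetchPrice("sku-1")).thenReturn(100.0);

        assertEquals(120.0, new PriceService(client).priceWithVat("sku-1"), 0.001);
    }
}
```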
3. Was there any urgency to find a solution? What was driving that urgency?
Yes, in the sense that this was part of the existing approach the team took to testing. It became urgent at the point that we needed to write integration/acceptance tests to help demonstrate functionality to the business analysts on the team.
4. How did you go about trying to solve the problem prior to implementing a solution?
We lived with the problem: some tests simply didn't get written, and some manual testing with mocks didn't happen.
5. Which resources were most useful in creating a vendor list? Least useful?
Google searches and word of mouth were enough; other resources were not explored.
6. How did you discover Traffic Parrot/Mountebank/WireMock/…?
Google/GitHub/word of mouth referral from other team members.
7. What was compelling enough about Traffic Parrot that led you to evaluate?
The ability to use the same mocks (via shared configuration files) for both automated and manual testing, as well as the presence of a simple UI to configure/enable the mocks as required.
8. Which vendors did you consider? Why?
WireMock was also considered, but it lacked a UI, so it fell short of being useful for the manual tester on our team.
9. Can you briefly describe the team's decision making process?
The fastest route to a working solution wins, provided it satisfies every role on the team: developers, testers, and business analysts.
10. What role did you play in the project?
Agile Java Developer, writing production/test/infrastructure code.
11. What were your most important evaluation criteria?
Programmatically configurable APIs, request matching, request/response monitoring.
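To make those criteria concrete, here is a minimal sketch of programmatic over-the-wire mocking with request matching and request verification. It uses WireMock's Java DSL, since that is the open-source tool we also evaluated; Traffic Parrot's own programmatic API differs, and the port, endpoint, and payload here are made up for illustration.

```java
import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.getRequestedFor;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;

import com.github.tomakehurst.wiremock.WireMockServer;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OverTheWireMockSketch {
    public static void main(String[] args) throws Exception {
        // A real HTTP server on a local port; the service under test would
        // point its downstream base URL here instead of at the real dependency.
        WireMockServer server = new WireMockServer(8089);
        server.start();

        // Programmatic configuration with request matching:
        // GET /products/sku-1 returns a canned JSON body.
        server.stubFor(get(urlEqualTo("/products/sku-1"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"price\": 100.0}")));

        // Stand-in for the service under test making a real HTTP call.
        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("http://localhost:8089/products/sku-1")).build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());

        // Request/response monitoring: verify the expected call was received.
        server.verify(getRequestedFor(urlEqualTo("/products/sku-1")));

        server.stop();
    }
}
```

Because the stub is a real HTTP endpoint rather than an in-JVM object, the same mock can back automated tests and a manual tester's exploratory session alike.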
12. Who was the ultimate decision maker?
All roles on the team needed to be satisfied with the decision.
13. What was the single most important reason for the win/loss?
Time to solution.
14. How would you articulate Traffic Parrot's value proposition?
Test in isolation, infrastructure as code, share mocks between development/testing/across teams.
15. What do you feel were the key strengths and weaknesses of Traffic Parrot?
Strengths: the customer support is excellent, and the range of configuration options, from UI-driven to code-driven, is very helpful.
Weaknesses: it lacks an extensive library of templates and code examples to get started faster.
16. Which vendor did you like the most and why?
Traffic Parrot, because it matched our expectations and we didn't hit any blockers when implementing a solution.
17. Which vendor did you like the least and why?
The other open-source options without Java language bindings, because we would have had to write our own Java bindings to use their mocks in our tests.
18. Can you tell me about pricing? How big a factor was pricing in your decision?
Pricing was handled two layers above my role, so I did not consider it in my own decision.
19. What did you like / dislike about our sales and/or customer success or support efforts? How would you compare your evaluation experience with us compared to other vendors?
The support we received was much more timely and helpful than I have experienced with other vendors, including open source. The feedback we provided made its way into the product within days!
20. What could Traffic Parrot have done better?
More brand visibility; we only found out about Traffic Parrot by word of mouth.
21. How likely is it that you would recommend Traffic Parrot to a friend or colleague on a scale of 0-10 (0=not likely at all, 10=extremely likely)?
10 - when you need a UI, programmatic APIs, infrastructure as code and shared mocks.
22. What’s the one word that pops into your mind when you think of Traffic Parrot?
Configurable.
23. How many people use Traffic Parrot at your company?
5
24. What are your favourite features or capabilities with Traffic Parrot?
Infrastructure as code, ability to version control mock configuration.
25. What’s the business value you get from Traffic Parrot?
Supports cross-functional teams' ability to move fast.
26. Were there any unplanned wins? (e.g. capabilities or use cases that you didn't envision during the initial purchase but later realised would be valuable to you?)
We also realised we could use the same technology to provide mocks to other departments, to decouple our development schedules and interoperate via mock API specifications.
27. What does the future look like? How do you plan to evolve and grow with Traffic Parrot?
We are trying to share our experience with other teams and encourage wider adoption. However, the company culture is such that each team tends to choose its own tooling and is often resistant to recommendations from other teams.