Monday, 30 March 2020

Bestow has used Traffic Parrot gRPC mocks to deliver features faster to customers

After a thorough evaluation, Bestow Inc. selected Traffic Parrot's service virtualization and API mocking tool in April 2019 for their application development needs. In this case study, we will look at the details of their infrastructure, how they applied Traffic Parrot, and what issues they have come across.
  • Traffic Parrot is specifically designed to maximize the productivity of developers writing automated tests and to enable them to mock out microservices for local development. Their lightweight platform with gRPC support was a good fit for our Docker and Go-based development environment. They provided strong support during the POC and continue to track the rapid evolution of gRPC, acting as an effective extension to our team.
    Brian Romanko, VP Engineering at Bestow

Introduction

Bestow has challenged industry assumptions with a new underwriting framework that provides affordable term life insurance in minutes instead of weeks. They use Traffic Parrot to unblock teams and allow them to work independently. Bestow uses Traffic Parrot gRPC mocks in their microservice CI regression testing suites to detect breaking changes in their microservice APIs.

Technology stack: Docker, GoLang and gRPC

The core technology they rely on includes:
  • Container-based infrastructure, running Docker in Kubernetes on GCP
  • Microservices in a variety of languages, including GoLang and Python
  • Microservices communicate using gRPC APIs, with API contracts defined in Proto files
Bestow co-locates the teams developing a given microservice to encourage close communication. gRPC APIs connect microservices, which are sometimes owned by different teams. Bestow designs these gRPC APIs using Proto files, which form the contract between the microservices.
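For illustration only, a contract of this kind might look like the sketch below. The service and message names are hypothetical, not Bestow's actual API; the point is that the consuming team can generate client stubs from a file like this and run them against a Traffic Parrot mock long before the real server exists.

    // Hypothetical Proto contract published by the Policy Administration team.
    // The Enrollment team generates client stubs from this file and mocks the
    // service with Traffic Parrot while the real implementation is in progress.
    syntax = "proto3";

    package policy.v1;

    // Service owned by the Policy Administration team.
    service PolicyService {
      // Creates a policy for a newly enrolled customer.
      rpc CreatePolicy(CreatePolicyRequest) returns (CreatePolicyResponse);
    }

    message CreatePolicyRequest {
      string applicant_id = 1;
      int32  term_years   = 2;
    }

    message CreatePolicyResponse {
      string policy_id = 1;
      string status    = 2;
    }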

Problem: teams are blocked waiting for APIs

Starting more than a year ago, Bestow developed multiple microservices in parallel. For example, the Policy Administration team provided gRPC APIs for the Enrollment team to consume. This meant that developers on the Enrollment team were sometimes waiting for the Policy Administration team to deliver their microservice APIs before they could start working.
This led to blocked timelines between teams, which meant Bestow could not deliver at the fast pace required for their customers. It was urgent for Bestow to find a solution to allow the teams to work independently.

Solution: decouple teams by using gRPC mocks

Traffic Parrot was identified as a candidate for a gRPC API mocking solution that could help unblock the timelines between the teams at Bestow. After a two-week technical evaluation led by VP of Engineering Brian Romanko, it was clear that the open-source alternatives did not provide adequate capabilities, and Traffic Parrot was chosen to fulfil Bestow's development needs.
Teams at Bestow use Traffic Parrot to develop both sides of their gRPC APIs in parallel, without having to wait for the server code to be written before a client can be tested. They run automated test suites on their CI build agents, with Traffic Parrot running in a Docker container on the agent.
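As a rough sketch of what such a CI test can look like, assuming the hypothetical PolicyService contract shown earlier, Go stubs generated from it into a package here called policypb, and a TRAFFIC_PARROT_GRPC_ADDR environment variable that the pipeline points at the Traffic Parrot container, an Enrollment test might dial the mock instead of the real Policy Administration service:

    // Hypothetical CI test for the Enrollment service. It talks to a Traffic
    // Parrot container on the build agent instead of the real PolicyService.
    package enrollment_test

    import (
        "context"
        "os"
        "testing"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"

        // Hypothetical stubs generated from the Proto contract above.
        policypb "example.com/bestow/policy/v1"
    )

    func TestCreatePolicyAgainstMock(t *testing.T) {
        // Set by the CI pipeline to the Traffic Parrot container's gRPC port.
        addr := os.Getenv("TRAFFIC_PARROT_GRPC_ADDR")
        if addr == "" {
            t.Skip("mock server address not configured")
        }

        conn, err := grpc.Dial(addr, grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            t.Fatalf("failed to connect to mock: %v", err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        // Call the mocked PolicyService exactly as the real one would be called.
        client := policypb.NewPolicyServiceClient(conn)
        resp, err := client.CreatePolicy(ctx, &policypb.CreatePolicyRequest{
            ApplicantId: "applicant-123",
            TermYears:   20,
        })
        if err != nil {
            t.Fatalf("CreatePolicy failed: %v", err)
        }
        if resp.GetPolicyId() == "" {
            t.Errorf("expected a policy id in the mocked response")
        }
    }

Because the test exercises the same generated stubs it would use against the real service, a breaking change to the Proto contract shows up as a compilation or test failure on the CI build agent rather than in a downstream environment.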

Wednesday, 25 March 2020

How to choose a service virtualization tool?

Most companies like to evaluate several tools before they commit to a purchase.

Typically, they evaluate service virtualization tools based on factors such as:
  • Cost
  • Protocols and technologies supported
  • Features
  • Performance benchmarks
  • Support level

Here are a few additional technical questions that might help you decide which of the tools you are looking at is the best fit:
  • Would you like to have a central team of administrators managing the new tool?
  • What kind of footprint would you like (RAM, disk usage, ...)?
  • What kind of licensing model would work best for your use case?
  • Do you need to keep the virtual services and deployment scripts in source control?
  • Are you looking for a tool that is a better fit for a microservices architecture or for a monolithic architecture?
These questions are based on: