We have just released version 5.37.8. Here is a list of the changes that came with the release:
Fixes
- Added support for protoc on Apple ARM-based M1/M2 processors running inside Docker
A number of Traffic Parrot customers use it across their whole enterprise as the primary and recommended API, system and service simulation tool.
We often get asked what a large-scale rollout like that would look like. We have created a high-level project overview that captures the process we have seen work well for our existing customers: API and Service Simulation Tool Implementation Project Plan
A significant number of Traffic Parrot customers run our tool in OpenShift. This blog post discusses a template process you can follow to do the same.
Here’s a quick terminology review to get us on the same page:
If the end goal is to deploy Traffic Parrot in OpenShift, you need the following:
There are several steps we need to go through to deploy Traffic Parrot in OpenShift:
We have just released version 5.37.7. Here is a list of the changes that came with the release:
An effective OKR (Objectives and Key Results) for a Director of Quality Assurance (QA) should include the following attributes:
As a QA Manager in 2023, keeping track of your team's performance and ensuring they are meeting company goals is essential. One way to do this is by implementing OKRs (Objectives and Key Results) and KPIs (Key Performance Indicators) for your team.
OKRs are a management tool that helps to align your team's goals with the company's objectives. They consist of an objective, the goal you want to achieve, and key results, which are the measurable outcomes that show progress towards that goal.
On the other hand, KPIs are metrics that help you track your team's performance. They can measure anything from the number of bugs found per sprint to the time it takes to complete a task.
To implement OKRs and KPIs effectively, it's essential to follow a few key steps:
By implementing OKRs and KPIs, you can ensure that your team is aligned with the company's objectives and track their performance towards achieving them. This will help you identify improvement areas and ensure that your team is working effectively and efficiently.
As a Head of QA, it's essential to measure your team's performance and ensure that your organization's products meet the highest standards of quality. Here are three KPIs that are commonly used in QA departments to track and measure quality:
By tracking these KPIs and continuously working to improve them, you can ensure that your organization's products meet the highest standards of quality and improve customer satisfaction.
If you create APIs for your partners and customers, you might need API mocks and simulators to help them onboard faster and with fewer issues. In this video, we describe the options you have to provide those mocks and simulators to your user base.
We have just released version 5.36.4. Here is a list of the changes that came with the release:
trafficparrot.virtualservice.handlebars.now.provider=HANDLEBARS
trafficparrot.virtualservice.handlebars.now.provider=WIREMOCK
trafficparrot.virtualservice.handlebars.now.dynamic=true
{{now offset='2 years' format='epoch' provider='WIREMOCK'}}
{{now format='short' provider='HANDLEBARS'}}
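For context (this interpretation is ours and not verbatim from the release notes): the trafficparrot.virtualservice.handlebars.now.provider property appears to select the default implementation backing the {{now}} helper, the dynamic=true flag appears to make the value re-evaluate on each request rather than being cached, and the provider parameter in a template call overrides the instance-wide default. As a rough illustration, the WIREMOCK provider with format='epoch' would typically render an epoch-milliseconds timestamp offset two years into the future, while the HANDLEBARS provider with format='short' would typically render a locale-specific short date.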
We have asked Anatoli Iliev about his experience building high-performance teams. He is a Software Engineering Manager with a decade of experience building teams for companies such as Infragistics, SumUp and VMware.
Wojciech:
What obstacles would typically, in your experience, prevent the team from delivering high-quality software?
Anatoli:
Firstly, I would like to start with a quick disclaimer. I'm expressing my own opinion here, and that opinion is not related to any company or person that I have a connection with.
In terms of obstacles, your question is a very good one. What are the obstacles that prevent a team from delivering high-quality software? I think there are a few, and they can appear all together or one by one. But the one thing that probably has the most influence on the quality of the software the team is producing is the team culture. I really think that the team should have an open culture, a culture which allows every single individual to be themselves and to express their feelings, their concerns, their comments and so on.
This should happen in a so-called healthy environment, so people can be themselves without hurting others. I can say that a team that works as a single unit is surely more successful than individual contributors on their own. I can give you an example. No matter how good a football player is, like Ronaldo or Messi, a single Ronaldo or Messi cannot beat a whole team playing against them. That's why team spirit, team culture and team values are most important. I think that's the most important thing. However, there are other factors that impact the quality of the software being delivered. I have seen teams, companies and organizations that change their priorities all the time. They change their priorities not just every day, but two or three times a day, for months on end. And that affects the delivery of high-quality software.
And of course, there is one more factor that's quite important, and I have seen this: the lack of experience in newly formed teams. When you have a newly formed team, you have individuals clashing with each other, trying to dominate, trying to find their place. And this prevents teams from delivering high-quality software, I can say. So as long as the team members know that there is a place for everyone and everyone is happy with that place, this helps the quality of delivery.
Wojciech:
When you talk about these personalities or egos clashing and people trying to find their place, you say one way you've addressed that was to highlight that there's a place for everyone in the team. Is there anything specific that you have done that people could use?
Anatoli:
I have seen situations where people were nominated to lead, but for one reason or another, they were not able to do it properly. The interesting thing is that these people always know that about themselves. It is a matter of open and honest communication to settle this and readjust things in a way that everyone feels more effective and happier, I can say. That's not always the most comfortable position for everyone. Sometimes people need to step out of their comfort zone to unleash their full potential. But that's more of a sense a leader develops through lots of honest and frank communication with the team. The one thing that's really helpful for me is to be fully transparent with everyone. This helps me set the team up in the proper configuration, I can say.
Wojciech:
And do you use any traditional Scrum ceremonies to apply full transparency, or how do you handle conflicts when they arise? Would you just approach team members and say "let's talk", or do you wait for a specific meeting?
Anatoli:
I am using all those Scrum ceremonies, in fact. But for the question you are asking, it's more about the current situation. Sometimes there is a need for a deep technical conversation, with arguments about what approach the team should take. If you have two strong individuals arguing about that, it's good to invite the whole team and have this deep technical conversation together. On the other hand, there are situations where these two strong individuals are arguing about things that are more on the personal side, like character, approach, and so on.
If you have someone in the team who has been there for a long time and is performing very well, they know that they are, let's say, at the leading edge of the team. And if you also have a rising star in the team who is quite pushy and trying to make their way, this can make the person at the leading edge feel pushed, and it becomes personal. They can start arguing and being defensive about every single technical proposition made by the other person. So this is something that is handled more in one-to-one conversations. And I can say that, unlike the previous example, one-to-one conversations are the key here. And, of course, this needs to be done openly and frankly.
Wojciech:
So the big thing is team culture, and you said that transparency and authenticity drive you towards building that team culture, an open culture. Anything else that comes to mind? Is it important to have clear communication and expectations within the team?
Anatoli:
As I mentioned, people should know and feel comfortable with who is who in the team. That's another big thing. They should be themselves and accept others. It takes some time for this to happen, but once you have this culture properly set in the team, it becomes easier for each new person joining the team. At the beginning, it takes more effort, but that makes sense.
Wojciech:
Once you've built this team culture and you've got a team that's not storming anymore but might be performing, you can talk about looking at metrics to drive continuous improvement of the software delivery process. What's your experience with that?
Anatoli:
Yes, that's exactly the case. You need proper metrics to measure what you are doing and how you are progressing. And there are a few things that are equally important. I think I can summarize them as three things: customer satisfaction, team velocity, and product quality. These three are equally important; you cannot put one above the others in any way. Customer satisfaction is important, but you can have a happy customer without having a team performing at their best. That way you are wasting resources, and you could do even better if you optimized the team and increased efficiency. On the other side, you can have good team velocity but low quality. This will affect customer satisfaction and future deliveries, because, for example, your architecture is not well defined and you cannot build on top of shaky foundations.
Wojciech:
So do you have any examples of how you actually measured those three? So customer satisfaction, what did you look at? For example, if you have a mobile app, you can look at the app store ratings or how many tickets you get for a specific feature. Have you had anything specific you could share?
Anatoli:
Well, I'm a big fan of proactive information gathering. I prefer direct communication with customers and users so they can give their view. I'm a big fan of having feedback buttons or forms that are easy to reach, not too noisy or annoying, but easy to reach, so people can leave their view with just one or two clicks. I really like the approach that Zoom has: after each Zoom call, you get either a thumbs up or a thumbs down, and it's one-click customer feedback. But I'm sure that means a lot. So that's something I strive to have in my products.
Having an easy and low barrier to providing feedback is important for getting more responses and gathering more information. At the same time, it's automated enough that you can process and analyze the feedback at scale. And yes, based on the feedback provided, I also communicate directly with customers to gather more information and get a deeper understanding of their needs and concerns.
Wojciech:
How do you measure team velocity and can you explain more about it?
Anatoli:
Team velocity is a way to measure the amount of work a team completes in a given period of time, such as one sprint, two weeks, or one month. The team should have their own way of estimating work, and over time they should become accustomed to it. This makes it easy to understand how much work was done in a specific period. It's important to note that work should be considered either fully done or not done at all, and not in a half-done state. To accurately measure velocity, it's important to break down work into small, meaningful chunks for the user. This takes time, but eventually it results in a fine-grained velocity that can be accurately measured.
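As a rough illustration of that velocity calculation (the sprint numbers, class and variable names below are hypothetical, not taken from the interview), a minimal sketch in Java:

// Minimal sketch: average sprint velocity from fully "done" story points.
import java.util.List;

public class VelocityExample {
    public static void main(String[] args) {
        // Story points fully completed ("done", never half-done) in the last four sprints.
        List<Integer> completedPointsPerSprint = List.of(21, 18, 24, 19);

        // Average velocity = mean of completed points per sprint.
        double averageVelocity = completedPointsPerSprint.stream()
                .mapToInt(Integer::intValue)
                .average()
                .orElse(0);

        System.out.printf("Average velocity: %.1f points per sprint%n", averageVelocity);
    }
}

Breaking work into small, fully done chunks, as Anatoli suggests, is what makes an average like this meaningful over time.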
Wojciech:
And how about quality? So how would you measure quality?
Anatoli:
Measuring quality can be relatively straightforward once you have effective communication with your users. One way to measure it is by tracking the number of defects or issues that arise over a certain period. These can be grouped by impact or severity to gauge the level of pain they cause for customers. However, it is important to note that this measurement is only effective when customers are engaged and invested in providing feedback. Without their engagement, it can be difficult to get an accurate picture of the quality of your product.
Measuring the reaction time to critical customer issues is an important metric for my teams. We measure how quickly we can resolve these issues, which helps improve team culture and overall performance.
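For illustration only (the defect data, severity labels and names below are hypothetical, not taken from the interview), a minimal Java sketch of the two measurements mentioned above: defect counts grouped by severity, and average resolution time for critical issues.

// Minimal sketch: defect counts by severity and average time to resolve critical issues.
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class QualityMetricsExample {
    // A customer-reported defect, with its severity and the time taken to resolve it.
    record Defect(String severity, Duration timeToResolve) {}

    public static void main(String[] args) {
        List<Defect> defects = List.of(
                new Defect("CRITICAL", Duration.ofHours(4)),
                new Defect("CRITICAL", Duration.ofHours(10)),
                new Defect("MAJOR", Duration.ofDays(2)),
                new Defect("MINOR", Duration.ofDays(5)));

        // Defect counts grouped by severity, a rough proxy for customer pain.
        Map<String, Long> countsBySeverity = defects.stream()
                .collect(Collectors.groupingBy(Defect::severity, Collectors.counting()));

        // Average resolution time for critical customer issues.
        double averageCriticalHours = defects.stream()
                .filter(defect -> defect.severity().equals("CRITICAL"))
                .mapToLong(defect -> defect.timeToResolve().toHours())
                .average()
                .orElse(0);

        System.out.println("Defects by severity: " + countsBySeverity);
        System.out.printf("Average time to resolve critical issues: %.1f hours%n", averageCriticalHours);
    }
}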
Too often, our clients reach out to us seeking API simulation help, and during the initial calls we discover they also need help with the transformation and technology adoption side of the project. We see that they approach the Agile and DevOps transformation in a "Waterfall way", doing large upfront designs and planning mass migrations.
For example, a company that hired one of our consultants to help with API simulation had spent the year before engaging with us building a new tech platform, with designs done upfront and little user feedback. The migration to the new platform was problematic due to unforeseen issues during its first real use. The company sought our guidance on how to "force" the new platform and API simulation onto the developers and testers. Our feedback in this situation was to keep in mind the J-Curve effect and the change capacity of teams, and, instead of forcing the new solution onto teams, to listen to their feedback carefully. The issues with the migration could have been prevented by doing smaller-scale incremental migrations with early feedback from the developers and testers working on the projects. We engaged with the development and testing teams and applied API simulation to solve the teams' high-priority problems, such as high numbers of UAT bugs. An incremental approach of introducing one new technique at a time allowed us to build mutual trust between the transformation and project teams.
Sonya Siderova has explained the general principle very well in her article "How the J Curve Effect Defines the Success of Your Transformation Initiative".
If you need help planning a project to use API simulation to accelerate your digital transformation and create automated tests faster, feel free to reach out to us.
We are excited to announce that Traffic Parrot is part of the Tech Excellence network.
Wojciech will speak at a Tech Excellence event in May 2023, sharing his thoughts on "Testing Microservices: 12 Black-Box Testing Techniques". Stay tuned for more details by subscribing to our newsletter: leave your email address in the field on the right!
Learn how to deliver quality software faster. Our vision is to raise the bar of technical excellence across the world.