Hi, this is Jing, a data scientist with a great passion for applying data science and big data technology in industry. Over the past few years, I have designed, implemented, and analysed many web experiments (called AB tests in the following text). More than half of them failed or were inconclusive, but all of them gave the team good insights into how to deliver a better product to our customers. To keep gaining valuable insights, running tests at a good pace and in a well-structured way is critical. Running a single AB test is not that difficult, but running many tests properly, with team members in different roles involved, can be tricky. So a certain process needs to be followed so that we can run multiple tests methodically and keep learning from them.
An AB Testing process within a product team
There are mainly 5 steps when you decide to run a web experiment within a product team. Not every experiment will have a winner, and most of them will fail. So don't aim for a winner; aim for what you can learn when you think about running a test. If tests are run and analysed properly, you will gain insights from each of them and ideas for the next experiment. This process of running AB tests is usually called the optimisation wheel, since the goal of running AB tests is to optimise the product.
- Create a solid Hypothesis from ideas
- Create a New design based on the hypothesis
- Experiment Implementation (including pre-testing, choosing the right testing tool…)
- Conduct Post-Test Analysis
- Follow-up actions based on learnings

Once you start optimising, you will not have only one AB test in the backlog. You will plan to run several tests one by one within a certain period. Here is a proposal for how your product team can manage several tests without a mess: keep an experiment specification for each test and make the best use of your dev-ops tool.
Experiment Specifications
The experiment specification is the key that links all the steps in the optimisation wheel together, and it is also the way to explain the test to an audience who was not involved in the test process. Besides, it makes it easy to share with your stakeholders:
- Why did we run this test?
- What did we do in this test?
- How did we reach the conclusion of the test?
- What did we learn and what are we going to do next?
The experiment specification includes all the information you want to know about this AB test, from hypothesis to executive summary. It is filled in collaboratively, at different stages, by the team conducting the experiment.
Before Actual Experiment
- Problem Statement & Hypothesis
- Test Set-up
- Test Implementation
- Which AB test tool to use and how
- Pre-test calculation (see the sketch after this list)
- How long we need to run the test
- Success Metrics to decide a winner
- Screenshots from Figma (showing what all the test variants look like)
During Experiment
- Check each variant (for multi-page tests)
After Experiment
- Result Analysis
- Suggested Actions
- Executive Summary
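
For the pre-test calculation, a common approach is a power calculation: given the current conversion rate, the smallest uplift worth detecting, and your traffic, you can estimate how many visitors each variant needs and how long the test has to run. Below is a minimal sketch in Python using statsmodels; the baseline rate, expected rate, and daily traffic are made-up numbers for illustration only.

```python
# A minimal pre-test calculation sketch: how many visitors per variant we need,
# and roughly how long the test has to run. The baseline rate, expected rate,
# and daily traffic below are made-up numbers for illustration.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.04               # current conversion rate (assumed)
expected_rate = 0.05               # rate we hope the new design reaches (assumed)
daily_visitors_per_variant = 1500  # assumed traffic per variant per day

# Cohen's h effect size for comparing two proportions
effect_size = proportion_effectsize(expected_rate, baseline_rate)

# Visitors needed per variant for 80% power at a 5% significance level
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.8, alternative="two-sided"
)

days_needed = n_per_variant / daily_visitors_per_variant
print(f"~{n_per_variant:,.0f} visitors per variant, roughly {days_needed:.0f} days")
```

If the numbers say the test would need to run for months, that is usually a sign to test a bigger change or pick a more sensitive success metric.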

Don't let any information get lost in Slack messages, and don't waste your time explaining similar things over and over again to different people. Your colleagues can read the specification by themselves if they need more details.
Process using a Dev-Ops tool
Most product teams use a dev-ops tool such as Jira or Azure DevOps to manage the team's workflow. Here is a recommendation on how to map the optimisation wheel onto the dev-ops tool.
- For Jira, create one Epic for each experiment. Each step is mapped to a story, and under each story there will be sub-tasks if the experiment is complicated.
- For Azure DevOps, create one Feature for each experiment. Each step is mapped to a story as well, and under each story there will also be sub-tasks.
Following this process, there will be 3-5 stories for each experiment, assigned to different team members.
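
If your team creates these issues by hand for every experiment, the set-up can also be scripted. Below is a hedged sketch using the Jira Cloud REST API with Python's requests library; the URL, credentials, project key, experiment name, and the way stories are linked to the Epic (the "parent" field here) are assumptions that vary between Jira instances, so treat it as a starting point rather than a drop-in script.

```python
import requests

JIRA_URL = "https://your-company.atlassian.net"  # placeholder base URL
AUTH = ("you@example.com", "api-token")          # placeholder credentials (email + API token)
PROJECT_KEY = "PROJ"                             # placeholder project key


def create_issue(fields):
    """Create a single Jira issue and return its key."""
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json={"fields": fields}, auth=AUTH)
    resp.raise_for_status()
    return resp.json()["key"]


# One Epic per experiment
epic_key = create_issue({
    "project": {"key": PROJECT_KEY},
    "issuetype": {"name": "Epic"},
    "summary": "AB test: example experiment",  # hypothetical experiment name
})

# One Story per step of the optimisation wheel, linked to the Epic
steps = [
    "Hypothesis & problem statement",
    "New design",
    "Experiment implementation",
    "Post-test analysis",
    "Follow-up actions",
]
for step in steps:
    create_issue({
        "project": {"key": PROJECT_KEY},
        "issuetype": {"name": "Story"},
        "summary": f"{step} - example experiment",
        # Linking stories to the Epic via "parent" works on team-managed projects;
        # older or company-managed projects may use an "Epic Link" custom field instead.
        "parent": {"key": epic_key},
    })
```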

For a lot of teams, tasks for analysts and PMs are not on the Agile board. However, they all play critical roles in the experiment. Analysts need to be onboarded as early as possible so that they can assure the quality of the test and analyse the test result properly. A proper analysis can take some time, so to ensure each test is planned and carried out well, it is necessary to manage the analysis tasks together with the rest of the team. The same goes for PMs, since insights and follow-up actions from previous tests provide great value for improving the product.
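
As a concrete example of what the result-analysis step can look like, a common check for a conversion-style success metric is a two-proportion z-test comparing control and variant. The sketch below uses statsmodels; the conversion counts and visitor numbers are placeholder values, not real results, and your own analysis may need segment breakdowns or different metrics on top of this.

```python
# A minimal post-test analysis sketch: a two-proportion z-test comparing the
# conversion rates of control (A) and variant (B). The counts are placeholders.
from statsmodels.stats.proportion import proportions_ztest

conversions = [480, 545]    # converted visitors in A and B (assumed)
visitors = [12000, 12050]   # total visitors in A and B (assumed)

z_stat, p_value = proportions_ztest(conversions, visitors)
rate_a, rate_b = conversions[0] / visitors[0], conversions[1] / visitors[1]

print(f"A: {rate_a:.2%}, B: {rate_b:.2%}, z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference is statistically significant at the 5% level.")
else:
    print("No significant difference; the test is inconclusive on this metric.")
```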
End
This whole proposal assumes you already have a hypothesis that has been prioritised in the product team's backlog. As for how to write a hypothesis and how to prioritise, I will write a separate blog post about it. Thanks for reading. I am Jing, a data scientist aiming to be better and better.