How to beat the recession with innovation, not stagnation: why now is the time to ramp up your organization's experimentation effort
The Optimizely Customer Workshop, hosted by Phil Nayna (Enterprise Account Executive at Optimizely) and Stephen Pavlovich (Founder/CEO of Conversion.com), brought together representatives from some of the UK’s biggest brands to share their thoughts and insights on Conversion Rate Optimisation (CRO). The workshop took the form of a roundtable, with discussion topics including “Building a lean testing program”, “Applying testing to business challenges” and, the buzzword of the moment, “Personalisation”.
Building a lean testing program
Experience with testing and experimentation among attendees ranged from businesses that were just starting out to those that had already built mature testing programs. This range of experience made for a rich discussion. For UK brands just starting their testing, the emphasis was that obtaining buy-in from stakeholders is key to building a testing program within their companies. For brands with more testing experience, the biggest challenge in building a lean program came with shifting their culture. Key to adopting a testing culture is acknowledging – and leveraging – the value of short-term testing and validation over long-term planning. That’s why the attendees all agreed that a short-term, iterative roadmap is far better than a long-term, rigid one.
Here at Conversion.com, experimentation is at the heart of everything we do and who we are. We believe that building a lean testing program and cultivating a testing culture rely on two key factors: education and sharing. Educating your employees on your philosophy of experimentation and its benefits is key, as it allows them to see experimentation as far more than just the value of winning tests. We value education highly: we run our own CRO training program for new associate consultants, which equips them to think creatively and ambitiously about experimentation and CRO. When sharing experiment results with clients, it’s crucial to cover not just what was tested and what the results were, but – more importantly – why we tested it and what it can teach us about their users. This means that with every experiment we learn more about their users, allowing us to refine and improve our testing strategy while delivering measurable uplift.
Applying testing to business challenges (prioritization of your testing roadmap)
Strategies for prioritizing testing roadmaps varied extensively within the workshop, with each brand favouring a different approach or primary metric. One major UK supermarket brand described a heavily data-driven approach – something we value highly at Conversion.com – prioritizing by ease of implementation, lack of organizational friction in getting the test launched, the potential impact of the test, and the data or evidence supporting the hypothesis. Another UK brand, short on development resource, prioritized by cost impact and favoured easier tests, as this allowed them to keep testing despite that barrier.
At Conversion.com we believe that the data driving a test is most important when prioritizing our tests. This data informs us of the impact that the test is likely to have. Secondary metrics, such as the ease of building the test and getting sign-off – as well as the other tests and hypotheses we have running in parallel – allow us to see how and when this test fits into our roadmap. However, it is important to note that prioritization can be limited. There are finite swimlanes to test and finite resources, meaning prioritization and planning have to be coherent. Understanding that testing roadmaps have to be flexible and adaptive is key. This allows us to easily change our roadmap according to the performance of previous tests and as our understanding of users improves.
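The weighting described above – supporting data first, likely impact second, ease of build and sign-off as secondary – can be sketched as a simple weighted score. The factor names, scales and weights below are illustrative assumptions for the sketch, not a methodology taken from Conversion.com or Optimizely:

```python
# Hypothetical weighted prioritization score for an experiment backlog.
# Factors, 1-5 scales and weights are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class ExperimentIdea:
    name: str
    evidence: int  # strength of data supporting the hypothesis, 1-5
    impact: int    # likely impact on the primary metric, 1-5
    ease: int      # ease of build and sign-off, 1-5


# Weights reflect the view that supporting data matters most,
# impact second, with ease as a secondary tie-breaker.
WEIGHTS = {"evidence": 0.5, "impact": 0.3, "ease": 0.2}


def priority(idea: ExperimentIdea) -> float:
    """Weighted sum of the three factors."""
    return (WEIGHTS["evidence"] * idea.evidence
            + WEIGHTS["impact"] * idea.impact
            + WEIGHTS["ease"] * idea.ease)


backlog = [
    ExperimentIdea("Checkout trust messaging", evidence=4, impact=3, ease=5),
    ExperimentIdea("New navigation structure", evidence=5, impact=5, ease=2),
    ExperimentIdea("Button colour tweak", evidence=1, impact=1, ease=5),
]

# Order the roadmap highest-priority first.
for idea in sorted(backlog, key=priority, reverse=True):
    print(f"{idea.name}: {priority(idea):.1f}")
```

Because the scores are just numbers, re-prioritizing after a test result changes your evidence is a matter of updating a field and re-sorting – which mirrors the flexible, adaptive roadmap described above.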
Personalisation

Personalisation is the buzzword of the moment in CRO, and this topic divided our workshop audience. Some UK brands stated that they had banned the word completely; instead, they talk about creating more relevant customer experiences and concentrating on more targeted journeys. All representatives agreed that their personalization journey was at an early stage, believing it important to keep personalization simple and get tests live in order to gain momentum. However, we believe this could increase the risk of companies starting personalization too early and, as a result, missing valuable opportunities to increase their conversion rate with all-audience A/B testing. With personalization being such a hot topic, it is critical that companies take the time to integrate it into their wider digital strategies, rather than implementing it without consideration for other key areas of CRO.
At Conversion.com, we view personalization as optimizing conversion by increasing the relevance of experiences for specific audiences. Although we see personalization as a great and exciting new opportunity to test, we believe it is important to successfully assess when to start personalization. By its nature, it forces you to focus on a subset of users, potentially diminishing the impact of experiments as well as complicating future all-audience experiments.
The Optimizely Customer Workshop was the perfect setting for valuable discussions and an insight into how the UK’s biggest brands approach experimentation. From the workshop the key takeaways were:
- CRO education needs to be more highly valued within businesses in order to drive a shift in testing culture.
- Giving testing programs visibility by sharing results and learnings helps employees understand the value of testing beyond just the uplift of winning tests.
- Roadmaps should be as flexible and adaptive as possible to allow for test-and-learn iteration.
- Personalisation should be undertaken only when its potential exceeds that of all-audience testing, and should integrate with – rather than replace – typical A/B testing for CRO.