The importance of testing velocity (not speed) in growth

Article written by
Stuart Brameld
An agile, iterative approach to marketing is at the heart of how many leading growth marketing teams operate. We have talked before about the growth marketing process and the benefits of building a culture of experimentation.
One of the best ways to measure growth marketing success is to shift the focus from monitoring outputs (traffic and leads) to monitoring inputs (experiment speed and quality).
Inputs are what you actually control.
Why testing velocity matters
When Satya Patel joined Twitter as VP of Product in 2010, one of the things he insisted on was that the Twitter team move from roughly one test every other week to 10 tests per week. Twitter grew rapidly between 2010 and 2012, and it is widely believed this had a lot to do with that twenty-fold increase in testing velocity (from 0.5 tests per week to 10 tests per week).

Similarly, Sean Ellis describes how growthhackers.com hit a plateau at around 90,000 active users after its first year post-launch. Without any increase in budget or changes to the team, they grew from 90,000 to 152,000 active users in 11 weeks by dedicating themselves to high-velocity testing.

"Move fast and break things. Unless you are breaking stuff, you are not moving fast enough."
Mark Zuckerberg
Geoff Charles, VP of Product at Ramp, describes how the world's fastest-growing SaaS company approaches work:
"In essence, we believe that doing is better than planning. Any second you spend planning is a second you don’t spend doing. The moment you are aligned in a direction, you don’t need a high level of accuracy. It’s impossible and costly to try to predict what you can do—and part of our competitive advantage has been that we can respond very quickly to the change in environment, strategy, or customer feedback. We learn something new every day that helps us adjust our plan."
Measuring testing velocity
There are a number of metrics and KPIs available for monitoring a growth marketing, experimentation or conversion rate optimisation programme. Essentially these boil down to two areas:
Increase tests run (quantity)
Increase tests won (quality)
How many tests can I run? Testing capacity (quantity)
How many tests am I running? Testing velocity, testing coverage (quantity)
Am I getting better? Trends over time (quantity)
Am I running effective tests? Win rate, lift amount, expected value (quality; a simple calculation of these is sketched below)
Am I running tests effectively? ROI (quality)
Am I getting better? Trends over time (quality)
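To make the quality side concrete, here is a minimal sketch of how win rate and expected value per test might be calculated from an experiment log. The record format and field names are illustrative assumptions, not a prescribed schema:

```python
# Minimal sketch: quality metrics calculated from an experiment log.
# The field names ("won", "lift") are illustrative; "lift" is the measured
# relative improvement delivered by a winning test.

experiments = [
    {"name": "Homepage headline test", "won": True,  "lift": 0.12},
    {"name": "Pricing page CTA",       "won": False, "lift": 0.00},
    {"name": "Onboarding email copy",  "won": True,  "lift": 0.05},
]

wins = [e for e in experiments if e["won"]]

win_rate = len(wins) / len(experiments)                # tests won / tests run
avg_winning_lift = sum(e["lift"] for e in wins) / len(wins)
expected_value_per_test = win_rate * avg_winning_lift  # expected lift from the next test

print(f"Win rate: {win_rate:.0%}")
print(f"Average winning lift: {avg_winning_lift:.0%}")
print(f"Expected value per test: {expected_value_per_test:.1%}")
```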
Testing velocity is the number of experiments run in a given time period, tracked as a trend over time against a goal based on your testing (or experimentation) capacity.
For example, assume your average test takes a month to research and launch, and you have a team of 5 growth marketers (a simple capacity calculation is sketched below):
1 experiment per person per month = 5 experiments per month
1 experiment per person per week = 20 experiments per month
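As a rough sketch of that calculation, with team size and per-person cadence as the only inputs (the numbers are the ones assumed above):

```python
# Rough sketch: monthly testing capacity as a function of team size and the
# number of experiments each person can realistically run per month.

def monthly_capacity(team_size: int, experiments_per_person_per_month: float) -> float:
    return team_size * experiments_per_person_per_month

team_size = 5
print(monthly_capacity(team_size, 1))  # one test per person per month -> 5 per month
print(monthly_capacity(team_size, 4))  # one test per person per week  -> 20 per month
```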
The difference between speed and velocity?
Speed and velocity are both used when describing motion, and are often used interchangeably, but they have very different meanings.
"There is a difference between “speed” and “velocity”. Speed is how fast something is going, while velocity is how fast something is going in a certain direction."
In physics, the official definition of velocity is “the rate at which an object changes its position”. Unlike velocity, speed has no direction: it simply describes how fast an object moves.
This is where many teams fall foul of delivering outputs over outcomes. Speed focuses on the work, activities and things done, without any consideration of the value delivered (to the organisation or to the customer). Velocity measures both the quantity of work done and its quality, by ensuring alignment with a particular direction (typically traffic, leads, MQLs and so on).

As growth marketers, we want to be focused on testing velocity. You can consider this as speed aligned with a specific business outcome or goal.
The more tests you run, the more you learn about how to grow your business. So it’s only natural to want to run as many tests per period of time as possible.
Agile velocity
Agile velocity is extremely popular and well known within the software development world: it is the average amount of work a team completes during a development iteration or sprint of a given length.
As above, software development teams want to release new features and functionality quickly (i.e. speed) but must also release features that add value (i.e. velocity). Customers don't care about tickets completed, or features shipped. Similarly, they don't care about how many articles you have on your blog, or the number of marketing campaigns you run every year.
Speed matters, but we also need to ensure that we are speeding in the right direction. Running into a wall faster is not a recipe for product success.
Velocity makes it easy for agile teams to estimate how much work they can complete per sprint and roughly how long a project will take to reach a given milestone. It is, however, a metric calculated retrospectively (how many backlog items were delivered in the sprint?) and so is not a reliable way to budget or forecast on its own.
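As a simple illustration (the story-point figures are made up), velocity is just the average of what was actually delivered in recent sprints, which can then give a rough, retrospective estimate of how long a backlog might take:

```python
# Sketch: agile velocity as the average story points completed per sprint,
# used for a rough estimate of sprints remaining. All figures are illustrative.

completed_points_per_sprint = [21, 18, 25, 20]   # last four sprints

velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)

remaining_backlog_points = 120
sprints_remaining = remaining_backlog_points / velocity

print(f"Velocity: {velocity:.1f} points per sprint")
print(f"Estimated sprints remaining: {sprints_remaining:.1f}")
```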
Monitoring testing velocity
Testing velocity is a key metric we monitor in Growth Method, and it should be tracked in any marketing project management software to ensure delivery of outcomes over outputs.
"Smart leaders realise that the key is not in doing a lot, fast, but in doing the right things, and that’s where they spend most of their energy—helping the organisation make good decisions."
