The importance of testing velocity (not speed) in growth marketing
An agile, iterative approach to marketing is at the centre of many leading growth marketing teams. We have talked before about the growth marketing process and the benefits of building a culture of experimentation.
One of the best ways to measure growth marketing success is to shift the focus from monitoring outputs (traffic and leads) to monitoring inputs (experiment speed and quality).
“Inputs are what you actually control.” – Andrew Chen
Why testing velocity matters
When Satya Patel joined Twitter as VP of Product in 2010, one of the things he insisted on was that the team move from roughly one test every other week to 10 tests per week. Twitter grew rapidly between 2010 and 2012, and it is widely believed this had a lot to do with that twentyfold increase in testing velocity (from 0.5 tests per week to 10).
Similarly, Sean Ellis describes how growthhackers.com hit a plateau at 90,000 active users after its first year. Without any increase in budget or changes to the team, it grew from 90,000 to 152,000 active users in 11 weeks by committing to high-velocity testing.
“Move fast and break things. Unless you are breaking stuff, you are not moving fast enough.” – Mark Zuckerberg
Monitoring testing velocity
There are a number of metrics and KPIs available for monitoring a growth marketing, experimentation or conversion rate optimisation programme. Essentially, these boil down to two areas:
- Increase tests run (quantity)
- Increase tests won (quality)
| Want to know | Metric | Area |
| --- | --- | --- |
| How many tests can I run? | Testing capacity | Quantity |
| How many tests am I running? | Testing velocity, testing coverage | Quantity |
| Am I getting better? | Trends over time | Quantity |
| Am I running effective tests? | Win rate, lift amount, expected value | Quality |
| Am I running tests effectively? | ROI | Quality |
| Am I getting better? | Trends over time | Quality |
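The “quality” row metrics are simple to compute once you log results per experiment. Here is a minimal sketch, using entirely hypothetical experiment names and lift figures, of how win rate, average winning lift and expected value per test relate to each other:

```python
# Hypothetical data: a log of completed experiments.
experiments = [
    # (name, won?, observed lift as a fraction of the baseline)
    ("new-headline", True, 0.12),
    ("pricing-page-cta", False, 0.0),
    ("onboarding-email", True, 0.05),
    ("social-proof-banner", False, 0.0),
]

wins = [e for e in experiments if e[1]]
win_rate = len(wins) / len(experiments)         # tests won / tests run
avg_lift = sum(e[2] for e in wins) / len(wins)  # average lift of winning tests

# Expected value per test: chance of a win times the average winning lift.
expected_value = win_rate * avg_lift

print(f"Win rate: {win_rate:.0%}")  # 50%
print(f"Average winning lift: {avg_lift:.1%}")  # 8.5%
print(f"Expected lift per test run: {expected_value:.2%}")
```

Tracked over time, these tell you whether your tests are getting better, independent of how many you run.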
Testing velocity is about the number of experiments run in a particular time period and tracking this trend over time, towards a goal based on your testing or experimentation capacity.
For example, assume your average test takes a month to research and launch, and you have a team of 5 growth marketers:
- 1 experiment per person per month = 5 experiments per month
- 1 experiment per person per week = roughly 20 experiments per month
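The capacity arithmetic above can be sketched in a few lines (the team size and per-person cadence are just the assumptions from the example):

```python
def monthly_capacity(team_size: int, experiments_per_person_per_month: int) -> int:
    """Upper bound on experiments the team can start in a month."""
    return team_size * experiments_per_person_per_month

team = 5  # growth marketers, as in the example above

print(monthly_capacity(team, 1))  # one test per person per month -> 5
print(monthly_capacity(team, 4))  # roughly one per person per week -> 20
```

Your actual velocity is then measured against this capacity: if you can start 20 tests a month but only started 8, the gap is where to look.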
Velocity vs Speed
Speed and velocity are both used when describing motion, and are often used interchangeably, but they have very different meanings.
“There is a difference between ‘speed’ and ‘velocity’. Speed is how fast something is going, while velocity is how fast something is going in a certain direction.” – Lukas Vermeer
In physics, velocity is defined as “the rate at which an object changes its position”. Unlike velocity, speed has no direction: it simply describes how fast an object moves.
This is where many teams fall foul of delivering outputs over outcomes. Speed focuses on work done (activities, things shipped) without any consideration of the value delivered to the organisation or the customer. Velocity measures both the quantity of work done and its quality, by ensuring alignment in a particular direction (typically traffic, leads, MQLs etc.).
As growth marketers, we want to be focused on testing velocity. You can consider this as speed aligned with a specific business outcome or goal.
“The more tests you run, the more you learn about how to grow your business. So it’s only natural to want to run as many tests per period of time as possible.” – via GrowthHackers
Agile velocity is extremely popular and well known within the software development world. It is the average amount of work a software team completes during a development iteration, or sprint, of a given timeframe.
As above, software development teams want to release new features and functionality quickly (speed), but must also release features that add value (velocity). Customers don’t care about tickets completed or features shipped. Similarly, they don’t care how many articles you have on your blog or how many marketing campaigns you run every year.
Speed matters, but we also need to ensure that we are speeding in the right direction. Running into a wall faster is not a recipe for product success.
Velocity makes it easy for agile teams to estimate how much work they can achieve per sprint and how long it will take to get a project to a certain level of growth. It is, however, a metric calculated retrospectively (how many backlog items were delivered in the sprint?), and hence not a good way to budget or forecast.
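To make the retrospective nature of the metric concrete, here is a small sketch with made-up sprint data: velocity is just the average of what was actually delivered, which you can then use for rough forward estimates but not as a commitment:

```python
import math

# Hypothetical history: story points delivered in each of the last 5 sprints.
completed_points_per_sprint = [21, 18, 24, 19, 23]

# Velocity is calculated retrospectively as the average delivered per sprint.
velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)
print(f"Average velocity: {velocity} points per sprint")  # 21.0

# A rough estimate (not a forecast): sprints needed for a 105-point backlog.
remaining_backlog = 105
print(f"Estimated sprints remaining: {math.ceil(remaining_backlog / velocity)}")
```

The same logic applies to testing velocity: the average tells you your current pace and direction, but it is a trailing measure, not a plan.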
Making testing velocity a core metric
Testing velocity is a key metric we monitor in Growth Method, and it should be a key metric within any marketing project management software, to ensure delivery of outcomes over outputs.
“Smart leaders realise that the key is not in doing a lot, fast, but in doing the right things, and that’s where they spend most of their energy—helping the organisation make good decisions.” – Itamar Gilad, https://itamargilad.com/velocity-vs-impact/
We believe testing velocity should be a key consideration in your marketing strategy and marketing operations.
Got questions? Ping me on LinkedIn or on Twitter.
Other articles you might like
Here are some related articles and further reading you may find helpful.
- How Coca-Cola Uses Marketing Experimentation as a Superpower
- Marketing experimentation best practices [+ our best performing experiments]
- Steven Bartlett on building a culture of experimentation
- Growth hypothesis best practices & high impact work