I was working with an omnichannel brand that was transforming its approach to analytics and measurement. The common view among stakeholders was that analytics and measurement were not having the full impact on business performance that they should.
There were 3 key factors in play:
- Siloed internal teams and the number of external partner agencies involved made coordination problematic.
- No one within the business owned marketing analytics and measurement or championed the value it could provide.
- Data collection and insights were siloed; when presented by partner agencies or internal marketing teams, they were not shared with the wider business.
The business decided to build its own in-house analytics + effectiveness team, sitting within the BI team so it could provide a more independent view of measurement than if it sat directly within the marketing team.
The opportunity for the new analytics + effectiveness team, and the impact it could have on the business, was boundless.
The key areas of focus:
- Measurement: Take control of all measurement with a top-down approach across the wider business. Where measurement was run mainly by partner agencies, its impact was not felt beyond the media, marketing, and brand teams.
- Data Collection & Ownership: Despite the rich amount of data available, it was all siloed, mainly saved in Excel files, with no consistency and no single source of truth.
- Dashboarding & Insights: Dashboarding was siloed and platform-led, not providing the insights needed to answer business and marketing questions.
One key area of misalignment was not understanding the funnel. A process of collecting good-quality data and building visualisations is a step towards bridging that gap and driving better insights.
This translated into hiring a range of skill sets across the team: data engineers, data scientists, analysts, digital analytics specialists, a strategist, and a project manager.
The marketing performance team had become too reliant on digital platform data, sucked into thinking that Google Analytics and custom attribution models could provide the whole truth and nothing but the truth.
A free audit of the analytics set-up in H2 2020 found a total of 15 custom attribution models built and used over the 3.5 years since 2017. The challenges this created:
- It encouraged short-term thinking over long-term thinking
- ROAS became king
- It muddled thinking over which model to trust: last click or the attribution model
- There was no consistency in the data, so forecasting and understanding target v delivered was difficult because different models were used every financial year
- The custom attribution models only looked at the last 4 touchpoints, which is not a reflection of the end-to-end user journey
- The impact of brand campaigns, external factors, seasonality, and budget weighting was not exposed
Google Analytics was being used in a way it was not built for. It is a site analytics platform, not a media measurement platform.
In addition to these insights, my recommendation was that they had to look at measurement more holistically rather than treating Google Analytics attribution as the holy grail that would answer all business and marketing questions.
The approach was very much efficiency over effectiveness. This had broader implications:
- Brand budget was at 20% in 2017; by 2020 it had reduced to 10%, and with the changed climate in 2020 brand budget was cut entirely
- An imbalanced media mix, with 70% of the performance budget going into Google
- In-store sales accounted for around 70%, but online sales were likely to reach around 40% in the next 18 to 24 months
- Not understanding brand metrics and their impact on sales
- Not understanding the true impact of baseline sales
- Overspending on channels / tactics and not considering the law of diminishing returns
- Without significant investment in brand, the business was not engaging future customers who were not yet active in the segment, meaning the segment of target customers would shrink
- An over-reliance on promotions which has a direct impact on margins
- Not using Google Analytics for its strength to better understand traffic acquisition, on-site engagements, and user journey
The whole approach had to be ripped up.
Building the foundations
With fully functioning media, marketing, and brand teams, the analytics + effectiveness team was one of the final pieces of the puzzle to drive growth through marketing. The focus was very much on effectiveness, but also on looking across the full spectrum of marketing data.
The process started in H1 2021 against an uncertain landscape, but with a long-term vision of where the business needed to be. The idea was to bring in key hires over the next 6 to 9 months and build out the team while using partners, with the goal of the team being in place by January 2023. This period was vital for starting to educate the business on analytics and measurement.
Q2 2021 was all about the quick wins that could be implemented:
The audit completed in H2 2020 showed the overall analytics set-up was good; it was the approach that created the challenges. Nine months later, the improvements suggested by the audit were no longer worth implementing given all the changes taking place. The focus was on 2 areas:
- With Google having announced Google Analytics 4 six months earlier, it made sense to get ahead of the game and implement a minimum viable solution so that data would be collected and ready when Google Analytics 3 was no longer around.
- With the flawed last-click / attribution approach using GA3, one of the domino effects was forecasting that never aligned with what was delivered. One of the biggest challenges for the marketing team was data availability, which meant they had to lean on platform data as their single source of truth. This was exacerbated by the marketing team having to wait 2 months for the sales team to provide sales / revenue data to feed into the forecasting. Forecast Forge (an easy, quick solution to help with forecasting) was selected to put some science behind the numbers, and a 3-year forecast was built from H2 2021 covering the financial year period.
- At the end of the financial year, the delivered numbers for 21/22 were 5% under the forecast that Forecast Forge provided. Previous years had delivered anywhere from +/- 15% to 30% v forecast.
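Forecast Forge itself is a spreadsheet add-on, so its internals are not shown here, but the idea of putting some science behind the numbers can be sketched with a much simpler seasonal-naive forecast. This is a minimal illustration with made-up monthly figures, not the tool's actual method:

```python
# Illustrative only: a seasonal-naive forecast with a year-over-year
# growth adjustment. All figures below are invented.

def seasonal_naive_forecast(history, periods_ahead, season_length=12):
    """Forecast each future period as the same season last year,
    scaled by the most recent year-over-year growth factor."""
    recent = sum(history[-season_length:])
    previous = sum(history[-2 * season_length:-season_length])
    growth = recent / previous  # year-over-year growth factor
    forecast = []
    for h in range(periods_ahead):
        base = history[-season_length + (h % season_length)]
        years_out = 1 + h // season_length  # compound beyond one year
        forecast.append(base * growth ** years_out)
    return forecast

# 24 months of hypothetical monthly revenue (in thousands), seasonal.
history = [100, 95, 110, 120, 115, 130, 125, 135, 140, 150, 170, 200,
           105, 100, 115, 126, 121, 137, 131, 142, 147, 158, 178, 210]
next_year = seasonal_naive_forecast(history, 12)
```

Comparing delivered numbers against a forecast like this (here, the 5% miss v the previous +/- 15-30%) is what made the approach credible.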
Fast forward to Q2 2022…
Getting analytics and measurement to be central to the marketing team and wider business meant ripping up the siloed teams across the business and the failed processes, and starting again, supported by a 4-step approach:
How Analytics and Measurement can win
Understanding how Analytics and Measurement could flourish within the business was the first hurdle to overcome:
- Educating via training and workshops to empower the rest of the business
- Having champions within each team, not only in marketing but across the wider business, who can challenge the Analytics and Measurement teams
- Using visualisation through dashboards to win the hearts and minds
- Any project roadmaps and process created should include key team members from marketing and the wider business
Alignment of strategies
To this point there had been a misalignment of strategies between business, marketing, and measurement, which resulted in going down a rabbit hole of overspending in performance marketing, driven by Google Analytics providing inaccurate insights. This was made more apparent when looking at the year-over-year performance of key business metrics: sales, brand awareness, and market share were all in decline.
Aligning strategies required going through 5 steps to build out an analytics and measurement strategy. The Measurement Canvas was used to articulate the requirements and ensure alignment with the journey the business was on.
It required a top-down approach, with the business setting clear business objectives and the business questions that needed to be addressed. A shift in business strategy required a change in measurement strategy going forward: the business was looking at marketing becoming a profit centre, helping to grow and build the brand and its brand value. Past strategies had been very much performance-focused; it now became more about effectiveness than efficiency.
When it came to selecting KPIs, there were 3 key buckets to consider. It had to take into account that:
- The primary marketing KPIs are aligned to the business KPIs and can deliver against the business goals and objectives
- The brand's stage of growth informs the choice, selecting KPIs where progress can be measured and overachieved
- The secondary marketing KPIs need to ladder up to the primary marketing KPIs
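The laddering rules above can be expressed as a small validation check. The KPI names below are hypothetical placeholders, not the brand's actual KPIs:

```python
# Hypothetical KPI ladder: every secondary KPI must roll up to a
# primary marketing KPI, which must roll up to a business KPI.
business_kpis = {"revenue_growth"}
primary_marketing_kpis = {"incremental_sales": "revenue_growth"}
secondary_marketing_kpis = {
    "brand_awareness": "incremental_sales",
    "share_of_search": "incremental_sales",
}

def ladders_up(secondary):
    """True if the secondary KPI ladders up to a primary KPI that is
    itself aligned to a business KPI."""
    primary = secondary_marketing_kpis.get(secondary)
    return (primary in primary_marketing_kpis
            and primary_marketing_kpis[primary] in business_kpis)
```

A check like this keeps new KPIs from being added without a clear line back to a business objective.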
There is no silver bullet when it comes to marketing measurement; no single analytics technique can provide all the answers for measuring short- and long-term impact. The answers come from using a range of techniques:
The focus being on Econometrics, Brand Equity and Incrementality.
Econometrics will quantify the short- to medium-term impact of advertising outcomes, but it can also factor in non-advertising variables such as weather, pricing, promotions, and seasonality. It provides insight into channel contribution and how much each channel can be scaled, by understanding diminishing returns. Econometrics also bridges the link to brand equity, making it possible to understand the base and the impact of brand awareness and consideration.
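As a sketch of the mechanics involved, econometric / marketing-mix models typically transform media spend with a carryover (adstock) effect and a saturation curve to capture diminishing returns. The decay and half-saturation parameters below are illustrative assumptions, not the brand's fitted model:

```python
# Illustrative transformations used in most marketing-mix models.

def adstock(spend, decay=0.5):
    """Geometric adstock: spend carries over into later periods,
    decaying by `decay` each period."""
    carried, out = 0.0, []
    for s in spend:
        carried = s + decay * carried
        out.append(carried)
    return out

def saturate(x, half_saturation=100.0):
    """Simple saturation curve for diminishing returns: response
    flattens as (adstocked) spend grows."""
    return x / (x + half_saturation)

# Hypothetical weekly spend for one channel.
weekly_spend = [50, 80, 120, 0, 0, 200]
response = [saturate(a) for a in adstock(weekly_spend)]
```

The saturation curve is what tells you how far a channel can be scaled before each extra unit of spend stops paying back.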
Brand Equity will quantify the long-term impact of advertising on brand equity. The focus is on understanding the direct and indirect effects and how they impact the base: what is the advertising impact on brand equity, and which channels help drive brand, to help optimise budgets. It also covers the impact of creative and which brand health measures are triggered.
Incrementality will provide insight into how to reduce media wastage and how to scale and reallocate budgets. To run incrementality successfully, many of the inputs will come from econometrics, which is where the initial focus lies.
The immediate focus is on Econometrics, but other techniques will be used:
- Share of Search to help predict market share
- Paid, Earned and Owned: understanding the relationship between the three, with the goal of saving budget on paid media
- The Jones approach: understanding the relationship between SOV and SOM, with the goal of setting budgets based on SOV estimation
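Share of Search itself is a simple calculation: a brand's search volume as a proportion of total category search volume. A minimal sketch with hypothetical volumes:

```python
# Illustrative Share of Search calculation; volumes are made up.

def share_of_search(search_volumes):
    """search_volumes: {brand: monthly search volume}. Returns each
    brand's share of total category search volume."""
    total = sum(search_volumes.values())
    return {brand: vol / total for brand, vol in search_volumes.items()}

# Hypothetical three-brand category.
volumes = {"our_brand": 40_000, "rival_a": 35_000, "rival_b": 25_000}
sos = share_of_search(volumes)
```

Tracked over time, shifts in this share are what make it useful as a leading indicator of market share.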
The data requirements for any analytics project are exhaustive. The approach taken was to define data requirements not only for analytics, but also to provide the right data and insights to power the wider business, not just marketing.
Econometrics is all about using historic data to guide the future, so the decision was made to collect data from 2017 onwards: 6 years of data in total. The data collected would be core to the dashboards built to give everyone a view of performance.
Understanding the digital analytics data and user journey was a core requirement, made possible by building out a data layer. It was also an opportunity to think about which data variables were needed and what value they could provide. The value came from:
- Enriching the data layer with other data sources to provide insights
- Segmentation to better understand website performance
- Creating customer journey funnels
- Build activation segments
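As an illustration of the customer journey funnels the data layer enables, the sketch below computes step-to-step conversion rates from event counts. The event names and figures are hypothetical:

```python
# Illustrative funnel built from data-layer event counts (made up).
funnel = {
    "page_view": 100_000,
    "product_view": 40_000,
    "add_to_cart": 8_000,
    "checkout": 3_000,
    "purchase": 2_100,
}

def step_conversion(funnel):
    """Conversion rate from each funnel step to the next."""
    steps = list(funnel.items())
    return {f"{a} -> {b}": n2 / n1
            for (a, n1), (b, n2) in zip(steps, steps[1:])}

rates = step_conversion(funnel)
```

Conversion rates like these are what turn raw data-layer events into a view of where the journey leaks.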
Marketing Data Warehouse
With the strategy mapped out, the challenge was to create a process that builds trust in marketing data and puts data in the hands of all internal and external teams to drive insights. Given the exhaustive data collection across multiple data sources and the range of use cases for the data, building a marketing data warehouse was the only viable solution. The warehouse:
- Proves the business value and is core to executing the strategy
- Gives access to all the raw data that is needed
- Ensures data validation and data quality are at the heart of the whole process
- Reinforces that Excel files and manual reports should be a thing of the past
- Puts visualisation at the heart of driving insights and analysis
- Drives efficiency and productivity
Google BigQuery was the chosen tool for the marketing data warehouse; Microsoft Azure was also considered, but as the business was already paying for Google Analytics 360 it made sense to align with BigQuery. Supermetrics was selected to bring the vast number of data sources into BigQuery, and Google Sheets would also be used.
There was going to be duplication of data for a lot of platforms, as it was possible to back-fill data from previous years, which was a bonus. It was decided that a whole new naming convention was required (for channel groupings, channels, campaign placement, categorisation, etc.), so data from 2017 onwards would be uploaded via Google Sheets after a process of data manipulation. For most data sources it was possible to track back to 2017.
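The backfill step can be sketched as a simple remapping of legacy channel names to the new convention before upload. The names below are hypothetical, not the brand's actual taxonomy:

```python
# Illustrative remapping of legacy channel names to a new naming
# convention before backfilling historical rows. Names are made up.
LEGACY_TO_NEW = {
    "ppc": "paid_search",
    "adwords": "paid_search",
    "fb": "paid_social",
    "display - gdn": "display",
}

def remap_channel(legacy_name):
    """Map a legacy channel label to the new convention; fail loudly
    on anything unmapped so bad rows never reach the warehouse."""
    key = legacy_name.strip().lower()
    if key not in LEGACY_TO_NEW:
        raise ValueError(f"Unmapped legacy channel: {legacy_name!r}")
    return LEGACY_TO_NEW[key]

rows = [{"channel": "AdWords", "spend": 1200.0}]
cleaned = [{**r, "channel": remap_channel(r["channel"])} for r in rows]
```

Failing on unmapped names, rather than passing them through, is what keeps the new convention consistent across six years of data.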
It is a vast data set coming from a variety of sources; most of it will be automated, but some will have to be updated manually.
Which translated into a roadmap:
- Phase 1: Aligning the strategy and framework to data requirements and sources
- Phase 2: Map out the schema for all data sources with sample data set from the different sources
- Phase 3: Agree on new naming convention that can be used going forward and backfilled for past data
- Phase 4: Manually manipulating past Google Analytics, channel, and campaign data to align with new naming convention
- Phase 5: Setting up Big Query + pipelines from Supermetrics, Google Sheets and Python
- Phase 6: Building out dashboards and visualisations
The quality of data helps when it comes to measurement and modelling, but it also gives dashboards richness. The variety of dashboards required across marketing and the wider business was scary. The availability of good data provides a view that was not possible before. Some of the dashboards provide a different context, and with everything automated the focus can be on insights and optimisation:
- Match rate report: Validates quality of data
> Worked on a solution to build a match rate report looking at revenue reported in Google Analytics v data warehouse. With the match-rate limit at 90% with Google Analytics data having no less then 10% discrepancy pic.twitter.com/1lFcivsFVL
>
> — DIPESH SHAH (@mrdipeshashah) May 5, 2022
- Tracking budgets: Actual Spend v Delivered Spend
- Benchmarking: Being able to look at key metrics year on year, quarter on quarter and month on month performance
- Website funnel: Data coming from the data layer to build a funnel
- Sales split: View of where sales are coming from website or in-store
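The match rate report can be sketched as a simple comparison of GA-reported revenue against the warehouse, flagging days that fall below the 90% threshold described above. All figures are invented for illustration:

```python
# Illustrative match-rate check: GA revenue v warehouse revenue,
# flagging days with more than a 10% discrepancy. Figures made up.

def match_rate(ga_revenue, warehouse_revenue):
    """GA-reported revenue as a proportion of warehouse revenue."""
    return ga_revenue / warehouse_revenue

daily = [
    ("2022-05-01", 9_400.0, 10_000.0),   # 94% match: acceptable
    ("2022-05-02", 8_500.0, 10_000.0),   # 85% match: flag it
]
flagged = [(day, round(match_rate(ga, wh), 3))
           for day, ga, wh in daily
           if match_rate(ga, wh) < 0.90]
```

Surfacing the flagged days on a dashboard turns data validation from a manual chore into a standing report.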
With the financial year ending in June 2023, the data collection / marketing data warehouse was completed by December 2022, allowing measurement to be front and centre for 2023. The first round of econometrics would take 8 to 10 weeks, delivered in April 2023, with a refresh and update in July 2023 to add H1 2023 data, delivered by August 2023.
The aim is to do a second round of econometrics later in the calendar year, complemented by other techniques, mainly Share of Search and potentially the Jones approach.
Econometrics will evolve in terms of speed of delivery over the next 12 months; hopefully footfall data will be added as a variable, and it will be interesting to see how that bridges the gap in understanding the impact of campaigns on driving in-store sales. The insights coming out of econometrics will allow incrementality to go live in 24/25, then brand equity in 25/26, more aligned to the change in strategy and the data required.