The goal for January 2024 was to have an analytics set-up that was bulletproof for the future. The e-commerce business launched in 2017 and grew considerably from 2020 to 2022; in those two years it delivered its five-year plan.
With growth plans and investment being made in the business and in marketing, analytics was going to be a big part of driving that growth, and Google martech would be a significant player in the ecosystem. This was the first phase in developing analytics capabilities in-house; it would provide the foundations needed to invest in other areas.
The project started in September 2023, heading into Q4, the peak period for the e-commerce business, when the in-house team and agency partner were very much focused on delivering for Q4. A small team of myself as a consultant, a developer and the head of analytics worked on developing the Google martech so the project could be delivered within the timeline.
The rationale for going live in January and starting the project in Q4:
- Budget was approved in August, and September is the start of the financial year
- A January go-live allows data collection across a whole calendar year, making trends visible
- It implements the best analytics set-up for the live websites, and the same process can be duplicated for new market / website launches
- It allows other data projects in the pipeline to take place the following year; the Google martech project is the start
Diagnosis
The business was set up around a localised-domain approach, with .fr, .co.uk, .it and .es websites. For analytics, this meant a single Google Analytics property and a single Google Tag Manager container sitting across multiple websites.
The existing set-up was not optimal going forward:
- With the business likely to expand into new markets and launch new websites, a single Google Tag Manager container could become unmanageable, with its size impacting performance
- Each website's requirements may change independently of the others, making them harder to manage within one Google Tag Manager container
- A single Google Analytics property hosting multiple websites, once connected to BigQuery, is likely to hit the export limit of 1 million daily events as new markets / websites are launched
The best route forward was a single Google Analytics property and a single Google Tag Manager container for each website, with BigQuery connected to each Google Analytics property.
This set-up would allow:
- Each market's Google Tag Manager has its own set-up, unique to its requirements. The trade-off with multiple websites is that the time invested will be far greater than managing a single Google Tag Manager container
- Each market's Google Tag Manager container stays small, so container size and its impact on performance are no longer a concern
- External partners can be brought in to provide analytics support for an individual website without sharing access to everything
- CMP implementation has different requirements in every market; a single Google Tag Manager container per market provides more flexibility in how the CMP can be implemented
- It ensures a higher level of data quality in Google Analytics by having one property for each website
- GA4 has a limit of 50 custom metrics and 50 custom dimensions per property; with one Google Analytics property for each website there should be no concerns about data collection
- It provides a better view of the customer journey, as each Google Analytics property is linked to one website
- Linking each website's Google Analytics property to BigQuery provides a huge amount of flexibility in analysing data, without the restrictions of the data available within the Google Analytics interface
Building the plan
Time was the biggest factor in the implementation; it required meticulous planning to ensure all deadlines could be met.
- Project started w/c 18th September
- Any code changes could be deployed w/c 9th October
- Final code changes would be deployed w/c 30th October
- Code freeze from w/c 6th November for 7 weeks
- BFCM: Black Friday 24th November and Cyber Monday 27th November
- Project had to be completed by Friday 15th December ready to go live end December / Monday 1st January
Knowing the key deadlines and the structure that needed to be implemented helped build out the plan.
The key phases of the plan:
w/c 25/09 – w/c 02/10
- Duplicating the single Google Tag Manager container into multiple Google Tag Manager containers for each market / website
- Set-up of new Google Analytics properties for each market / website
w/c 02/10 – 11/12
- Developing a staging environment which includes Google Tag Manager and Google Analytics for each website that reflects the production website
w/c 09/10 – w/c 16/10
- Moving from a single Google Tag Manager container to one per market was critical; the change had to be implemented early to ensure minimal disruption, leaving time to fix anything that went wrong
w/c 30/10 – w/c 06/11
- Enhancements to the dataLayer had to be implemented in w/c 30th October
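As an illustration of the kind of dataLayer enhancement involved, here is a minimal add_to_cart push following the standard GA4 ecommerce schema. The product values are made up, and the `window` guard is only there so the sketch also runs outside a browser.

```javascript
// Minimal sketch of a GA4-style add_to_cart dataLayer push.
// In the browser this is window.dataLayer; product values are hypothetical.
var root = typeof window !== 'undefined' ? window : globalThis;
root.dataLayer = root.dataLayer || [];

root.dataLayer.push({
  event: 'add_to_cart',
  ecommerce: {
    currency: 'GBP',
    value: 24.99,
    items: [{
      item_id: 'SKU_123',           // hypothetical SKU
      item_name: 'Example product', // hypothetical product name
      price: 24.99,
      quantity: 1
    }]
  }
});
```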
Implementation
Setting up a new Google Tag Manager container and Google Analytics property for each website provided an opportunity to do things right from the start.
The global Google Analytics property for all markets would continue to collect data for at least another year, allowing year-on-year comparisons, and would be implemented in each website's Google Tag Manager container. Beyond that additional year of data it provides little extra value, and with new markets / websites likely to launch it will become redundant in the new set-up. With BigQuery connected to every website's Google Analytics property, a global view can be created from the BigQuery data instead.
The new set-up comprised 4 Google Tag Manager containers, 4 Google Analytics accounts and properties, and 4 BigQuery data streams.
Previously, the staging environment shared the same Google Tag Manager container and Google Analytics property as the production environment. Now each website's staging environment has its own Google Tag Manager container, Google Analytics property and BigQuery data stream.
Each website for production and staging will have:
- 1 x Google Analytics account, 2 x Google Analytics properties
- 1 x Google Tag Manager account, 2 x Google Tag Manager containers
- 2 x Big Query data streams
Connecting Google Analytics to BigQuery provided an opportunity to collect richer data that is not readily available in the GA4 interface. This was made possible through the set-up of the Google Analytics tags in Google Tag Manager.
Every Google Analytics event tag would have a similar set-up. This is an example for the Add to Cart Google Analytics tag.
Event settings variable – all the variables are attached to every Google Analytics event tag, so each time a tag fires the data is collected and available in BigQuery.
Set-up in each tag – the tag_parameters need updating manually when setting up a new tag.
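For the manually maintained tag_parameters, a GTM custom JavaScript variable is one way to resolve a value from the dataLayer. This is a hypothetical sketch: the `pageType` key and the named assignment are illustrative (in GTM the variable is an anonymous function), and the `window` guard is only for running the sketch outside a browser.

```javascript
// Hypothetical GTM custom JavaScript variable, assigned to a name here
// for readability. It returns the most recent pageType pushed to the
// dataLayer, for use as one of the manually set tag_parameters.
var getPageType = function () {
  var root = typeof window !== 'undefined' ? window : globalThis;
  var dl = root.dataLayer || [];
  // Walk backwards so the latest push wins.
  for (var i = dl.length - 1; i >= 0; i--) {
    if (dl[i] && dl[i].pageType) return dl[i].pageType;
  }
  return 'unknown';
};
```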
The additional data collected in each Google Analytics event tag provides a richness of data, and with BigQuery making a huge amount of data available, the following views could be built:
For any new market / website launch there is now a Google martech structure that can be implemented, with a template Google Tag Manager container for production and staging ready to go live. With BigQuery, the SQL code can also be easily replicated across markets.
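Because every market's GA4 export lands in its own BigQuery dataset with an identical schema, replicating the SQL can be as simple as templating the dataset name. A hypothetical sketch follows: the dataset IDs and the helper function are made up, while `events_*` is the standard GA4 export table pattern.

```javascript
// Hypothetical dataset IDs for each market's GA4 BigQuery export.
var MARKET_DATASETS = {
  uk: 'analytics_111111',
  fr: 'analytics_222222',
  it: 'analytics_333333',
  es: 'analytics_444444'
};

// Build one "global view" query by UNION ALL-ing the identical
// per-market export tables, tagging each row with its market.
function buildGlobalViewSql(datasets) {
  var parts = Object.keys(datasets).map(function (market) {
    return "SELECT '" + market + "' AS market, event_date, event_name " +
           "FROM `" + datasets[market] + ".events_*`";
  });
  return parts.join('\nUNION ALL\n');
}
```

The same templating idea applies to any per-market query: write it once, swap the dataset ID per market, and union the results when a global view is needed.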
Next Steps
The structure that has been implemented provides flexibility in how data solutions relevant to each website can be deployed:
- Implementation of a CMP, dependent on regulations and requirements in each market
- Implementation of server-side tag manager, driven by the data challenges and needs of each website
- Linking Google Search Console to BigQuery, providing richer data and the ability to merge its datasets with the Google Analytics raw data in BigQuery


