How I approach measurement, data architecture, and attribution for enterprise brands
Years in Digital Analytics
Enterprise Brands Instrumented
Industry Awards Won
S&P 500 & FTSE 100 Client Experience
Every engagement begins with a measurement audit. I map all existing data collection against business objectives, identify tracking gaps, and design a unified architecture before a single tag is written. This prevents the common failure mode of accumulating misaligned data over years of tactical decisions.
The architecture defines: which events fire, what properties they carry, how they join across platforms, and who owns each signal. For GA4 implementations, this means designing event schemas that are consistent across web and app surfaces, future-proof against GA4's evolving data model, and compatible with downstream BigQuery export for analysis beyond the GA4 UI.
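As a concrete illustration of what such an event schema can look like, the sketch below defines events with required properties and a named owner, plus a validation check. The event names, properties, and owners are invented for the example, not drawn from any client implementation.

```python
from dataclasses import dataclass

# Hypothetical schema entries: event name, required properties, and the
# team accountable for the signal. All names are illustrative.
@dataclass(frozen=True)
class EventSpec:
    name: str               # GA4 event name, shared across web and app
    required_params: tuple  # properties every fire must carry
    owner: str              # who owns this signal

SCHEMA = {
    "sign_up": EventSpec("sign_up", ("method", "surface"), "growth"),
    "purchase": EventSpec("purchase", ("transaction_id", "value", "currency"), "commerce"),
}

def validate(event: str, params: dict) -> list:
    """Return the missing required properties for an event payload."""
    spec = SCHEMA.get(event)
    if spec is None:
        return [f"unknown event: {event}"]
    return [p for p in spec.required_params if p not in params]
```

Keeping the schema in code (or a shared spec file) is what makes consistency across web and app surfaces enforceable rather than aspirational.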
For S&P 500 and FTSE 100 clients including BT, Dell, Microsoft, GSK, Nestlé, HSBC, Aston Martin, and Shell, measurement architecture has been the foundational work that made attribution, CRO, and reporting possible.
Last-click attribution misallocates budget. I apply a layered approach: data-driven attribution within GA4 as a baseline, augmented with media mix modelling (MMM) for channel-level incrementality, and conversion lift studies where controlled holdout groups are feasible.
For clients with complex multi-touch journeys — B2B enterprise like AON and Knight Frank, or automotive brands with long consideration windows like Aston Martin and Mercedes-Benz — I build custom attribution views in BigQuery that combine GA4 session data with CRM signals and offline conversion imports.
The output is not a single attribution number. It is a decision framework: which channels demonstrate incrementality, which are taking credit for organic demand, and where marginal spend yields marginal return.
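The gap between attribution views can be made concrete with a toy example. The sketch below compares last-click and linear credit over a few invented journeys; channel names and paths are illustrative only, not client data.

```python
from collections import Counter

# Invented journeys: ordered channel touchpoints, each ending in a conversion.
journeys = [
    ["paid_search", "email", "direct"],
    ["social", "paid_search", "paid_search"],
    ["email", "direct"],
]

def last_click(paths):
    """All credit to the final touchpoint."""
    credit = Counter()
    for p in paths:
        credit[p[-1]] += 1.0
    return credit

def linear(paths):
    """Credit split evenly across every touchpoint in the path."""
    credit = Counter()
    for p in paths:
        for channel in p:
            credit[channel] += 1.0 / len(p)
    return credit
```

Under last-click, paid_search earns credit only where it closes a journey; under linear, its upper-funnel touches surface. Neither view is the answer on its own, which is why the output is a decision framework rather than a single number.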
Analytics is only as trustworthy as the pipeline that feeds it. I design and implement pipelines that ingest data from GA4, paid media APIs, CRMs, and affiliate networks into a structured warehouse — typically BigQuery — with defined schemas, load audit logs, and data quality checks at every stage.
Pipeline design follows strict layer separation: raw ingestion is append-only and never modified; business logic and transformations live in a separate processing layer; analytics views are read-only. This pattern — applied across commercial analytics and my own data platform at CannabisDealsUS — prevents the data corruption that comes from mixing concerns.
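A minimal sketch of that layer separation, using an in-memory store in place of BigQuery datasets (class and field names are illustrative):

```python
class Pipeline:
    """Toy three-layer pipeline: raw is append-only, business logic lives
    in the processing layer, and reporting sees a read-only view."""

    def __init__(self):
        self._raw = []  # raw ingestion layer: append-only, never modified

    def ingest(self, record: dict) -> None:
        # Store a copy so callers cannot mutate raw rows after ingestion.
        self._raw.append(dict(record))

    def transform(self) -> list:
        # Processing layer: transformations run on copies; raw is untouched.
        return [{**r, "channel": r["channel"].lower()} for r in self._raw]

    def analytics_view(self) -> tuple:
        # Analytics layer: an immutable view for reporting.
        return tuple(self.transform())
```

The point of the pattern is that a bug in a transformation can always be re-run from untouched raw data, instead of silently corrupting the source of truth.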
For clients with GDPR constraints, I implement consent-aware pipelines using OneTrust and server-side tag management, ensuring that only consented data flows into downstream systems while maintaining measurement continuity through modelling.
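In spirit, the consent gate reduces to a filter applied before any downstream load. The sketch below is a simplified stand-in: the consent map and purpose names are invented, and a real implementation would read consent state from the CMP (such as OneTrust) rather than a dictionary.

```python
# Hypothetical consent state keyed by user: which purposes each user granted.
CONSENT = {
    "user_1": {"analytics"},
    "user_2": set(),  # no consent granted
}

def consented(events: list, purpose: str = "analytics") -> list:
    """Pass through only events whose user granted the given purpose."""
    return [e for e in events if purpose in CONSENT.get(e["user_id"], set())]
```

Gating at the pipeline boundary, rather than in each downstream tool, is what keeps the guarantee auditable: nothing past this point can contain unconsented data.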
Conversion rate optimisation is not a list of best practices. It is a repeatable scientific method: observe, hypothesise, test, measure, decide. I structure CRO programmes around funnel analysis to identify high-impact friction points, qualitative research to understand user intent, and controlled A/B tests with pre-defined success metrics and minimum detectable effects.
For ecommerce clients such as Canon, statistical rigour determines test duration and sample size before any test launches — not after. For B2B clients like AON and Dell, where conversion events are sparse, I use proxy metrics and multi-step funnel analysis to measure directional impact within reasonable timelines.
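The pre-launch power calculation behind those decisions can be sketched with the standard two-proportion approximation (the 5% significance and 80% power defaults are conventional assumptions, not fixed client settings):

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(baseline: float, mde: float,
                        alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate per-variant sample size for a two-proportion A/B test.
    baseline: control conversion rate; mde: absolute lift to detect."""
    p1, p2 = baseline, baseline + mde
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / mde ** 2)
```

Running this before launch makes test duration a fact about traffic and effect size, not a negotiation after the results arrive.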
Experimentation frameworks are also applied to measurement itself: incrementality experiments, holdout tests, and geo-lift studies to validate attribution assumptions rather than taking platform-reported numbers at face value.
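At its core, a holdout or geo-lift readout is a comparison of conversion rates between exposed and withheld groups. A minimal sketch (all figures invented):

```python
def incremental_lift(treated_conv: int, treated_n: int,
                     holdout_conv: int, holdout_n: int) -> float:
    """Relative lift of the treated group's conversion rate over the holdout's."""
    treated_rate = treated_conv / treated_n
    holdout_rate = holdout_conv / holdout_n
    return (treated_rate - holdout_rate) / holdout_rate
```

If the holdout converts at nearly the same rate as the exposed group, the channel is claiming credit for demand that would have converted anyway, whatever the platform dashboard reports.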
The purpose of a report is a decision, not a number. I design reporting frameworks that map metrics to the specific decisions each stakeholder needs to make: marketing spend allocation for CMOs, channel-level optimisation for performance teams, and customer lifecycle analysis for product and CRM teams.
Dashboards are built in Looker Studio connected directly to BigQuery, ensuring that executives and analysts work from the same underlying data model. Automated weekly digests reduce the time between data and action. For market intelligence use cases — such as the CannabisDealsUS Cannabis Price Index — reporting is designed to be citable, reproducible, and archived with a permanent DOI.
The same pipeline and reporting discipline applied to enterprise clients powers the weekly Cannabis Price Index — a structured, reproducible pricing intelligence report built on BigQuery, with subcategory breakdowns across THC and CBD markets. Each edition is archived to Zenodo with a permanent DOI for academic and media citation.
View Cannabis Price Index | Cannabis Pricing Research
S&P 500 and FTSE 100 engagements, agency partnerships, and founder-led product work
GSK Nestlé BT HSBC Dell Microsoft Shell Aston Martin Mercedes-Benz AON Knight Frank Canon Aviva O2 Coca-Cola