Marketing measurement is still built on convenience.
Clicks are easy to report. Attribution dashboards are easy to screenshot. But neither answers the structural question boards are increasingly asking: what growth did media create, and what would have happened anyway?
In a recent App Talk with David Murphy, Dane Buchanan, Chief Data & Analytics Officer at M+C Saatchi Performance, positioned incrementality not as a technical upgrade but as a shift in accountability. Attribution describes correlation; incrementality isolates causation. The distinction becomes material the moment budget scrutiny intensifies.
Correlation is not growth
Sales movements rarely result from a single driver. Promotional cycles, seasonality, and external demand all contribute to fluctuations. Black Friday illustrates the point: revenue increases regardless of paid activity. Media may amplify that effect, but it does not generate the entire uplift.
If a campaign coincides with a 10% sales increase, some portion of that lift reflects baseline demand. Incrementality isolates the additional contribution of media. Without that separation, correlation is mistaken for causation.
The objective is not to capture consumers who were already intent on purchasing. The objective is to influence those who were not. Measurement frameworks should reflect that commercial logic.
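The baseline-versus-media distinction above can be sketched as a simple holdout comparison: expose one group to media, withhold it from a comparable control group, and treat only the difference as incremental. This is a minimal illustration with hypothetical figures, not the methodology discussed in the talk; real tests require matched groups and statistical significance checks.

```python
# Minimal sketch of incremental lift from a holdout test.
# All numbers are hypothetical.

def incremental_lift(treatment_sales: float, control_sales: float,
                     treatment_size: int, control_size: int) -> float:
    """Per-user incremental sales: treatment rate minus control (baseline) rate."""
    treatment_rate = treatment_sales / treatment_size
    control_rate = control_sales / control_size
    return treatment_rate - control_rate

# The exposed group sells more, but the holdout group (no ads) also grew.
# Only the difference between the two rates is attributable to media.
lift = incremental_lift(treatment_sales=110_000, control_sales=100_000,
                        treatment_size=50_000, control_size=50_000)
print(f"Incremental sales per user: {lift:.2f}")  # 0.20
```

Note that if the control group had grown as much as the exposed group, measured lift would be zero even though sales rose, which is exactly the Black Friday scenario described above.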
When measurement becomes fragmented
Most organisations operate multiple measurement systems in parallel: platform attribution, media mix modelling, and controlled incrementality testing. Each method may be technically robust. In practice, they are often owned by different teams and produce conflicting interpretations.
Fragmentation shifts effort away from allocation decisions and toward reconciliation. The consequence is slower optimisation and weaker strategic clarity. A unified framework, by contrast, aligns modelling approaches into a single decision environment, reducing ambiguity around what is working, why it is working, and how budgets should adjust.
Measurement should simplify decisions, not complicate them.
Measuring real-world impact in app marketing
Source: Business of Apps via YouTube
Privacy volatility and the need for continuity
The introduction of App Tracking Transparency altered access to user-level data. Regulatory changes continue to reshape data availability across regions. When measurement depends heavily on deterministic identifiers, reported ROI can shift overnight.
An aggregated, privacy-resilient approach mitigates that volatility. By modelling performance without reliance on individual-level tracking, brands retain measurement continuity: ROI reported today remains comparable with ROI reported in the future, regardless of regulatory evolution.
That continuity carries internal implications. Marketing leaders frequently defend budget allocations to CFOs and CEOs. If reported returns fluctuate due to methodological changes rather than business reality, credibility erodes. Stable measurement frameworks protect that credibility.
AI as infrastructure, not label
Artificial intelligence operates within this system as an enabler rather than a headline feature. Model development benefits from accelerated iteration and automation, shortening the cycle between hypothesis and validation while maintaining human oversight.
Generative systems extend this further by translating model outputs into structured insights. Rather than simply presenting statistical results, the platform surfaces implications and recommended actions. The value lies in compressing the interval between detection and decision.
Capabilities in this area continue to advance, particularly in development velocity. Features that previously required extended engineering cycles can now be tested and deployed more rapidly.
Expanding the scope of ROI
One brand using the platform evaluated media impact exclusively against online sales, despite operating significant offline channels. When offline impact was incorporated into the modelling framework, measured ROI increased threefold. The underlying performance had not changed. The definition of performance had expanded.
With a more comprehensive view, the brand shifted from conservative allocation to confident scaling. The same framework enabled pre-campaign forecasting, allowing projected impact to inform budget decisions before spend was committed.
Improved measurement does not guarantee favourable outcomes. It may reveal inefficiency or overinvestment. That transparency is equally valuable. Capital allocation improves when incremental contribution is visible, whether positive or negative.
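The threefold shift described above is definitional rather than operational, and the arithmetic is worth making explicit. The figures below are illustrative assumptions, not numbers from the discussion: widening the revenue base counted as incremental changes the reported ratio while spend and actual performance stay fixed.

```python
# Hypothetical illustration: reported ROI before and after including
# offline incremental revenue. Figures are assumed, not sourced.

def roi(incremental_revenue: float, spend: float) -> float:
    """Incremental return per unit of media spend."""
    return incremental_revenue / spend

spend = 100_000.0
online_incremental = 150_000.0   # assumed online incremental revenue
offline_incremental = 300_000.0  # assumed offline incremental revenue

online_only_roi = roi(online_incremental, spend)
full_roi = roi(online_incremental + offline_incremental, spend)

print(f"Online-only ROI: {online_only_roi:.1f}x")  # 1.5x
print(f"Full-scope ROI:  {full_roi:.1f}x")         # 4.5x, i.e. threefold higher
```

The same mechanism cuts both ways: a fuller scope can just as easily reveal that some channels contribute less incrementally than attribution suggested.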
From reporting to decision support
The emphasis throughout the discussion was not on dashboards but on decision architecture. Measurement systems should provide a stable, privacy-compliant view of incremental impact, integrate multiple modelling approaches, and support forward-looking budget decisions.
When growth is defined as incremental contribution rather than attributed activity, media investment becomes easier to defend, optimise, and scale.