Meta has become very good at talking about artificial intelligence. Every product update, every earnings call, every developer conference features a version of the same narrative: AI is transforming advertising, and Meta is leading it. Some of this is true. Some of it is marketing. Understanding which is which matters if you are running paid campaigns and trying to make sensible decisions about where to spend your strategic effort.
This post is not a feature overview. It is an attempt to separate the signal from the noise.
The most significant real change in Meta’s ad system over the past three years is in how the delivery algorithm processes conversion signals and connects them to potential customers. The system is substantially more capable of finding people likely to convert than it was when advertiser-managed audience targeting was the dominant approach.
This is not theoretical. Broad targeting, in which campaigns run with minimal audience constraints and the algorithm identifies its own signals, consistently outperforms tightly managed custom audience builds for most account types. The model has access to behavioural signals that no advertiser could replicate through manual targeting, and it processes them at a scale and speed that makes human audience management look slow by comparison.
Advantage+ Shopping Campaigns represent Meta’s most ambitious expression of this: a campaign type where the advertiser provides creative and a budget, and the system manages audience, placement, and optimisation with minimal human input. For e-commerce accounts with sufficient conversion volume and well-structured creative, the results have been genuinely competitive with manually managed campaigns.
Smart creative tools have also moved beyond novelty. Dynamic Creative Optimisation has been part of the platform for years, but the underlying capability has improved. The system can now test combinations of headlines, images, and descriptions at scale and surface genuinely useful performance differentiation, provided you are feeding it creative components that are structurally distinct rather than minor variations of the same concept.
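To see why structurally distinct components matter, it helps to look at how quickly the combination space grows. The sketch below uses hypothetical component counts (they are not Meta limits) to show the number of distinct ads a Dynamic Creative test can assemble:

```python
from itertools import product

# Hypothetical creative components for a Dynamic Creative test.
# Names and counts are illustrative only, not platform limits.
headlines = [
    "Free shipping over £50",
    "New season, new staples",
    "Rated 4.8 by 12,000 customers",
]
images = ["lifestyle_shot", "product_on_white", "ugc_testimonial"]
descriptions = ["Shop the collection", "Limited stock available"]

# Every headline/image/description pairing is a distinct ad variant.
combinations = list(product(headlines, images, descriptions))
print(len(combinations))  # 3 x 3 x 2 = 18 variants
```

If the three headlines are minor rewordings of one concept, the system is testing 18 variants of the same idea; if they represent three distinct angles, the same budget buys genuinely different learnings.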
What Meta says has changed but is more complicated
Meta’s attribution modelling has been significantly altered by iOS 14.5 and subsequent privacy changes. The platform now relies on modelled conversions to fill in the gaps left by reduced signal visibility. This is presented as a solution to the attribution problem, and in one sense it is: the reported numbers look more complete than they would without modelling.
The complication is that modelled conversions are estimates. They are statistically informed estimates built on observed conversion patterns, but they are not the same thing as measured conversions. When the platform reports that a campaign drove 200 purchases and 80 of those are modelled, you are working with a blend of measured and inferred data that is difficult to interrogate.
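A simple way to keep this distinction visible is to break reported results into their measured and modelled components before drawing conclusions. This sketch uses the 200/80 split from the example above; the 30% guardrail threshold is an illustrative assumption, not a platform rule:

```python
# Sketch: separating measured from modelled conversions in reported results.
# Figures mirror the worked example; the threshold is an assumption.
reported_purchases = 200
modelled_purchases = 80
measured_purchases = reported_purchases - modelled_purchases

modelled_share = modelled_purchases / reported_purchases
print(f"Measured: {measured_purchases}, modelled share: {modelled_share:.0%}")

# Guardrail: when a large fraction of the reported number is inferred
# rather than observed, validate the read against backend sales data.
if modelled_share > 0.3:
    print("High modelled share: cross-check against backend sales data")
```

The point is not that modelled conversions are wrong, only that a performance read resting heavily on inferred data deserves independent confirmation.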
AI-generated creative, rolled out across Meta’s product suite as a way to produce text variations, image backgrounds, and expanded creative formats automatically, is real and functional. Whether it produces better results than well-briefed human creative work is a different question. The evidence so far suggests it is useful for generating volume at the testing stage but has not displaced strategy-led creative in terms of top-of-funnel performance or brand coherence.
Advantage+ Audience, Meta’s AI-driven audience tool, operates on similar logic to broad targeting but with some additional signals from the advertiser’s customer data. The results vary considerably by account. It works well in accounts with strong conversion signal history. In newer accounts or those with lower conversion volume, it can be slower to stabilise and less predictable in delivery.
What has not changed
The offer is still the thing. Meta’s algorithm can find the right people with remarkable efficiency, but it cannot make a mediocre offer compelling. If the product is overpriced relative to alternatives, if the value proposition is unclear, if the creative is not communicating anything worth paying attention to, the algorithm’s targeting precision becomes irrelevant. You are just finding the right people to bounce.
The creative brief still determines creative performance. Generative tools can produce variations, but they cannot produce the strategic insight that makes an ad worth testing. Understanding which customer problem to speak to, which angle resonates at the awareness versus consideration stage, which format fits the platform context: these are questions that need answering before the first frame is produced.
Measurement still requires human architecture. The platform’s attribution modelling fills in the gaps, but it does not replace the need for an independent measurement framework. If you are relying solely on Meta’s reported results to evaluate whether campaigns are working, you are reading a document Meta wrote about its own performance. Blended metrics, incrementality testing, and a clear view of business-level outcomes are still the responsibility of the human running the account.
Channel strategy has not been automated. Meta cannot tell you whether your budget should be weighted towards Facebook and Instagram or whether TikTok or YouTube would generate better returns for your specific audience and product. That is a strategic question that requires a view across the whole system.
The practical implication
The accounts that perform best in Meta’s current environment are not the ones that have handed everything to Advantage+ and stepped back. They are the ones where a human has made good decisions about creative strategy, measurement architecture, and offer clarity, and then given the algorithm the clean signal and structural freedom to do what it is genuinely good at.
That division of labour, human strategy directing automated execution, is where the value sits now. The advertisers who misread the moment are those who either refuse to engage with Meta’s AI tools at all, or those who engage with them so completely that the strategic layer disappears.
Neither extreme produces good results. The middle ground is where the performance is.