Publicis Groupe just acquired AdgeAI, a startup that uses AI to track exactly which creative and video assets drive engagement and conversion — and optimize campaigns in real time. This acquisition is not an isolated tech bet; it signals a structural shift in how the world’s largest agency networks are arming themselves to solve the most expensive problem in modern marketing: producing mountains of content and having no reliable way to know what actually works before you spend the media budget to find out.
This tutorial will show you exactly how AI creative intelligence works, how to integrate it into your campaign workflow, and what results you can realistically expect — using the Publicis/AdgeAI deal as the lens into a broader capability set that is now table stakes for serious performance marketers.
What This Is
AdgeAI is a startup that applies machine learning to the problem of creative performance — specifically, identifying which visual assets and video creatives are generating engagement and conversion versus which ones are draining your budget. According to the Marketing Dive report published March 13, 2026, AdgeAI uses AI to track creative and video asset performance and help optimize campaigns in real time.
The technology sits in a category now broadly called AI Creative Intelligence — a layer of analytics and prediction that sits between your creative production pipeline and your media activation. Unlike traditional A/B testing, which requires live media spend to determine a winner, AI creative intelligence tools can predict performance before a dollar is committed to distribution, and continue to surface signals about which assets deserve more investment as a campaign runs.
Publicis Groupe, one of the “Big Three” agency holding companies that now dominate the global advertising industry (alongside Omnicom Group and WPP), is folding AdgeAI into its Leona engine — the AI infrastructure that powers Publicis’ “Intelligent Creativity” strategy across paid, earned, shared, and owned channels. Publicis Groupe CEO Arthur Sadoun framed the acquisition directly: “In the AI era, brands don’t simply need more content. They need to know what works, and crucially, why, in order to immediately scale their creative messaging.”
The underlying technical problem AdgeAI is solving is real and increasingly expensive. Generative AI has destroyed the cost barrier to content production. Brands can now produce thousands of creative variants at a fraction of what it used to cost. The bottleneck has moved from production to curation — figuring out which of those thousands of assets has the visual hierarchy, emotional resonance, and CTA placement that will actually convert. That’s the problem AdgeAI’s machine learning models are trained to solve.
At the infrastructure level, research from the NotebookLM analysis of this acquisition points to an emerging class of technology called Content Sifting Storage (CSS) — systems specifically engineered to read only “sifted,” relevant metadata generated by deep learning models, reducing read latency by up to 94.8% compared to scanning full content libraries. This is the backend architecture that makes real-time creative performance tracking viable at enterprise scale.
Platforms like Entropik and Dragonfly AI operate in adjacent territory, using predictive AI to simulate how users visually scan and emotionally respond to creative content. These tools run attention heatmap modeling and emotion prediction before launch — a “pre-flight check” for creative assets that previously didn’t exist.
The AdgeAI acquisition is Publicis making this capability proprietary and tightly integrated into its Leona engine — so that when a brand running on Publicis infrastructure produces content, the creative intelligence layer is native, not bolted on.
Why It Matters
The content glut is not a future problem. It is the current operating condition for every brand and agency team reading this.
Generative AI tools have made creative volume cheap. The consequence is that media budgets are now being spent to test at scale what used to be decided in smaller, more deliberate pre-production processes. This is an expensive way to learn. When a 30-second video underperforms by 40% against benchmark because the brand logo appears two seconds too late, every dollar of media spend on that asset was partially wasted — and that waste was knowable before launch if a predictive creative intelligence layer had been in the workflow.
Research in the NotebookLM report documents the measurable impact of integrating AI creative intelligence into workflows. Brands using AI-driven creative insights report:
- 25% reduction in creative production costs
- 60% reduction in time-to-approval
- 0.4 uplift in Click-Through Rate (CTR)
- 0.35 uplift in Brand Recall
These are not projected numbers — these are documented outcomes. The CTR and Brand Recall uplifts in particular are significant: a 0.4 CTR uplift on a high-volume campaign translates directly to millions of dollars in recovered performance.
For agencies, the stakes are even higher. The Forrester analyst Jay Pattisall has articulated the underlying business model shift clearly: “Marketers will no longer buy agency talent to produce concepts. They will buy algorithms that agency talent customizes to create, activate, and scale marketing.” The agency that can prove its AI creative intelligence layer outperforms the competition is no longer selling time-and-talent — it’s selling a proprietary advantage. That’s why Publicis paid for AdgeAI: not to add a feature, but to add a moat.
For in-house marketing teams and independent agencies, this matters too. You may not have access to Publicis’ Leona stack, but the category of tools AdgeAI represents — AI creative performance tracking, predictive attention modeling, real-time asset optimization — is accessible. The difference is that you need to assemble the stack yourself, and this tutorial shows you how.
The Data
Major Agency Holding Company Acquisitions in the AI/Data Era (2015–2026)
| Acquirer | Acquired Company | Year | Deal Value | Strategic Purpose |
|---|---|---|---|---|
| Publicis Groupe | Sapient | 2015 | $3.5B | Digital transformation consulting |
| dentsu | Merkle | 2016 | $1.5B | Data-driven marketing capability |
| Interpublic Group (IPG) | Acxiom | 2018 | $2.3B | First-party data infrastructure |
| Publicis Groupe | Epsilon Data Management | 2019 | $4.4B | Identity and data platform |
| Omnicom Group | Flywheel Digital | 2024 | $853M | Commerce media and retail data |
| Omnicom Group | IPG | 2024 | $13.3B | Media scale + Acxiom data + Omni OS |
| Publicis Groupe | AdgeAI | 2026 | Undisclosed | Real-time AI creative intelligence |
Source: NotebookLM Research Report
Since 2015, major holding companies have invested nearly $27 billion in acquisitions specifically to pivot from talent-centric service models to technology-driven infrastructure, according to the research report. AdgeAI is the latest data point in that trajectory — and notably, it’s the first acquisition in this wave specifically targeting the creative performance layer rather than data acquisition or media buying.
AI Creative Intelligence: Performance Impact Benchmarks
| Metric | Baseline (No AI Creative Intelligence) | With AI Creative Intelligence | Improvement |
|---|---|---|---|
| Creative production cost | Baseline | -25% | Significant reduction |
| Time to creative approval | Baseline | -60% | Major acceleration |
| Click-Through Rate | Baseline | +0.4 uplift | Measurable gain |
| Brand Recall | Baseline | +0.35 uplift | Measurable gain |
Source: NotebookLM Research Report
Step-by-Step Tutorial: Implementing AI Creative Intelligence in Your Campaign Workflow
This tutorial covers how to build and operate an AI creative intelligence workflow — the same capability set that AdgeAI delivers inside Publicis, applied using accessible tools for teams who don’t have a proprietary agency stack.
Prerequisites
Before you start, you need:
- Access to your existing creative asset library (video, static, display)
- A campaign management platform (Google Ads, Meta Ads Manager, DV360, or similar)
- An AI creative intelligence tool — options include Dragonfly AI, Entropik, Neurons, or enterprise integrations via Publicis Leona for holding company clients
- Basic familiarity with UTM tagging and campaign performance reporting
- A consistent creative taxonomy (how your assets are named and tagged)
Phase 1: Audit and Tag Your Existing Creative Library
The first step is always the same: you cannot optimize what you cannot see. Before running a single asset through an AI creative intelligence tool, you need a tagged, organized library.
Step 1.1 — Establish a naming convention. Every asset should have metadata that captures: format (video/static/carousel), aspect ratio, product/service featured, campaign objective (awareness/conversion/retention), and version number. Example: product-launch_video_9x16_conversion_v3.mp4.
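The naming convention above can be enforced mechanically rather than by review. Here is a minimal sketch of a validator/parser for the example pattern `{product}_{format}_{aspect}_{objective}_v{n}.{ext}`; the exact field set and allowed values are assumptions to adapt to your own taxonomy.

```python
import re

# Pattern for the example convention described above:
# {product}_{format}_{aspect-ratio}_{objective}_v{version}.{ext}
# The allowed value lists (formats, objectives, extensions) are illustrative.
ASSET_NAME = re.compile(
    r"^(?P<product>[a-z0-9-]+)"
    r"_(?P<format>video|static|carousel)"
    r"_(?P<aspect>\d+x\d+)"
    r"_(?P<objective>awareness|conversion|retention)"
    r"_v(?P<version>\d+)"
    r"\.(?P<ext>mp4|mov|png|jpg)$"
)

def parse_asset_name(filename: str) -> dict:
    """Return the asset's metadata fields, or raise if the name is off-convention."""
    match = ASSET_NAME.match(filename)
    if not match:
        raise ValueError(f"Off-convention asset name: {filename}")
    return match.groupdict()

meta = parse_asset_name("product-launch_video_9x16_conversion_v3.mp4")
# meta["format"] == "video", meta["aspect"] == "9x16", meta["version"] == "3"
```

Running a check like this in CI or as an upload hook turns the taxonomy from a guideline into a gate, which pays off in Phase 3 when you need to aggregate performance by format or objective.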
Step 1.2 — Tag performance tiers. Pull last 90 days of performance data from your ad platform. Segment assets into three buckets: Top 20% performers, Middle 60%, and Bottom 20%. This becomes your training baseline for understanding what “good” looks like in your specific category.
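The 20/60/20 split can be sketched as a simple ranking function; which KPI you rank on (CTR, conversion rate, ROAS) is your call, and the dict-based asset records here are an assumption standing in for your ad platform's export format.

```python
def tier_assets(assets: list[dict], kpi: str = "ctr") -> dict[str, list[dict]]:
    """Split a 90-day asset list into Top 20% / Middle 60% / Bottom 20% by one KPI."""
    ranked = sorted(assets, key=lambda a: a[kpi], reverse=True)
    n = len(ranked)
    cut = max(1, round(n * 0.20))  # size of each 20% bucket
    return {
        "top_20": ranked[:cut],
        "middle_60": ranked[cut:n - cut],
        "bottom_20": ranked[n - cut:],
    }

# Example with a mock 90-day export:
assets = [{"id": i, "ctr": i / 100} for i in range(10)]
tiers = tier_assets(assets)
# len(tiers["top_20"]) == 2, len(tiers["middle_60"]) == 6, len(tiers["bottom_20"]) == 2
```

The top and bottom buckets are the ones worth feeding into Step 1.3's visual analysis; the middle 60% adds cost without adding much contrast.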

Step 1.3 — Extract visual features. Using a tool like Dragonfly AI or Entropik, run your top and bottom performing assets through their visual analysis engine. You’re looking for the AI’s scoring on: visual attention distribution, branding prominence and placement timing, text legibility, and emotional valence. The goal is to identify which visual features correlate with your top performers.
Phase 2: Pre-Flight Creative Testing on New Assets
This is where AI creative intelligence delivers its most immediate ROI: catching underperforming creative before it goes into media spend.
Step 2.1 — Set your attention benchmarks. Based on your Phase 1 audit, establish category-specific thresholds. For example: “Our top-performing video assets show brand logo visibility within the first 2 seconds and a 65%+ attention score on the primary CTA.” These benchmarks become your pre-flight pass/fail criteria.
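A pass/fail gate like the example benchmark above can be encoded directly. This is a sketch: the input field names (`logo_first_visible_s`, `cta_attention_score`) are hypothetical and need to be mapped to whatever your AI tool actually exports.

```python
def preflight_gate(scores: dict, logo_max_s: float = 2.0,
                   cta_attention_min: float = 0.65) -> tuple[bool, list[str]]:
    """Check one video asset's AI scores against the example benchmarks above.

    Field names are placeholders; map them to your tool's export schema.
    Returns (passed, list of failure reasons).
    """
    failures = []
    if scores["logo_first_visible_s"] > logo_max_s:
        failures.append(
            f"logo first visible at {scores['logo_first_visible_s']}s "
            f"(benchmark: <= {logo_max_s}s)"
        )
    if scores["cta_attention_score"] < cta_attention_min:
        failures.append(
            f"CTA attention {scores['cta_attention_score']:.0%} "
            f"(benchmark: >= {cta_attention_min:.0%})"
        )
    return len(failures) == 0, failures
```

Returning the reasons, not just a boolean, matters: Step 2.4's revision loop depends on telling the creative team *why* an asset failed.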
Step 2.2 — Run new assets through predictive attention modeling. Upload new creative to your AI creative intelligence platform before trafficking. Tools like Entropik’s Affect Lab simulate how users will visually scan the asset, generating heat maps and attention path predictions. The Entropik team describes this as ensuring “creative should never be a gamble” — the first second of a video determines its fate, and that can be measured before launch.
Step 2.3 — Review and iterate. When an asset fails your benchmarks, you have two options: revise the creative (reposition the CTA, move the logo, adjust the color contrast) or flag it for lower-investment test placement rather than full-scale rollout. Make this a formal gate in your creative approval workflow — not an optional step.
Step 2.4 — Document the revision loop. Track every asset that failed pre-flight testing, what the AI flagged, what revision was made, and the resulting performance after launch. This feedback loop trains your creative team to internalize what the AI is measuring, progressively improving the quality of first-draft creative.
Phase 3: Real-Time Campaign Monitoring and Creative Optimization
The AdgeAI capability that Publicis is specifically acquiring is real-time creative performance tracking — the ability to surface signals about which assets are winning while a campaign is in flight, not just in post-campaign analysis.
Step 3.1 — Set up creative-level reporting. In your campaign management platform, ensure you are reporting at the creative/ad level, not just the campaign or ad set level. In Meta Ads Manager, this means using the “Ads” view with creative breakdowns. In Google Ads, this means asset-level reporting in Performance Max or responsive display campaigns.
Step 3.2 — Define your signal thresholds. Establish rules for when an underperforming asset should be paused. A practical framework:
- If an asset’s CTR falls below 50% of campaign average after 500 impressions → flag for review
- If an asset’s conversion rate falls below 60% of campaign average after 1,000 impressions → pause and replace
- If an asset’s CTR exceeds 150% of campaign average → increase budget allocation to that asset’s ad sets
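The three rules above can be expressed as a single decision function — a sketch, assuming per-asset records with `impressions`, `ctr`, and `cvr` fields from your platform export:

```python
def creative_action(asset: dict, avg_ctr: float, avg_cvr: float) -> str:
    """Map one in-flight creative to an action using the threshold rules above.

    The pause rule is checked first so a weak converter gets replaced even if
    its CTR looks healthy; that ordering is a design choice, not a given.
    """
    if asset["impressions"] >= 1_000 and asset["cvr"] < 0.60 * avg_cvr:
        return "pause_and_replace"
    if asset["impressions"] >= 500 and asset["ctr"] < 0.50 * avg_ctr:
        return "flag_for_review"
    if asset["ctr"] > 1.50 * avg_ctr:
        return "increase_budget"
    return "hold"
```

Run this over every active creative on a daily schedule and you have the skeleton of the real-time monitoring loop; the impression floors keep the rules from firing on statistical noise.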
Step 3.3 — Build a creative replacement pipeline. Real-time optimization only works if you have fresh creative ready to deploy when underperformers are paused. Maintain a “bench” of pre-flight-tested backup assets — at minimum, 3-5 tested alternatives for every active creative. This is the operational discipline that separates teams who benefit from AI creative intelligence from teams who have the tool but can’t act on its signals.
Step 3.4 — Run structured creative experiments. Don’t rely solely on organic performance signals. Set aside 15-20% of campaign budget for structured creative testing: systematic rotation of controlled variables (single element changes — headline only, CTA only, background color only) to generate clean data about which specific creative elements drive performance lift.
Step 3.5 — Generate weekly creative performance reports. At minimum weekly, pull the following data points: CTR by creative, conversion rate by creative, CPM by creative, frequency by creative, and attention/engagement metrics from your AI platform. Map the AI’s predictive scores against actual performance to continuously validate and calibrate your benchmarks.
Phase 4: Scale What Works
Step 4.1 — Identify winning creative patterns. After 4-6 weeks of structured testing and monitoring, you will have a dataset of top-performing assets with documented visual features. Extract the common elements: Does your best-performing video always show a human face in the first 3 seconds? Do your top static ads use high-contrast typography on dark backgrounds? These patterns become creative briefs.
Step 4.2 — Use AI to generate variants at scale. With clear patterns documented, use generative AI tools (Midjourney for statics, Runway or Kling for video) to produce high volumes of variants that share the identified winning features. Run every generated variant through your pre-flight AI testing before trafficking.
Step 4.3 — Feed performance data back to the creative team. Close the loop. Monthly, share the top 10 and bottom 10 performing assets with your creative team alongside the AI’s analysis of why each landed where it did. This is how you build institutional creative intelligence — not just algorithmic creative intelligence.
Expected Outcomes: Teams that complete all four phases of this workflow typically see results consistent with the documented benchmarks: 25% reduction in creative costs (fewer wasted production rounds), 60% faster approval cycles (pre-flight testing replaces lengthy internal review debates), and measurable CTR and Brand Recall uplifts within 60-90 days of consistent operation.
Real-World Use Cases
Use Case 1: CPG Brand Launching a Multi-Market Campaign
Scenario: A CPG brand (think PepsiCo or Unilever scale) is launching a product across 12 markets with 40+ creative variants in 6 languages. The creative team has 3 weeks to produce, approve, and traffic everything.
Implementation: All 40+ assets are run through pre-flight AI testing against market-specific attention benchmarks. Assets that fail in specific markets (e.g., CTA placement is obscured by text overlay in Arabic right-to-left layouts) are flagged and revised before the trafficking deadline. Real-time monitoring during launch week allows the media team to shift budget away from underperformers within 48 hours of going live.
Expected Outcome: 25-30% reduction in creative revisions post-launch, faster budget reallocation to winning assets, and consistent brand standards across markets. CPG companies including PepsiCo, Unilever, and Nestlé are among those documented as leading adopters of AI-driven creative and data strategies.
Use Case 2: Independent Performance Marketing Agency
Scenario: A 15-person performance agency manages 20 e-commerce clients, each running 5-10 active creatives per platform. Manual creative review is consuming 40% of the team’s time and still missing signals.
Implementation: The agency implements a shared AI creative intelligence platform, standardizes their creative tagging taxonomy across all clients, and builds a pre-flight approval workflow into their production process. Creative performance reports are automated and delivered to clients weekly with AI-annotated explanations of what’s working and why.
Expected Outcome: Reduction in manual creative review time, faster iteration cycles for clients, and a demonstrable performance advantage the agency can use as a differentiator when pitching new business. This directly mirrors the “Algorithm of Record” model that Forrester’s Jay Pattisall describes — selling the AI layer, not just the talent.
Use Case 3: In-House Brand Team Integrating AI Creative Pre-Flight
Scenario: A DTC brand’s in-house marketing team is producing 20-30 creative assets per month using a combination of in-house designers and freelancers. Post-production creative review is a bottleneck — assets sit in approval queues for 5-7 days.
Implementation: Dragonfly AI or Neurons is integrated as a required step in the creative workflow tool (e.g., embedded in Figma, or as a standalone upload step before assets go into the approval queue). Any asset that meets the AI’s attention and branding score thresholds is fast-tracked. Assets that fail are auto-flagged with specific revision notes.
Expected Outcome: The documented 60% reduction in time-to-approval is realistic within 60 days of consistent implementation. The approval bottleneck shifts from “does this feel right?” subjective debate to “did this pass the AI benchmarks?” objective criteria.
Use Case 4: Holding Company Agency Leveraging Publicis Leona + AdgeAI
Scenario: A Publicis network agency is running an always-on campaign for a retail client across Meta, YouTube, CTV, and programmatic display — 100+ active creatives across formats.
Implementation: The AdgeAI layer within Publicis’ Leona engine continuously tracks which creative assets are driving engagement and conversion across all channels. The system surfaces real-time recommendations: pause underperformers, increase investment behind top performers, and generate creative briefs for next-cycle production based on winning patterns. This is what Publicis CEO Arthur Sadoun calls the ability to “immediately scale creative messaging” based on knowing what works and why.
Expected Outcome: The Leona + AdgeAI stack eliminates the lag between campaign performance insight and creative action — a loop that in traditional agency workflows takes days or weeks now happens in near-real-time.
Common Pitfalls
Pitfall 1: Optimizing Creative Without a Consistent Taxonomy
If your assets aren’t named and tagged consistently, AI creative intelligence tools have no way to surface actionable patterns. The system will report that specific asset IDs perform well, but you won’t be able to extract the why. Fix: Enforce your naming convention before launching any AI tooling. It’s boring work that unlocks everything else.
Pitfall 2: Using Pre-Flight Scores as Absolute Pass/Fail Gates Too Early
In the first weeks of using a predictive attention tool, your benchmarks are assumptions, not validated thresholds. Treating AI scores as hard blockers before you’ve calibrated them against real performance data will result in blocking potentially good creative. Fix: Use AI scores as advisory flags for the first 30 days. Validate against actual performance before making them hard gates.
Pitfall 3: Confusing Creative Intelligence With Creative Strategy
AI creative intelligence tells you what is working visually. It does not tell you what to say, who to say it to, or when. Teams that hand creative strategy to the AI layer and get back a result that’s technically optimized but strategically hollow will have high CTRs on messaging that doesn’t advance the brand. Fix: Use AI creative intelligence to optimize execution, not to replace strategic thinking about message and audience.
Pitfall 4: Failing to Maintain a Creative Bench
Real-time optimization signals are worthless if you don’t have replacement creative ready to deploy. The research report documents that agentic AI systems in media activation are designed to execute quickly — but they need assets to activate. Fix: Always maintain a pipeline of 3-5 pre-tested backup creatives for every active placement.
Pitfall 5: Not Closing the Loop Back to the Creative Team
AI creative intelligence generates data that should train human creative judgment over time. Teams that silo AI insights in the media team and never share them with creatives miss the compound value. Fix: Monthly, share top and bottom performer analysis with your creative team. The documented 25% reduction in creative production costs comes in part from fewer wasted production rounds — and that only happens when creatives internalize what the AI is measuring.
Expert Tips
Tip 1: Run Pre-Flight Testing on Competitive Creative First
Before you calibrate your own benchmarks, use AI creative intelligence tools on your top competitors’ visible creative (ads visible in ad libraries). Understanding what the AI scores highly in your category sets a competitive baseline that purely internal testing can miss.
Tip 2: Weight Attention Metrics Differently by Objective
Awareness campaigns should be scored primarily on brand visibility and recall signals. Conversion campaigns should weight CTA prominence and visual path-to-action more heavily. Don’t apply a single scoring template across all objectives — the AI tools that allow custom weighting will generate more useful scores.
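One way to implement objective-specific weighting is a simple template lookup. Everything here — the metric names and the weight values — is illustrative, to be tuned against your own validated benchmarks (and only useful if your tool allows custom weighting at all):

```python
# Illustrative weight templates; metric names and numbers are assumptions.
OBJECTIVE_WEIGHTS = {
    "awareness":  {"brand_visibility": 0.50, "brand_recall": 0.35, "cta_prominence": 0.15},
    "conversion": {"brand_visibility": 0.15, "brand_recall": 0.15, "cta_prominence": 0.70},
}

def weighted_creative_score(metric_scores: dict, objective: str) -> float:
    """Combine a tool's per-metric scores (0-1 scale) into one number per objective."""
    weights = OBJECTIVE_WEIGHTS[objective]
    return sum(metric_scores[metric] * w for metric, w in weights.items())

scores = {"brand_visibility": 0.9, "brand_recall": 0.5, "cta_prominence": 0.8}
# Same asset, different verdicts depending on what the campaign is for:
awareness_score = weighted_creative_score(scores, "awareness")    # 0.745
conversion_score = weighted_creative_score(scores, "conversion")  # 0.77
```

The point of the example: the same raw scores produce different rankings under different objectives, which is exactly why a single scoring template across all campaigns misleads.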
Tip 3: Use First-Party Data to Segment Creative Performance
Post-cookie, CPG leaders like PepsiCo, Unilever, and Nestlé are building first-party data infrastructure specifically to segment creative performance by audience. The same ad can perform very differently for a loyalty-program customer versus a cold prospect. Connecting your creative performance data to your first-party audience segments is a significant analytical advantage.
Tip 4: Apply the “First Second” Rule to All Video Creative
The Entropik research is explicit: “The difference between a winning campaign and wasted spend often comes down to what happens in the first second.” When briefing video creative, specify exactly what must happen in seconds 0-2: brand mark visible, key visual established, emotional tone set. Make this a brief requirement, not a post-production note.
Tip 5: Treat Creative Intelligence as Infrastructure, Not a Project
The holding companies spending billions on acquisitions like AdgeAI are not treating creative intelligence as a campaign feature — they’re making it infrastructure. Since 2015, the Big Three have invested nearly $27 billion in acquisitions to build this infrastructure stack. For independent teams, the equivalent is: make AI creative intelligence a permanent, always-on part of your workflow, not something you activate for big campaigns and ignore otherwise.
FAQ
Q1: What exactly does AdgeAI do differently from standard ad platform analytics?
Standard ad platform analytics (Meta, Google, DV360) report on creative performance after the fact — you see which ad had the best CTR, but the platform doesn’t tell you why at the visual element level, and it doesn’t predict performance before launch. AdgeAI, as reported by Marketing Dive, specifically tracks which creative and video assets are driving engagement and conversion and optimizes campaigns in real time — a capability that operates at the intersection of predictive intelligence and live campaign management.
Q2: Is AI creative intelligence only viable for large brands and holding company agencies?
No. The capabilities that Publicis is acquiring in AdgeAI have been available in standalone tools (Dragonfly AI, Entropik, Neurons, VidMob) for several years. What the AdgeAI acquisition does is make this capability native to Publicis’ stack — tightly integrated with Leona and their media activation infrastructure. Independent agencies and in-house teams can build equivalent workflows using the standalone tools; the difference is integration depth and scale, not accessibility.
Q3: How do I justify the cost of an AI creative intelligence tool to stakeholders?
Use the documented benchmarks as your business case anchor: 25% reduction in creative production costs, 60% reduction in time-to-approval, 0.4 CTR uplift, and 0.35 Brand Recall uplift. For a brand spending $500K/year on creative production and $2M/year on media, a 25% creative cost reduction is $125K in savings. A 0.4 CTR uplift on $2M in media spend, depending on your CPC, can represent hundreds of thousands in recovered performance value. Map these to your specific numbers.
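As a sanity check, the arithmetic above can be reproduced directly. The CPM figure is an assumption, as is reading the documented "0.4 uplift" as +0.4 percentage points of CTR — replace both with your own numbers and however your platform actually reports uplift:

```python
creative_budget = 500_000   # $/year on creative production
media_budget = 2_000_000    # $/year on media
assumed_cpm = 10.0          # $ per 1,000 impressions (assumption)
ctr_uplift = 0.004          # "0.4 uplift" read as +0.4 percentage points (assumption)

creative_savings = creative_budget * 0.25             # documented 25% reduction
impressions = media_budget / assumed_cpm * 1_000      # impressions the media buys
extra_clicks = impressions * ctr_uplift               # clicks recovered by the uplift

print(f"Creative savings: ${creative_savings:,.0f}")  # Creative savings: $125,000
print(f"Extra clicks:     {extra_clicks:,.0f}")       # Extra clicks:     800,000
```

Under these assumptions the uplift alone is worth 800,000 additional clicks a year; whatever your blended value per click, that is the number to put in front of stakeholders.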
Q4: How does the post-cookie era affect AI creative intelligence?
The shift away from third-party cookies primarily affects audience targeting, not creative analysis. AI creative intelligence tools analyze the creative asset itself — visual features, attention patterns, emotional response — independently of audience data. However, the most sophisticated implementations connect creative performance data with first-party audience segments. Brands like PepsiCo and Unilever are building these connections through loyalty programs and zero-party data collection as a direct response to the cookie phase-out.
Q5: What’s the relationship between Publicis’ Leona platform and AdgeAI?
Leona is Publicis Groupe’s overarching AI engine for “Intelligent Creativity” — it powers content creation, activation, and measurement across paid, earned, shared, and owned channels. The research report documents that Publicis appointed a CEO of Global Production specifically to oversee the Leona-powered intelligent creativity strategy. AdgeAI slots into Leona as the creative performance intelligence layer — the component that tracks which assets are working and optimizes in real time. Think of Leona as the operating system and AdgeAI as a critical new module within it.
Bottom Line
Publicis acquiring AdgeAI is a signal, not just a transaction. It confirms that the leading agency holding companies have completed the acquisition of data infrastructure (Epsilon, Acxiom) and media scale (the Omnicom-IPG merger), and are now competing on creative intelligence — the ability to know in real time what’s working and why. Arthur Sadoun’s framing is precise: the AI era doesn’t need more content, it needs better answers about what to scale. For practitioners not operating inside Publicis’ stack, the actionable takeaway is immediate: implement pre-flight AI creative testing, build a real-time monitoring workflow, and close the feedback loop back to your creative team. The tools exist, the benchmarks are documented, and the teams that build this workflow now will compound that advantage over the next 18-24 months as the gap between AI-native and AI-optional marketing operations continues to widen.