Business Owners: Look for the "What This Means for Your Business" boxes at the top of each section — get the full picture in 10 minutes.
Professionals (Media Buyers / Creative Strategists): Read everything — this is your competitive edge.
Everything in this part is based on Meta's public engineering blog posts and published research papers (Dec 2024 through Nov 2025). Meta's disclosures emphasize infrastructure and results metrics. They do not detail exactly how specific creative elements (hook type, color palette, text style) map to internal embeddings — that mapping is proprietary. The connection between creative content and system behavior is strongly implied by the architecture (creative-first retrieval, unified embeddings), but we frame it honestly throughout. When we say "your creative determines X," we mean: the architecture is designed so that creative content is a primary input signal, based on Meta's published descriptions.
Most creative strategy content tells you WHAT to do — "use strong hooks," "test different formats," "make it mobile-first." This part tells you WHY those things work at the system level. Once you understand HOW Meta's AI reads your creative, every tactical decision in Parts 3-11 will make deeper sense. You'll know not just that a lifestyle visual outperforms a product-only shot, but why it does — because it provides richer signals for Andromeda's retrieval and positions your ad more precisely in Lattice's embedding space.
Andromeda — Who Gets to See Your Ad
Meta's retrieval engine — the first gate your creative passes through before anything else happens.
Andromeda is the first filter. Based on Meta's public engineering disclosures, it evaluates your creative content and decides which of Meta's billions of users are even eligible to see it. Better creative = bigger eligible audience = more opportunities to find buyers. If Andromeda doesn't think your ad is relevant, no amount of budget or targeting will save it — those users never enter the funnel.
The Bouncer Analogy
Think of Andromeda as the bouncer at an exclusive venue with 200 VIP rooms. Your ad creative is the invitation. The bouncer reads it — not your name on a guest list, not who sent you, not which ticket package you bought — and decides which VIP rooms (user segments) you're allowed to enter.
A beautifully designed invitation with clear messaging and premium aesthetics? The bouncer opens the door to 15 rooms. A generic, unclear invitation? Maybe 2 rooms, and not the good ones.
Before Andromeda, the old system started with a question: "Who did the advertiser say they want to reach?" The advertiser's targeting settings — age range, interests, location — were the starting point. The system filtered users by those parameters first, and THEN tried to figure out if the ad was relevant.
Andromeda flips this entirely. According to Meta's engineering blog (December 2024), the system now starts with a fundamentally different question: "What is this ad about, and who would actually want it?"
This is the shift from audience-first to creative-first matching. It is, arguably, the most important architectural change in digital advertising in the last five years. And almost nobody in the advertising industry talks about it with the specificity it deserves.
The Technical Reality
Based on Meta's published research, here is what Andromeda actually does under the hood:
- 10,000x model capacity increase — Andromeda uses a custom deep neural network that achieves a 10,000x increase in model capacity compared to its predecessor, while maintaining sublinear inference cost. More capacity means it understands your creative in far greater depth.
- Hierarchical indexing strategy — The system uses jointly trained index and retrieval models. Think of it as a massive, multi-layered filing system where your ad gets placed based on what it contains, not just what category you selected.
- 100x improvement in feature extraction latency — Through dynamic feature reconstruction, Andromeda processes your creative content in real-time. The visual content of your ad, the text overlay, the format, the composition — all of it gets analyzed in milliseconds.
- Runs on NVIDIA Grace Hopper Superchip — The hardware itself is purpose-built for this kind of massive-scale neural retrieval.
The published results: +6% recall (finds more relevant users) and +8% ads quality improvement across the platform. Global rollout was completed October 2025 — Andromeda is now the default retrieval system for every ad on Meta.
To put the 10,000x capacity increase in perspective: the previous retrieval system could understand your ad at a surface level — "this is a fashion ad" or "this is a food ad." With 10,000x more capacity, based on Meta's published descriptions, the system can understand nuance: "this is a premium minimalist fashion ad featuring handcrafted accessories aimed at urban professionals who value understated luxury." That level of granularity is what enables precise creative-first matching.
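To make the retrieval step concrete, here is a minimal sketch of embedding-based candidate retrieval — the general technique Meta's disclosures describe, not its actual implementation. Every number, dimension, and vector below is invented for illustration; Meta's real embeddings, index structure, and scale are proprietary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration only: 8 dimensions stand in for the creative
# signals Andromeda extracts (visual style, text overlay, format, composition).
user_embeddings = rng.normal(size=(1_000_000, 8))   # one row per user
ad_embedding = rng.normal(size=8)                    # derived from the creative

def top_k_candidates(ad_vec, users, k=10_000):
    """Retrieve the k users whose embeddings are most similar to the ad's."""
    # Cosine similarity: how closely each user's signal profile matches the ad.
    users_norm = users / np.linalg.norm(users, axis=1, keepdims=True)
    ad_norm = ad_vec / np.linalg.norm(ad_vec)
    scores = users_norm @ ad_norm
    return np.argsort(-scores)[:k]                   # indices of eligible users

eligible = top_k_candidates(ad_embedding, user_embeddings)
print(len(eligible))  # this pool is what the later ranking stages receive
```

The practical takeaway: the richer and clearer the creative signals feeding the ad's representation, the more precisely that retrieved pool matches real buyers.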
Why Creative-First Changes Everything
In the old world, if you targeted "Women, 25-34, interested in fashion," the system started with that pool and then tried to match your ad to them. Your creative was secondary.
In the Andromeda world, the system looks at your ad creative and determines: "This shows a woman in her late 20s, in an urban setting, wearing premium accessories, with Arabic text overlay and lifestyle imagery. The visual signals suggest premium fashion, young professional, metropolitan buyer."
Then it retrieves users who match that signal profile — regardless of what you typed into the targeting box.
This is why, in 2026, a media buyer who puts all their effort into audience targeting settings but uses generic creative is fundamentally misunderstanding how the system works. The creative IS the targeting input that matters most to the retrieval layer.
A Concrete Example
Consider two ads for the same leather handbag brand, with identical targeting settings (Women, 25-40, UAE, interest in fashion):
Ad A: Product photo on white background, brand logo, "Shop Now" text. Generic, standard e-commerce creative.
Ad B: A woman in a Dubai Marina cafe, the handbag on the table next to an espresso, warm afternoon light, Arabic calligraphy brand tag visible, close-up showing leather grain texture.
In the old system, both ads would be shown to the same pool (Women, 25-40, UAE, fashion interest). In the Andromeda system, Ad B's rich visual signals — lifestyle context, premium setting, specific demographic cues, craftsmanship details — would retrieve a fundamentally different (and likely much more qualified) pool of users than Ad A's generic product shot.
Same product. Same targeting settings. Same budget. Completely different audiences reached. The creative made the difference at the retrieval layer — before ranking or bidding even began.
The Real-Time Processing Advantage
The 100x improvement in feature extraction latency means this is not a batch process that happens once when you upload an ad. According to Meta's engineering descriptions, dynamic feature reconstruction enables real-time processing. As your creative accumulates engagement data, the system's understanding of your creative deepens. The embedding representation of your ad becomes more refined over time, which can shift the retrieval pool — sometimes significantly — in the first hours after launch.
This is why some advertisers notice that performance changes in the first 24 hours even without making any changes. The system is learning your creative in real-time and refining which users it retrieves.
If Andromeda evaluates creative first, then two advertisers with identical targeting settings but different creative will have their ads shown to completely different pools of users. Your creative doesn't just influence performance — it determines who is even eligible to see your ad.
Lattice — Matching Ads to Humans
The unified ranking model that replaced hundreds of siloed systems with one massive intelligence.
Lattice understands the relationship between what your ad shows and what each person actually does. It doesn't just know someone "likes fashion" — according to Meta's published research, it models behavioral patterns across hundreds of billions of examples. It knows they buy premium leather goods on Thursday evenings after browsing 3+ product pages. Your creative content determines WHERE in Lattice's embedding space your ad sits, which determines WHICH users it gets matched to.
One Brain to Rule Them All
Before Lattice, Meta ran hundreds of separate prediction models — one for click probability, another for conversion probability, another for video watch time, and so on. Each model had its own understanding of users and ads. They didn't share knowledge.
According to Meta's AI blog and the accompanying arXiv paper (December 2025), Lattice replaced all of this with a single unified architecture: shared trunk + task-specific heads.
In plain language: one massive brain understands everything (the shared trunk), and then specialized skill layers handle each specific prediction (the heads). The brain has trillions of parameters — more than most large language models — and was trained on hundreds of billions of examples across thousands of data domains.
This is not a minor upgrade. This is a fundamental rearchitecting of how Meta understands the relationship between ads and people.
Why does the shared trunk matter? Because a model that understands clicking behavior can now share that understanding with the model that predicts purchases. A model that learns about Instagram Stories engagement can apply those patterns to Facebook Reels predictions. Nothing is siloed anymore. Every behavioral signal across every surface feeds into one unified understanding.
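As a rough sketch of what "shared trunk + task-specific heads" means structurally, here is a toy multi-task model in Python (PyTorch). The layer sizes, head names, and input features are invented; Lattice's real architecture spans trillions of parameters and is not public.

```python
import torch
import torch.nn as nn

class SharedTrunkModel(nn.Module):
    """Toy illustration of a shared trunk feeding several prediction heads."""

    def __init__(self, n_features=64, trunk_dim=32):
        super().__init__()
        # One trunk learns a shared representation of the (user, ad) pair.
        self.trunk = nn.Sequential(
            nn.Linear(n_features, trunk_dim), nn.ReLU(),
            nn.Linear(trunk_dim, trunk_dim), nn.ReLU(),
        )
        # Separate heads reuse that representation for each prediction task,
        # so what the trunk learns about clicks also informs purchase predictions.
        self.heads = nn.ModuleDict({
            "click": nn.Linear(trunk_dim, 1),
            "purchase": nn.Linear(trunk_dim, 1),
            "video_watch": nn.Linear(trunk_dim, 1),
        })

    def forward(self, x):
        shared = self.trunk(x)
        return {name: torch.sigmoid(head(shared)) for name, head in self.heads.items()}

model = SharedTrunkModel()
predictions = model(torch.randn(4, 64))  # 4 hypothetical (user, ad) pairs
print({name: p.shape for name, p in predictions.items()})
```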
The Embedding Space — Your Ad's Address in the Universe
Here is the concept that should change how you think about creative forever:
Lattice creates unified embeddings — users AND ads are mapped into the same vector space. Imagine a massive, multi-dimensional map. Every Meta user is a dot on this map. Every ad is also a dot on this map. The position of each dot is determined by hundreds of behavioral and content signals.
Your ad's position on this map is determined primarily by your creative content. The visual style, the copy, the format, the hook, the overall composition — all of these signals place your ad at a specific coordinate in the embedding space.
Users who are close to your ad's position in this space are the ones most likely to convert. Users who are far away won't see it at all.
Here is the key insight: you can't directly move your ad's dot on this map by changing targeting settings in Ads Manager. The dot's position is determined by the creative content itself. If you want your ad to be near high-value buyers, you have to MAKE creative that signals high-value positioning. If your creative looks cheap, it will be placed near users who respond to cheap-looking content. If your creative looks premium, it will be placed near users who respond to premium content.
This is not a metaphor. This is a literal description of how unified embedding spaces work, based on well-established machine learning principles and confirmed by Meta's published architecture.
Meta's published papers describe the unified embedding architecture and confirm that ad content is a primary input to the embedding model. They do not publish the exact mapping of "blue background = this position" or "urgency copy = that vector." The specific creative-to-embedding mapping is proprietary. What we know is the architecture is designed to use creative content as a primary input, and the results confirm it works: up to 6% more conversions. The practical implication — that creative content directly influences ad placement in the ranking system — is well-supported by the published architecture.
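Here is a deliberately tiny illustration of the "one map" idea, assuming made-up 2-D coordinates and two invented user clusters. Real embedding spaces are high-dimensional and the actual creative-to-coordinate mapping is proprietary — the point is only that distance in the space translates into match confidence.

```python
import numpy as np

# Purely illustrative 2-D "map". Two user clusters and two ads for the same handbag.
premium_buyers = np.array([[0.9, 0.8], [0.85, 0.9], [0.95, 0.75]])
bargain_hunters = np.array([[-0.8, -0.9], [-0.9, -0.85], [-0.75, -0.95]])

ad_lifestyle = np.array([0.88, 0.82])   # rich premium creative signals
ad_white_bg = np.array([0.1, -0.05])    # generic creative, weak signals

def mean_distance(ad, users):
    """Average distance from the ad's position to a cluster of users."""
    return float(np.linalg.norm(users - ad, axis=1).mean())

for name, ad in [("lifestyle creative", ad_lifestyle),
                 ("white-background creative", ad_white_bg)]:
    print(name,
          "-> distance to premium buyers:", round(mean_distance(ad, premium_buyers), 2),
          "| to bargain hunters:", round(mean_distance(ad, bargain_hunters), 2))

# The lifestyle creative sits close to the premium cluster; the generic
# creative floats between clusters, so neither group is a confident match.
```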
Temporal Awareness — The Full Journey
Most advertising systems optimize for immediate signals: who clicked? Lattice goes further. According to Meta's published research, it captures both immediate signals (clicks, taps, swipes) AND delayed conversions (purchase patterns that happen days after first exposure).
This temporal awareness means Lattice can predict: "This person who saw your premium handbag ad on Tuesday will browse your website Thursday and purchase Saturday." It models the complete conversion journey, not just the first click.
For your creative, this means the system can identify which ad formats and styles drive high-value delayed purchases vs. which ones drive impulsive clicks that don't convert. An ad that gets fewer clicks but more 7-day purchases will be ranked higher by Lattice than one that gets lots of clicks but no sales.
This has a profound implication for creative strategy: optimizing for clickbait hooks that get lots of clicks but low conversions will actually hurt your ad's ranking in Lattice. The system is smart enough to see through surface-level engagement and optimize for the actions that actually matter — purchases, sign-ups, high-value conversions.
Creative that drives genuine purchase intent — even if it gets fewer clicks — will be ranked higher than creative that drives curiosity clicks from people who never buy. Lattice rewards honesty and clarity in creative because honest creative attracts users whose downstream behavior (conversion) matches the engagement signal (click).
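A quick back-of-the-envelope example of why this matters, using invented rates: if the optimization objective is purchases rather than clicks, an ad with a lower click-through rate can still come out far ahead.

```python
# Toy numbers (invented) showing why a click-heavy ad can rank below a
# conversion-heavy ad once delayed purchases are part of the objective.
ads = {
    "clickbait hook":    {"p_click": 0.040, "p_purchase_7d": 0.002},
    "honest value hook": {"p_click": 0.015, "p_purchase_7d": 0.010},
}

for name, p in ads.items():
    clicks_per_1k = 1000 * p["p_click"]
    purchases_per_1k = 1000 * p["p_purchase_7d"]
    print(f"{name}: {clicks_per_1k:.0f} clicks, "
          f"{purchases_per_1k:.0f} purchases per 1,000 impressions")

# If the objective is purchases, the "honest value hook" wins despite
# generating far fewer clicks.
```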
Multi-Domain Learning
Lattice learns across Instagram Feed, Instagram Stories, Instagram Reels, Facebook Feed, Facebook Reels, and every other placement simultaneously. What it learns about user behavior on Instagram Stories feeds into its predictions for Facebook Reels automatically.
The published results of this unified approach: roughly 8% improvement in ads quality platform-wide (around 12% in some domains) and up to 6% more conversions. These are averages — individual results vary based on (you guessed it) creative quality.
Lattice uses Pareto optimization across thousands of domains and dozens of objectives simultaneously, meaning it doesn't just optimize for one metric — it finds the best possible tradeoff across all the things that matter.
What This Means for Creative Testing
The unified nature of Lattice has a direct implication for how you should think about creative testing. In the old world of siloed models, a creative that worked on Instagram Stories might fail on Facebook Feed — because different models evaluated it differently. You had to test everywhere separately.
With Lattice's shared trunk architecture, a creative that demonstrates strong conversion signals on one surface is understood by the same model that evaluates all other surfaces. This doesn't mean results will be identical across placements — the task-specific heads still account for platform differences — but it means the underlying understanding of your creative's quality transfers.
Practically, this means you can be more efficient with testing. A creative concept that proves itself on one high-volume surface gives the model enough data to make informed predictions about other surfaces, reducing the total testing budget needed. This is a significant operational advantage — you can test on your highest-volume placement and let Lattice extrapolate performance predictions to other surfaces.
We will explore the practical implications of this for testing strategy in Part 8 (Testing & Scaling System).
GEM — How Much Meta Invests in YOUR Ad
The Generative Ads Model — a foundation model that decides your ad's exploration budget.
GEM decides how aggressively Meta looks for new buyers for your ad. Based on Meta's November 2025 engineering disclosures, strong creative triggers GEM to explore wider — meaning you discover customer segments you never knew existed. Weak creative causes GEM to play it safe — you stay stuck with the same small, familiar audience. Creative quality literally determines your potential reach.
The Exploration Problem
Every ad delivery system faces a fundamental tradeoff: exploitation vs. exploration.
Exploitation means showing your ad to people who are proven converters — safe, predictable, but limited in scale. Exploration means trying your ad on new user segments who MIGHT convert — risky, but potentially unlocking massive new audiences.
A cautious system exploits. A bold system explores. The optimal strategy is somewhere in between — and the right balance depends on how confident the system is in the ad's quality.
GEM, according to Meta's engineering blog (November 2025), is designed to make this decision intelligently — and your creative quality is a primary factor in how bold it gets. Think of it as a venture capitalist. GEM has a budget of "exploration impressions" to invest. It wants to invest those impressions in ads that are likely to find new converting audiences. The stronger your creative's initial signals, the more GEM is willing to invest in exploring new territory for you.
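To show the shape of that decision, here is a hypothetical exploration-budget rule in Python. The function, thresholds, and rates are all assumptions for illustration — Meta has not published how GEM actually sets exploration — but it captures the tradeoff described above: stronger early signals justify a wider search.

```python
# Hypothetical decision rule (not Meta's actual logic) showing how early
# creative performance could scale an exploration budget.
def exploration_budget(early_conversion_rate, baseline_rate=0.01,
                       min_share=0.05, max_share=0.40):
    """Fraction of impressions spent trying unproven user segments."""
    confidence = early_conversion_rate / baseline_rate   # >1 means above average
    share = min_share + (max_share - min_share) * min(confidence, 2.0) / 2.0
    return round(share, 2)

print(exploration_budget(0.020))  # strong creative -> 0.4 (explore widely)
print(exploration_budget(0.003))  # weak creative   -> ~0.1 (play it safe)
```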
Built on LLM-Scale Architecture
GEM stands for Generative Ads Model. It's a foundation model built on the same large language model (LLM) architecture that powers tools like ChatGPT — but specialized entirely for ad recommendation. According to Meta's published disclosures, it processes:
- Advertiser goals — what outcome you're optimizing for
- Creative formats — video, carousel, image, the composition of each
- Measurement signals — real-time performance data from the ad's history
- User behaviors — engagement patterns across all of Meta's surfaces
The result: GEM is 4x more efficient than previous ranking models, with 2x the effectiveness of standard knowledge distillation techniques. Published results: +5% conversions on Instagram, +3% on Facebook Feed.
Cross-Surface Learning
One of GEM's most powerful capabilities, based on Meta's published descriptions, is cross-surface learning. A creative pattern that drives purchases on Instagram Stories gets automatically applied to predictions for Facebook Reels. You don't have to test on every placement separately — GEM transfers what it learns.
This is significant because it means your best creative insights compound across the entire Meta ecosystem. A winning hook style on Reels doesn't just help Reels — it informs the model's predictions for every surface.
Before GEM, if your ad performed well on Instagram Stories, that didn't help your Facebook Feed performance at all. The models were separate. Now, a creative pattern that drives a 3x ROAS on Instagram Stories might trigger GEM to test that same pattern's potential on Facebook Reels, Explore, and Feed — automatically, without any action from you.
The Virtuous Cycle (and the Death Spiral)
Here is where GEM's impact becomes exponential — in both directions:
- You publish strong creative
- GEM identifies strong initial engagement signals
- GEM expands its exploration budget — tries your ad on new user segments
- Some of those new segments convert at high rates
- That conversion data feeds back into GEM
- GEM now has MORE evidence to explore even wider
- Your reach and conversions grow, potentially exponentially
The inverse is equally true. Weak creative → poor initial signals → GEM restricts exploration → you're stuck in a small pool → performance stagnates → GEM restricts further. This is the death spiral that many advertisers experience without understanding why.
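A toy simulation (invented numbers, not Meta's logic) makes the compounding visible: a small difference in conversion rate, fed through a simple reach-expansion rule, produces wildly different trajectories within a few rounds.

```python
# Toy simulation of the feedback loop described above.
def simulate(conversion_rate, rounds=6, reach=10_000):
    total_conversions = 0
    for _ in range(rounds):
        total_conversions += reach * conversion_rate
        # More conversions -> more evidence -> wider exploration next round.
        # Few conversions -> the system pulls back instead.
        reach = int(reach * (1.3 if conversion_rate >= 0.01 else 0.7))
    return int(total_conversions), reach

print(simulate(conversion_rate=0.02))   # strong creative: reach compounds upward
print(simulate(conversion_rate=0.003))  # weak creative: reach shrinks every round
```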
How many times have you heard an advertiser say "the algorithm isn't working for me" or "Meta just doesn't deliver my ads"? In most cases, the algorithm is working exactly as designed. It received weak creative signals, determined that exploration was not justified, and restricted delivery accordingly. The problem was never the algorithm — it was the creative input.
Understanding this reframes the conversation entirely. Instead of asking "why isn't Meta spending my budget?" the right question becomes: "what signals is my creative sending that make GEM reluctant to invest in exploration?" The answer is almost always one of these:
- The creative doesn't clearly communicate who it's for (weak Andromeda signals)
- Initial engagement doesn't translate to conversions (confusing Lattice rankings)
- The creative looks like content that historically underperforms (GEM's learned caution)
- Conflicting creative elements create ambiguous signal profiles across all three systems
You've probably seen it: an ad performs moderately for 2-3 days, then suddenly explodes in reach and conversions. This is likely GEM's exploration kicking in. The initial data was strong enough to trigger expanded exploration, which found high-converting segments, which generated more data, which triggered more exploration. The creative quality was the spark that started the chain reaction.
The Complete Signal Chain
How all three systems work together — and why creative is the single input that matters most.
Your creative goes through three AI systems. Each one uses what your ad shows to make decisions. The better your creative communicates who it's for, the better all three systems perform. You're not making ads — you're sending signals to three interconnected AI systems that collectively decide the fate of every dollar you spend.
The Three-System Architecture
Let's bring it all together. Based on Meta's published engineering disclosures, here is the complete path your ad takes from upload to delivery:
- Creative ingestion — visual, text, audio, and format signals are extracted from your ad
- Andromeda (retrieval) — 10,000x model capacity performs creative-first matching to decide who is eligible
- Lattice (ranking) — trillions of parameters and unified embeddings predict who will convert
- GEM (exploration) — a foundation model with cross-surface learning decides how far to reach
- Feedback loop — real-time learning and compounding signals refine every stage
What makes this three-system chain so powerful — and so different from anything that existed before 2025 — is that each system builds on the previous one's output. Andromeda doesn't just pass a list of users to Lattice. It passes a context-enriched list that includes the creative signals it extracted. Lattice doesn't just pass rankings to GEM. It passes conversion probability distributions that GEM uses to calibrate its exploration budget. The information compounds at every stage.
The initial data from the first few hundred impressions sets the trajectory for the entire campaign. If those first impressions generate strong engagement and early conversion signals, the feedback loop accelerates in a positive direction. If they generate poor signals, the system restricts delivery early and it becomes very difficult to recover — even with budget increases. This is why launching with your strongest creative first is not just good practice — it's a system-level requirement for the three-stage architecture to work in your favor.
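As a schematic, the chain looks something like the skeleton below. Every function body is a placeholder — the real systems are proprietary and operate at a vastly larger scale — but the structure shows the key property: each stage consumes the previous stage's output.

```python
# Skeleton of the three-stage chain described in this part. All bodies are
# stand-ins; nothing here reflects Meta's actual implementation.
def andromeda_retrieve(creative_signals):
    """Stage 1: use the creative's signals to pick an eligible candidate pool."""
    return [f"user_{i}" for i in range(5)]            # placeholder pool

def lattice_rank(creative_signals, candidates):
    """Stage 2: score each candidate's conversion probability for this creative."""
    return {user: 0.01 for user in candidates}        # placeholder scores

def gem_explore(conversion_scores):
    """Stage 3: decide how much budget goes to unproven segments."""
    avg = sum(conversion_scores.values()) / len(conversion_scores)
    return 0.4 if avg >= 0.01 else 0.1                # placeholder rule

def deliver(creative_signals):
    candidates = andromeda_retrieve(creative_signals)     # context-enriched pool
    scores = lattice_rank(creative_signals, candidates)   # probability estimates
    exploration_share = gem_explore(scores)               # calibrated by stage 2
    return candidates, scores, exploration_share

print(deliver({"visual": "premium lifestyle", "hook": "craftsmanship close-up"}))
```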
Every Creative Element Sends Signals to ALL Three Systems
This is the critical insight: your hook doesn't just affect one system. Your visual style doesn't just influence Andromeda. Every element of your creative sends signals to all three systems simultaneously.
- Your hook → tells Andromeda who to retrieve, tells Lattice where to place the ad in embedding space, tells GEM whether initial engagement justifies exploration
- Your visual style → signals premium vs. budget positioning to all three systems, influencing user pool quality at every stage
- Your copy → qualifies intent level, affecting both retrieval and ranking predictions
- Your CTA → signals expected user action, feeding into Lattice's conversion probability and GEM's optimization objective
This is why creative strategy is THE most important skill in 2026. You're not just making an ad. You're programming three AI systems simultaneously.
What Happens When You Get It Right (vs. Wrong)
When your creative sends strong, aligned signals through all three systems, the result is a compounding effect:
- Andromeda retrieves a large, highly relevant candidate pool (because your creative clearly communicates who it's for)
- Lattice ranks those candidates with high confidence (because the embedding position is clear and well-defined, not ambiguous)
- GEM allocates a generous exploration budget (because early conversion signals are strong, justifying wider search)
- The feedback loop accelerates: more data, better predictions, wider exploration, more conversions
When your creative sends weak or conflicting signals, the opposite happens:
- Andromeda retrieves a small or poorly-matched candidate pool (because it can't determine who the ad is for)
- Lattice struggles to rank candidates (because the embedding position is ambiguous — the ad sits in a "no man's land" between user clusters)
- GEM restricts exploration (because early signals are inconsistent — some clicks but no conversions)
- The feedback loop stalls: limited data, uncertain predictions, restricted exploration, stagnant performance
This is the difference between an ad that reaches 500,000 people and generates 200 conversions at a profitable CPA, and an ad that reaches 15,000 people and generates 3 conversions at an unsustainable cost. Same product. Same budget. Same targeting settings. Different creative signals.
TikTok: Creative Matters Even MORE
On TikTok, there is zero social context. Creative is literally the only signal.
TikTok's algorithm doesn't know your friends or your page likes — it only has your content behavior. This means creative is not just the primary targeting signal; it is the only targeting signal. If your creative doesn't clearly communicate what it is within the first 1-2 seconds, TikTok has no backup data to fall back on. Your ad just fails silently.
Meta vs. TikTok: A Critical Difference
Meta at least has a social graph. It knows who your friends are. It knows which pages you liked. It knows which groups you joined. Even before Andromeda's creative-first approach, Meta had supplementary data signals to work with.
TikTok has none of that. Zero. When a new user joins TikTok, the algorithm knows nothing about them except what content they engage with from minute one. There's no pre-existing social profile to lean on. No friend recommendations. No page history. Just raw content interaction data.
TikTok's For You algorithm is 100% content-graph based. It watches what content you engage with — what you watch, what you skip, what you rewatch, what you share — and matches NEW content (including ads) based on content similarity. There is no "friend of a friend liked this" signal. There is no "you follow this page, so you probably like this" inference.
It is purely: "This ad's content signals look like content this user has engaged with before."
This makes TikTok's advertising system both simpler and more demanding than Meta's. Simpler because there's one dominant signal (content). More demanding because if that one signal is weak, there's nothing else to compensate.
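A minimal sketch of that idea, assuming invented content embeddings: the ad's content vector is compared against what the user has actually engaged with, and the ad that offers nothing recognizable to match scores poorly.

```python
import numpy as np

# Illustrative only: TikTok's actual features and matching logic are not public.
user_watched = np.array([       # embeddings of clips this user watched fully
    [0.9, 0.1, 0.0],            # e.g. fitness-style content
    [0.8, 0.2, 0.1],
])
user_profile = user_watched.mean(axis=0)

ad_fitness_native = np.array([0.8, 0.2, 0.0])   # looks like content they watch
ad_logo_reveal = np.array([0.0, 0.2, 0.9])      # brand-graphic signals, unlike
                                                # anything this user engages with

def cosine(a, b):
    """Similarity between an ad's content signals and the user's history."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("fitness-native ad:", round(cosine(ad_fitness_native, user_profile), 2))
print("slow logo reveal:", round(cosine(ad_logo_reveal, user_profile), 2))
```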
What This Means for Ad Creative on TikTok
On Meta, a mediocre creative can sometimes survive because the social graph and historical behavioral data provide enough signal for the system to find SOME relevant users.
On TikTok, there is no safety net. If your creative doesn't clearly and immediately communicate:
- What category this is (fashion? tech? food? fitness?)
- Who this is for (young professional? parent? athlete?)
- What the value proposition is (discount? quality? novelty?)
...then the algorithm has nothing to work with. Your ad gets shown to essentially random users, generates poor engagement signals, and gets suppressed within hours. There is no recovery mechanism. You can't fix it by increasing budget or adjusting targeting. The creative failed to communicate, and on TikTok, that failure is terminal for that specific creative asset.
The First 1-2 Seconds Are Everything
On TikTok, the algorithm makes its content-matching decision based heavily on the opening frames of your video. This is not just about "hooking" the viewer (though that matters too). It's about giving the algorithm enough visual and audio signal to categorize your ad and match it to the right content-consumption patterns.
A video that starts with a slow logo reveal gives the algorithm almost nothing to categorize. It sees: a logo. That tells it nothing about who would want to watch this. A video that starts with a person holding a product in a specific lifestyle context gives the algorithm a rich set of content signals: product category, aesthetic positioning, demographic cues, lifestyle category, potential interest clusters.
This is also why TikTok-native content (shot vertically, using in-app editing, featuring real people) tends to outperform polished TV-style ads on the platform. It's not just that users prefer "authentic" content — it's that TikTok-native content gives the algorithm more usable signals for content-graph matching. The algorithm has billions of examples of similar content to reference, billions of patterns to match against, billions of data points for categorization. A polished TV-style ad sits in a sparse region of TikTok's content graph — there's less data to match against, which means less confident delivery.
TikTok's Feedback Loop Is Faster and More Brutal
On TikTok, the feedback loop is extremely fast. If users in the first batch (usually a few hundred people) don't engage with your content, the algorithm immediately suppresses distribution. There's no second chance. There's no "let me show it to a different audience and see." If the content signals don't work for the first test group, the content gets buried.
This means your creative has to work in the VERY first impression batch. You don't get the luxury of "let the algorithm learn over a few days" that you sometimes get on Meta. TikTok's algorithm makes its initial judgment fast, and that judgment is heavily influenced by the creative content signals in the first 1-2 seconds of the video.
TikTok's exact algorithm is as proprietary as Meta's. What is well-established from TikTok's own published transparency reports and industry analysis is that TikTok relies primarily on content interaction signals rather than social graph data. The implication for creative — that your content itself is the primary determinant of delivery — is directionally accurate and widely supported, even if exact internal mechanisms are not publicly disclosed.
What Winning TikTok Creative Looks Like (From a Signal Perspective)
Understanding TikTok's content-graph system leads to specific creative principles:
- Open with action, not branding. Show the product in use, a person reacting, a transformation beginning. This gives the algorithm visual content signals immediately.
- Use environmental context. A product shown in a kitchen, a gym, an office, a car — each environment is a content signal that helps the algorithm categorize and match.
- Let the person carry the signal. Age, style, body language, and energy level of the person in your ad are powerful content signals that help TikTok match to the right user profiles.
- Audio matters for matching. The sound your video uses is a content signal. Trending audio, voiceover style, music genre — all of it feeds into the content graph.
- Make text overlays readable. TikTok's system can process on-screen text. Clear, readable text that states the value proposition gives the algorithm another signal to work with.
The Implication for Multi-Platform Creative Strategy
If you run ads on both Meta and TikTok, the lesson is the same from both platforms — but the urgency is different:
- On Meta (2026): Creative is the PRIMARY targeting signal thanks to Andromeda, Lattice, and GEM. Social graph data still exists as a supplementary signal.
- On TikTok: Creative is the ONLY targeting signal. There is no supplementary data. No safety net. No fallback.
This is why the creative strategy framework in Parts 3-11 applies to both platforms. The principles of communicating clearly through creative — the hook, the visual, the copy, the format — are universal. TikTok just makes the stakes even higher.
Build your creative to satisfy the strictest standard — TikTok's content-only matching — and it will perform well on Meta too. The inverse is not true. Creative designed for Meta's more forgiving system (with social graph backup) often fails on TikTok because it doesn't communicate clearly enough through content alone.
Design for TikTok's standard. Deploy everywhere. Adapt the format, but keep the signal clarity.
What This Means for Every Creative Decision You Make
The practical translation: every image, every video, every word is programming three AI systems.
Bottom line: every image, every video, every word in your ad is telling three AI systems who to show it to. Stop thinking of creative as "making ads." Start thinking of it as "programming AI targeting." The framework in Parts 3-11 teaches you exactly how to do that.
The New Mental Model
Before reading this part, you probably thought of creative as "the thing people see." That's how 99% of advertisers think about it. Creative is a deliverable. Something the design team makes. Something you attach to a campaign.
After understanding Andromeda, Lattice, and GEM, the mental model should shift permanently:
Your creative is a set of instructions to three AI systems. Those instructions determine who sees your ad, how they're ranked, and how aggressively the system explores for new audiences. Every creative decision you make — consciously or not — is an input to this machine.
The question is: are you programming it intentionally, or are you sending random signals?
Most brands are sending random signals. They make creative decisions based on what "looks good" or what the designer preferred or what the last campaign used. They don't realize that every visual choice, every word of copy, every formatting decision is being read by AI systems that will use those signals to determine which humans see the ad and whether to invest in finding new audiences.
The brands that understand this — and intentionally craft creative to send the RIGHT signals to the RIGHT systems — are the ones that scale profitably in 2026. Everyone else is leaving performance on the table without knowing why.
Practical Implications
One of the most destructive things you can do in 2026 is send conflicting signals through your creative. Example: a premium visual style (luxury photography, elegant composition) combined with a discount hook ("70% OFF!!!"). These signals contradict each other. Andromeda doesn't know whether to retrieve premium buyers or bargain hunters. Lattice places your ad in an ambiguous position in embedding space. GEM sees mixed engagement signals and restricts exploration. The result: poor performance that nobody can diagnose because "the ad looks fine."
Alignment matters. Every element of your creative should tell the SAME story to the AI systems. Premium visual + premium copy + premium positioning. Or value-focused visual + value-focused copy + value-focused positioning. Not a mix.
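A toy example of why mixed signals hurt, using invented 2-D vectors: when a premium visual signal and a discount copy signal pull in opposite directions, the combined representation lands far from both user clusters.

```python
import numpy as np

# Invented 2-D toy to show why mixed signals land an ad between clusters.
premium_cluster = np.array([1.0, 1.0])     # users who respond to premium creative
discount_cluster = np.array([-1.0, -1.0])  # users who respond to discount creative

premium_visual = np.array([1.0, 1.0])
discount_hook = np.array([-1.0, -1.0])     # "70% OFF!!!"-style signal
mixed_ad = (premium_visual + discount_hook) / 2   # conflicting elements average out

for name, ad in [("aligned premium ad", premium_visual), ("mixed-signal ad", mixed_ad)]:
    d_premium = np.linalg.norm(ad - premium_cluster)
    d_discount = np.linalg.norm(ad - discount_cluster)
    print(f"{name}: distance to premium={d_premium:.2f}, to discount={d_discount:.2f}")

# The mixed-signal ad ends up equally far from both clusters — the
# "no man's land" described above.
```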
The Framework Ahead
Now that you understand HOW the machine works, the rest of this framework teaches you to USE these signals strategically:
- Part 3 — The Reverse Engineering Framework: how to look at your winning creatives, break down WHY they won (which signals they sent), and extract the DNA
- Part 4 — The 9 Creative Elements: a comprehensive checklist of every signal your creative sends
- Part 5 — AI Vision Analysis: using AI to read your ads the way the algorithm reads them
- Parts 6-11 — How to analyze, generate, test, scale, and operationalize creative using these principles
Everything from here forward is built on the foundation you just learned. The three systems. The signal chain. Creative as targeting. This is the lens through which every future creative decision should be made.
A Quick Summary of What You Now Know
| System | Function | Creative's Role | Published Result |
|---|---|---|---|
| Andromeda | Retrieval — who is eligible | Primary input for creative-first matching | +6% recall, +8% quality |
| Lattice | Ranking — who will convert | Determines position in unified embedding space | +12% quality, +6% conversions |
| GEM | Exploration — how far to reach | Quality determines exploration budget | +5% IG conversions, +3% FB |
These three systems work together on every single ad impression. They are running right now, on every ad you have active, making decisions based on your creative content. The question is not whether they affect you — they do, by definition. The question is whether you're giving them the right inputs.
When you move on to Part 3 — the Reverse Engineering Framework — you'll learn how to analyze winning creatives through this lens. Instead of saying "that ad worked because it looked good," you'll be able to say "that ad worked because it sent clear, aligned signals to Andromeda's retrieval, positioned itself precisely in Lattice's embedding space, and gave GEM enough confidence to explore aggressively." That is a fundamentally different — and far more useful — level of understanding.
Most advertisers in 2026 still think of creative and targeting as separate activities. They set up targeting in Ads Manager, then hand a brief to a designer. They don't understand that the designer IS setting the targeting — through every visual, copy, and format choice they make.
By understanding Andromeda, Lattice, and GEM, you now have a framework that less than 1% of advertisers possess. You understand not just that creative matters, but exactly how it matters — at the retrieval layer, at the ranking layer, and at the exploration layer. This is your edge. Use it.
Built by @itsmazinzaki — AVAMARTECH
Framework v1.0 — April 2026