IF DESIGNERS WON’T MEASURE EFFECTIVENESS, AI WILL
I posted a reel recently about the gap between how creatives and clients evaluate AI-generated work. The most common pushback, by a mile, was some version of this: "you can't really measure the effectiveness of creative work." And I get why people think that. Because most of us were never trained to think that way.
Design school teaches you composition. Colour theory. Typography. Craft. Taste. All of that matters deeply. But somewhere between learning to kern and learning to present, nobody ever sat us down and said: what is this design supposed to achieve? And how will you know if it worked?
That gap has always been there. But it didn't used to be dangerous. Now it is. Because AI doesn't have that gap. AI doesn't care about aesthetics. It cares about what converts. What holds attention. What performs. And if the designer in the room can't articulate what their work is supposed to achieve in measurable terms, the machine will happily make that decision for them.
Ivan Chermayeff, the man behind the brand identities for Chase, NBC, PBS, and National Geographic, defined it clearly: "Design is directed toward human beings. To design is to solve human problems by identifying them and executing the best solution." Not the prettiest solution. Not the most awarded solution. The best solution. And "best" only means something when you've defined what the design is supposed to achieve.
So let's talk about what design effectiveness actually is, why most studios still aren't thinking about it, and why the rise of AI makes this the most urgent capability gap in our industry right now.
What is design effectiveness, and why don't most designers talk about it?
Design effectiveness is the measurable impact a piece of design has on the goal it was created to achieve.
That's it. Not how it looks. Not how it feels. Not whether it won an award or got featured on Best of Behance. Whether it did the thing it was supposed to do.
The Design Business Association in the UK has been running the Design Effectiveness Awards since 1989. Not 2024. Not since AI showed up. Since 1989. For over three decades, brand design studios have been submitting work and proving, with actual evidence, that their design delivered a tangible, measurable impact on business.
And these aren't ad agencies or growth-hacking startups. These are brand design studios. Packaging studios. Identity agencies. The kind of studios most of you either work in or aspire to run.
B&B Studio won the 2024 Grand Prix for their brand creation work on Mockingbird Raw Press, a smoothie brand that went from zero to the UK's fastest-growing smoothie brand and the number one contributor to overall category growth. That wasn't a website conversion metric. That was design reshaping a category. Changing how consumers perceived a product tier. Making people understand, through design, that this was worth paying more for.
White Bear rebranded Tom Parker Creamery, a 100-year-old local dairy, and the design unlocked national listings in 733 Sainsbury's stores, 300 Waitrose locations, and Ocado. Design effectiveness isn't always about click-through rates. Sometimes it's about transforming how a market sees you.
In 2026, the Grand Prix went to Xfacta's work for GoSolr in South Africa, repositioning solar energy to feel accessible and empowering during a national energy crisis. Design as education. Design as perception shift. Design as behaviour change.
So when someone says "you can't measure creative effectiveness," the DBA has more than three decades of evidence that says otherwise. The question isn't whether it's possible. The question is why most studios still aren't doing it.
What does design effectiveness actually look like beyond conversion metrics?
This is where the conversation usually gets stuck. People hear "measurement" and immediately think Google Analytics. Bounce rate. Sign-up conversions. A/B test results.
Those are valid. But they're not the whole picture. Not even close.
Design effectiveness includes:
Perception shift. Did the design change how people perceive a brand? Wolff Olins repositioned Orange from a stodgy telecom into a lifestyle brand in the 1990s. "The future's bright. The future's Orange." That wasn't a conversion metric. It was a fundamental shift in how millions of people related to a phone network. Measurable through brand tracking studies, sentiment analysis, and ultimately subscriber growth.
Category creation and disruption. Morrama's design work for Wild launched the world's first refillable deodorant. The design didn't just sell a product. It educated a market about an entirely new behaviour. That's measurable through category awareness research, first-mover adoption rates, and market share in a segment that literally didn't exist before.
Education. GoSolr's brand design made solar power feel achievable for ordinary South Africans during an energy crisis. The design's job wasn't to "look modern." It was to make people understand that this was something they could do. That's measurable through adoption rates, consumer comprehension studies, and market penetration.
Brand equity and commercial unlock. Tom Parker Creamery's rebrand didn't just look better. It changed the brand's commercial trajectory—from a regional dairy to a nationally listed brand. Measurable through retail listings growth, year-on-year revenue, and units sold.
Audience understanding. Fred. Olsen Cruise Lines' digital redesign by Else drove a 60% increase in website engagement and a 50% improvement in browsing-to-booking conversion. That's both a perception metric (people understand the brand better online) and a commercial metric (more people booking).
The point isn't that every design project needs a spreadsheet. The point is that every design project needs a goal. And once you have a goal, you have something to measure against.
Why does effectiveness start with strategy, research, and the brief?
Here's where most studios go wrong. And it's not at the measurement stage. It's long before that. It starts with the questions nobody asked before the work began.
The brief is part of it. But the brief isn't the beginning. Strategy and research are. Before you can write a goal into a brief, you need to understand the landscape you're designing into. Who is the audience, really? What does the competitive set look like? What perceptions exist right now, and which ones need to change? What does the brand mean to people today, and what should it mean after this project?
That's strategy work. That's research work. And it's where design effectiveness is actually born. The brief is the document that captures what you've uncovered. But the uncovering is where the value lives. We teach this entire process in the AI Branding Masterclass.
If a client asks you to redesign their website and the brief says "make it modern and clean," that's not a brief. That's a mood board note. There's nothing to measure because there's nothing to aim at. But if you've done the research and you know that 70% of first-time visitors leave within three seconds because they can't figure out what the company does, now you have a real problem to solve. The brief writes itself: "First-time visitors understand what we do within five seconds."
A brief that enables design effectiveness looks different. It defines what the design needs to achieve before anyone opens Figma:
"We need first-time visitors to understand what we do within five seconds." Now you're designing for comprehension. You can run a five-second test with real users and measure whether the design achieved it.
"We need to position this brand as the premium option in the category." Now you're designing for perception. You can measure it through consumer panel testing, brand tracking studies, or, like Mockingbird, category growth data.
"We need to reduce bounce rate on the pricing page by 15%." Now you're designing for retention. You can A/B test two layouts and see which one keeps people on the page.
"We need this packaging to communicate 'natural' and 'local' to a shopper scanning a supermarket shelf in under two seconds." Now you're designing for instant comprehension. You can test it with eye-tracking tools or attention prediction software before it ever reaches the shelf.
Every one of those is measurable. The tools exist. They've existed for years. The missing piece isn't technology—it's the habit of asking the right questions before the work starts.
As Massimo Vignelli, the man who designed the New York subway map, the American Airlines identity, and half the modernist canon, put it: "The first thing you need to make clear to a client is that you aren't there to answer his wants but to answer his needs." Wants are "make it modern." Needs are "make people understand what we do." One is decoration. The other is design.
How do you actually measure design effectiveness?
Once the brief defines the goal, the measurement toolkit is bigger than most designers realise:
A/B testing. Run two versions of a design. Different hero layouts, different headline treatments, different packaging concepts. Measure which one performs better against the defined goal. This isn't just for digital. You can A/B test print concepts through consumer panels, packaging through shelf simulation, and brand identities through controlled exposure studies.
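To make that concrete, here's a minimal sketch of the statistics behind "measure which one performs better": a two-proportion z-test that checks whether variant B actually beat variant A or just got lucky. The visitor and conversion counts are illustrative, and real testing platforms run this maths for you.

```python
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than variant A?

    conv_* = conversions observed, n_* = visitors shown each variant.
    Returns the observed lift and a one-sided p-value (small p => B likely better).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 1 - NormalDist().cdf(z)                        # one-sided: B > A
    return p_b - p_a, p_value

# Example: layout B converted 58/1000 visitors vs layout A's 41/1000
lift, p = ab_test(41, 1000, 58, 1000)
print(f"lift: {lift:.1%}, p-value: {p:.3f}")
```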
Five-second testing. Show a design to users for five seconds, then ask what they remember. What does the company do? What was the main message? Where was the call to action? If they can't answer, the design didn't communicate. Tools like UsabilityHub (now Lyssna) make this accessible and fast.
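Scoring a five-second test is simple enough to sketch. Assuming you've exported the free-text answers, a rough first pass might count any response that mentions a key concept from the brief; the concepts and the matching rule below are hypothetical, and tools like Lyssna handle the collection side for you.

```python
# Hypothetical scoring for a five-second test: after a 5-second exposure,
# respondents type what they think the company does. We count an answer as
# "comprehended" if it mentions any key concept from the brief.
KEY_CONCEPTS = {"accounting", "bookkeeping", "invoices", "small business"}

def comprehension_rate(answers):
    hits = sum(
        1 for answer in answers
        if any(concept in answer.lower() for concept in KEY_CONCEPTS)
    )
    return hits / len(answers)

answers = [
    "Some kind of accounting software?",
    "No idea, something corporate",
    "Bookkeeping for small businesses",
]
print(f"comprehension: {comprehension_rate(answers):.0%}")  # 67%
```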
Attention prediction. AI-powered tools like Attention Insight and Brainsight can now predict where a viewer's eyes will land on a design before it launches. Trained on millions of real eye-tracking data points, these tools generate heatmaps with 90–96% accuracy. You can test whether your CTA gets noticed, whether your brand mark holds attention, whether the visual hierarchy works—all before a single person sees the real thing.
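You don't need a commercial licence to see the principle in action. Here's a rough open-source stand-in using OpenCV's spectral-residual saliency model, far cruder than the eye-tracking-trained tools named above, but it demonstrates the workflow: score a mockup's attention distribution before a human ever sees it. The file name and CTA coordinates are placeholders.

```python
# A rough open-source stand-in for commercial attention-prediction tools:
# OpenCV's spectral-residual saliency model (requires opencv-contrib-python).
import cv2

image = cv2.imread("homepage_mockup.png")              # hypothetical mockup file
saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, saliency_map = saliency.computeSaliency(image)     # float map, values in [0, 1]

# Does the CTA region attract attention? (coordinates are illustrative)
x, y, w, h = 620, 410, 200, 60
cta_score = saliency_map[y:y + h, x:x + w].mean()
page_score = saliency_map.mean()
print(f"CTA saliency {cta_score:.2f} vs page average {page_score:.2f}")
```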
Brand tracking and sentiment. Pre- and post-launch brand perception studies, Net Promoter Score shifts, social sentiment analysis, consumer panel research. These measure whether a design changed how people feel about a brand, which is exactly the kind of goal boutique brand studios work toward.
Commercial metrics. Revenue lift, market share change, retail listings growth, customer acquisition cost changes. These are the metrics the DBA Design Effectiveness Awards are built on. They take longer to materialise, but they're the ones that prove design's value to the people who sign the cheques.
Focus groups and user research. For print, packaging, environmental design, and brand identity, real human responses still matter. The key is structuring the research around the goals defined in the brief, not asking "do you like it?" but "what does this communicate to you?"
McKinsey tracked 300 publicly listed companies over five years for their Business Value of Design report. The design-led companies, the ones that measured design performance with the same rigour they applied to revenue and costs, saw 32% higher revenue growth and 56% higher total returns to shareholders. But here's the kicker: over 50% of the companies studied had no objective way to measure their design team's output. Not because measurement was impossible. Because nobody had built the habit.
Why does this matter more now than ever before?
Because AI is about to commoditise aesthetics.
Good-looking design is getting cheaper every month. AI can generate polished layouts, brand concepts, social media assets, packaging mockups, and website designs at a speed and cost that would have been unthinkable three years ago. The output quality improves every cycle. It's not there yet for everything—but the trajectory is clear.
If your value proposition as a studio is "we make things look good," you're entering a race against a machine that gets faster and cheaper every six months. That's a race to the bottom with no exit.
But here's what AI can't do—at least not yet. It can't define what success looks like. It can't sit in a briefing meeting and ask the questions that turn a mood board note into a measurable design goal. It can't understand the cultural context of a brand, the competitive dynamics of a category, or the emotional nuance of how a specific audience relates to a product.
It can, however, optimise. And it's already doing it.
AI-powered A/B testing tools can now compress optimisation cycles from weeks to hours. They generate design variations, test them against each other, allocate traffic to winners, and implement the results automatically. Attention prediction tools can evaluate a design's visual hierarchy before it launches. Conversion rate optimisation platforms can restructure a page layout based on real-time user behaviour data, without a designer touching anything.
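The "allocate traffic to winners" mechanic usually comes down to a multi-armed bandit. Here's a minimal Thompson sampling sketch, not any particular platform's implementation: each variant keeps a running tally, each visitor is routed to whichever variant looks best on a fresh draw from its posterior, and the winner earns traffic automatically as evidence accumulates.

```python
# Minimal Thompson sampling sketch: the bandit logic behind automated
# "allocate traffic to winners" A/B platforms. Variant names and rates
# are illustrative.
import random

class Variant:
    def __init__(self, name):
        self.name = name
        self.successes = 0   # conversions observed
        self.failures = 0    # non-conversions observed

    def sample(self):
        # Draw from the Beta(1 + successes, 1 + failures) posterior
        # over this variant's unknown conversion rate.
        return random.betavariate(1 + self.successes, 1 + self.failures)

def choose(variants):
    # Route the visitor to the variant with the highest sampled rate.
    return max(variants, key=lambda v: v.sample())

def record(variant, converted):
    if converted:
        variant.successes += 1
    else:
        variant.failures += 1

# Simulate: layout B truly converts at 6%, layout A at 4%.
variants = [Variant("layout_a"), Variant("layout_b")]
true_rates = {"layout_a": 0.04, "layout_b": 0.06}
for _ in range(5000):
    v = choose(variants)
    record(v, random.random() < true_rates[v.name])
for v in variants:
    print(v.name, "received", v.successes + v.failures, "visitors")
```

Run it a few times and layout_b ends up receiving the large majority of the simulated traffic, which is the point: the system shifts exposure toward the effective design without anyone deciding when the test is "done".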
Here's the uncomfortable question: if a designer creates a website without defining goals, and an AI system later optimises that website for conversions based on behavioural data—who actually designed the effective version? The designer who made it look good, or the algorithm that made it work?
If designers don't own the effectiveness conversation, the goals, the brief, the measurement framework, then AI will. Not because AI is smarter. Because AI doesn't forget to ask what the goal is. AI doesn't skip the brief. AI doesn't assume that looking good is the same as working.
What happens to studios that don't make this shift?
They become decorators. Expensive ones, for a while. Then cheaper ones, as AI closes the gap on visual quality. Then replaceable ones, as clients realise they can get "modern and clean" from a tool that costs a fraction of a studio's day rate.
The studios that survive—and thrive—are the ones that own the entire loop. Brief to design to measurement to iteration. They define the goals. They design for outcomes. They measure whether the design achieved what it was supposed to achieve. And they iterate based on evidence, not gut feel.
That's not a reduction of creativity. That's an expansion of it. When you know what a design needs to achieve, the creative constraints become sharper, the decisions become more intentional, and the work becomes more defensible. You can walk into a client presentation and say: "We tested this against three alternatives, and this version achieved 40% higher comprehension in a five-second test." Try replacing that with a prompt.
Good design has always been about choices. AI just gives you more to choose from. But choosing well—knowing which option serves the goal, which creative direction will shift perception, which layout will hold attention in the right place—that's the skill. And it starts with the brief.
How do you start building design effectiveness into your studio?
You don't need to hire a data scientist. You don't need to overhaul your process overnight. You need to change one habit: start every project by defining what success looks like before anyone opens a design tool.
Fix the brief. If a brief doesn't include a measurable goal, send it back. "Make it modern" isn't enough. "Position us as the premium option in the category" is. "Increase demo sign-ups by 20% over the next quarter" is. "Make first-time visitors understand what we do within five seconds" is. The goal doesn't have to be a conversion metric—it can be perception, comprehension, recognition, behaviour change. But it has to be something you can point to afterward and say: that worked, or that didn't.
Build measurement into the process. Test before launch, not just after. Tools like Attention Insight plug directly into Figma. Five-second tests take minutes to set up. Consumer panel testing for packaging and brand work is more accessible than it's ever been. You don't need a research lab. You need the habit.
Train the whole team to think in outcomes. Not just the strategist. Not just the account director. Everyone. When a designer is making layout decisions, they should be thinking about visual hierarchy in terms of where the eye needs to go, not just what looks balanced. When a creative director is reviewing concepts, they should be asking "does this communicate the key message in two seconds?" not just "does this feel right?"
Use AI as a measurement accelerator. Predictive attention tools, automated A/B testing, conversion tracking. These aren't threats. They're the tools that make design effectiveness measurable faster and cheaper than ever before. The studios that adopt them will outperform the ones that don't. Not because AI replaces judgment, but because AI gives judgment better data to work with.
In the workshops we run with creative studios, this is consistently where teams get stuck. Not the AI tools—the measurement thinking. The brief. The goals. Once that's in place, AI becomes the accelerator that helps you test more, iterate faster, and prove the value of what you're producing. Without it, AI is just a faster way to produce work that nobody can prove is working.
The bottom line
Design effectiveness isn't new. The DBA has been awarding it since 1989. McKinsey has been tracking its business value for years. The studios winning in this space (B&B Studio, White Bear, Morrama, Else, Xfacta) aren't using some secret methodology. They're asking better questions, defining clearer goals, and measuring whether their work achieved what it set out to achieve.
What's new is the urgency. AI is making good-looking design cheap and fast. The thing that won't be cheap is knowing what to aim for and designing for outcomes. The studios that build this thinking into their process, starting with the brief, running through measurement, closing the loop with iteration, are the ones that will remain irreplaceable.
The studios that don't? They'll keep making beautiful work. And they'll keep watching AI systems quietly optimise it into something that actually performs, without a designer in the room.
If you're figuring out how to build this thinking into your studio, how to get your whole team fluent with AI while grounding it in design effectiveness, I'm happy to talk it through.
Frequently Asked Questions
Can you really measure design effectiveness?
Yes. Design effectiveness has been measured for decades. The DBA Design Effectiveness Awards have been doing exactly this since 1989. The tools range from A/B testing and five-second user tests to brand tracking studies, attention prediction software, and commercial metrics like revenue lift and market share change. The key is defining a measurable goal in the brief before the design work begins.
What metrics should a design studio track?
It depends on the project's goals. For digital design: conversion rate, bounce rate, time on page, comprehension (via five-second tests), and attention distribution (via predictive heatmaps). For brand and identity work: brand recall, sentiment shift, category awareness, retail listings growth, and revenue change. For packaging: shelf standout, purchase intent, and consumer comprehension. The metric follows the goal. Define the goal first.
How is AI changing design measurement?
AI is compressing measurement cycles. Predictive attention tools like Attention Insight and Brainsight can evaluate a design's visual hierarchy before it launches, with 90–96% accuracy compared to real eye-tracking studies. AI-powered A/B testing tools generate variations, test them automatically, and implement winners, reducing optimisation from weeks to hours. The risk for designers is that if they don't define what "effective" means, AI systems will default to optimising for whatever metric they're pointed at.
What's the difference between design effectiveness and design quality?
Design quality is subjective—it's about craft, taste, composition, and aesthetic judgment. Design effectiveness is measurable—it's about whether the design achieved the goal it was created for. The best work has both. A beautifully designed website that nobody understands has quality but not effectiveness. A high-converting landing page that looks generic has effectiveness but not quality. The studios that win have both. And they start with a brief that demands both.
How do I write a better creative brief?
Start with the outcome, not the aesthetic. Instead of "make it modern and clean," write "position this brand as the premium option in the category" or "increase demo sign-ups by 20% in the next quarter" or "first-time visitors should understand what we do within five seconds." Define what success looks like before the design work begins—then you have something to measure against. A good brief gives the design team creative freedom within strategic constraints, not a mood board with no target.
What are the DBA Design Effectiveness Awards?
The DBA Design Effectiveness Awards, run by the UK's Design Business Association since 1989, recognise design work that has had a tangible, measurable impact on business and society. Unlike traditional design awards that judge on aesthetic merit, these are judged by business leaders and require entries to present actual evidence of impact: facts, figures, and results. They're considered the most rigorous standard for measuring design effectiveness globally.