Platforms like Google Ads and Microsoft Advertising now use AI to test and serve ad creative automatically. Responsive Search Ads (RSAs) and Performance Max campaigns mix and match headlines and descriptions in real time.
So does that mean manual A/B testing is dead? Not quite. But it has changed.
What AI Does Well
Automation is designed to optimise delivery based on predicted performance. RSAs automatically test combinations. Performance Max serves different creative sets across formats and audiences.
- Google uses live auction-time signals to choose creative
- Microsoft’s AI personalises ads based on user intent and placement
- Both platforms run continuous testing behind the scenes
This can outperform manual A/B tests in speed and scale. But it doesn’t tell you why one message works better than another; that still takes an experienced marketer to interpret.
What You Lose Without Manual Testing
AI testing happens behind a curtain. You don’t get full visibility into which exact combinations worked best or why.
- Google Ads only shows “Low,” “Good,” or “Best” performance labels
- No clear win/loss data or statistical confidence
- Testing variables like tone or value proposition becomes difficult
This is fine if your goal is raw performance. But if you want to learn what actually resonates with your audience, AI alone won’t give you the insight.
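To see how little the platform exposes, here is a minimal sketch of exporting per-asset performance labels with the official google-ads Python client (assuming a configured google-ads.yaml; the customer ID is a placeholder). The coarse label is essentially all the per-asset feedback you can pull:

```python
# Sketch: exporting RSA asset performance labels via the Google Ads API.
# Assumes the official google-ads Python client and a valid google-ads.yaml.
from google.ads.googleads.client import GoogleAdsClient

CUSTOMER_ID = "1234567890"  # placeholder account ID

client = GoogleAdsClient.load_from_storage()  # reads google-ads.yaml
ga_service = client.get_service("GoogleAdsService")

# ad_group_ad_asset_view returns one row per headline/description per ad;
# performance_label is the coarse LOW / GOOD / BEST (or LEARNING) rating.
# There are no impressions-by-combination and no win/loss statistics.
query = """
    SELECT
      asset.text_asset.text,
      ad_group_ad_asset_view.field_type,
      ad_group_ad_asset_view.performance_label
    FROM ad_group_ad_asset_view
    WHERE ad_group_ad_asset_view.field_type IN ('HEADLINE', 'DESCRIPTION')
"""

for batch in ga_service.search_stream(customer_id=CUSTOMER_ID, query=query):
    for row in batch.results:
        view = row.ad_group_ad_asset_view
        print(row.asset.text_asset.text,
              view.field_type.name,
              view.performance_label.name)
```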

When Manual Testing Still Matters
There are still strong use cases for controlled A/B testing, even in AI-driven environments:
- Message positioning: Test different value propositions
- Visual direction: Compare lifestyle vs product-focused creatives
- Offer framing: Run a headline with and without pricing cues
You can do this by separating RSAs into different ad groups or running split campaigns, then comparing conversion outcomes at the ad group or campaign level.
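To make that comparison rigorous rather than eyeballed, a standard two-proportion z-test works on the exported numbers. A minimal sketch, assuming you have clicks and conversions for each ad group (the figures below are hypothetical); this gives you the win/loss confidence the platform labels never surface:

```python
# Sketch: two-proportion z-test on conversion rates from two split ad groups.
# Figures are hypothetical; export the real counts from your reports.
from math import sqrt
from statistics import NormalDist

def ab_significance(conv_a, clicks_a, conv_b, clicks_b):
    """Return conversion rates for A and B plus a two-sided p-value."""
    rate_a, rate_b = conv_a / clicks_a, conv_b / clicks_b
    pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    z = (rate_a - rate_b) / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_a, rate_b, p

# Hypothetical: ad group A (value-led headlines) vs ad group B (price-led)
rate_a, rate_b, p = ab_significance(conv_a=120, clicks_a=2400,
                                    conv_b=96, clicks_b=2300)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p = {p:.3f}")
# A conventional read: p < 0.05 suggests the gap is unlikely to be noise.
```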
How to Test Inside an Automated System
Manual A/B testing and automation can work together if you plan carefully:
- Limit variables per test: change only one thing at a time
- Use campaign experiments in Google Ads to split traffic
- Allow enough time and volume for meaningful comparison (a rough sizing check is sketched below)
- Don’t rely on platform labels alone to judge creative performance
AI should handle the delivery. You should still define the test structure and interpret the results.
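On the "enough time and volume" point, a standard two-proportion power calculation gives a rough lower bound on clicks per variant before a test can reliably detect a given lift. A minimal sketch with hypothetical inputs:

```python
# Sketch: rough minimum clicks per variant for a meaningful A/B test.
# Standard two-proportion power calculation; inputs are hypothetical.
from math import ceil, sqrt
from statistics import NormalDist

def clicks_per_variant(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Clicks needed per variant to detect the given relative lift in CVR."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_power = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical: 4% baseline conversion rate, detect a 20% relative lift
print(clicks_per_variant(base_rate=0.04, relative_lift=0.20))
# About 10,300 clicks per variant; budget the test duration around that.
```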
The aim of A/B testing is to learn, not just optimise
A/B testing is about learning what matters to your audience so you can create better inputs for the system.
If you skip testing altogether, you become dependent on outputs you don’t fully understand. The platforms may optimise for performance, but your team misses out on the insight behind it.
ExtraDigital uses A/B testing to better understand what matters to your audience. That knowledge is fed straight back into producing more effective campaigns. Contact us for a review of how effectively your PPC campaigns are making use of A/B testing and Google Ads AI features.
Frequently Asked Questions
Is A/B testing still relevant in an AI-driven PPC environment?
Yes. While AI platforms automatically test and optimise ad delivery, manual A/B testing is still essential for understanding why certain messages perform better and what truly resonates with your audience.
What does AI testing do better than manual A/B testing?
AI excels at speed and scale. Platforms like Google Ads and Microsoft Advertising test creative combinations in real time using auction signals and user intent, often optimising faster than manual tests.
What insights are lost when relying only on AI optimisation?
AI provides limited transparency. You don’t get clear win/loss data, statistical confidence, or detailed insight into which messaging elements drove performance, making it harder to refine strategy. ExtraDigital helps uncover these insights through structured testing.
When should advertisers still run manual A/B tests?
Manual testing is valuable when testing message positioning, creative direction, or offer framing. Controlled tests allow advertisers to isolate variables and learn what truly influences conversions.
How should A/B testing work alongside automated PPC campaigns?
The most effective approach is to combine both. ExtraDigital uses A/B testing to generate insights, then feeds those learnings into AI-driven campaigns to improve performance and decision-making.