The Google Ads Audit Checklist Every Agency Needs
You’ve just onboarded a new client. They’ve given you access to their Google Ads account and said something along the lines of “our last agency wasn’t great, but we’re spending R80,000 a month and we think there’s room to improve.”
You open the account and find 47 campaigns, half of which are paused but still have active ad groups inside them. There are three different conversion actions all named “Purchase” with different attribution models. The search terms report shows 30% of spend going to irrelevant queries. And there’s a Display campaign that’s been running for nine months with a 0.02% CTR and no one has looked at it.
Welcome to the Google Ads audit. This is the checklist we use — every item, in order, with the reasoning behind why it matters. (If you haven’t already, pair this with a GTM audit — bad tag management is the root cause of most conversion tracking problems.)
Before You Touch Anything
Before changing a single setting, do two things:
Take screenshots. Every campaign’s settings page, every conversion action, every bid strategy. If the client later asks “what did you change?”, you need documentation of the before state. Google Ads’ change history helps, but it doesn’t capture everything — especially settings that were already wrong when you arrived.
Check the change history. Go to the change history for the last 90 days. Look for patterns: frequent bid strategy changes (a sign of panic optimisation), bulk uploads (could indicate automated rules gone wrong), or long periods of zero changes (neglect). This tells you what kind of account you’re dealing with before you start the audit.
Account Structure
Campaign Organisation
Open the campaign list and answer these questions:
- Are campaigns organised by business objective? Brand campaigns separate from generic. Shopping separate from search. Remarketing clearly labelled. If everything is in one campaign called “All Products,” that’s your first problem.
- Is there a clear naming convention? You should be able to tell what a campaign does from its name. “[Brand] - Search - Exact - Desktop” tells you everything. “Campaign 1 (copy)” tells you nothing.
- Are paused campaigns actually paused? Check inside paused campaigns for active ad groups and ads. Pausing a campaign doesn’t change the status of the entities inside it — if someone accidentally unpauses the campaign, everything inside goes live immediately with whatever settings were last configured.
- Geographic targeting. Check every campaign. The number of times I’ve seen a South Africa-only business targeting “All countries and territories” is genuinely alarming. Also check for “Presence or interest” vs “Presence” targeting — the default “Presence or interest” means Google will show your ads to people who’ve shown interest in your location but aren’t physically there. For most local businesses, that’s wasted spend.
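A naming-convention check like the one above is easy to script. A minimal sketch in Python (the regex pattern and the campaign names are illustrative assumptions; swap in whatever convention the account actually uses):

```python
import re

# Hypothetical convention: "[Brand|Generic|Remarketing] - [Channel] - ..."
# Adjust the pattern to the convention the account actually uses.
NAME_PATTERN = re.compile(
    r"^(Brand|Generic|Remarketing) - (Search|Shopping|Display|Video)( - \S.*)?$"
)

def flag_bad_names(campaign_names):
    """Return campaign names that don't follow the naming convention."""
    return [name for name in campaign_names if not NAME_PATTERN.match(name)]

campaigns = [
    "Brand - Search - Exact - Desktop",
    "Campaign 1 (copy)",   # no convention at all
    "Generic - Shopping",
]
print(flag_bad_names(campaigns))  # → ['Campaign 1 (copy)']
```

Run this over the exported campaign list and you have the “fix naming” line item for the findings table in seconds.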
Ad Group Structure
- Themed ad groups. Each ad group should have a clear keyword theme. If an ad group has more than 20 keywords covering multiple topics, it needs splitting. Tightly themed ad groups mean more relevant ads, which means higher quality scores, which means lower CPCs.
- Single keyword ad groups (SKAGs). These were best practice in 2019. In 2026 with broad match and smart bidding, they’re usually overkill. If the account is running hundreds of SKAGs, you might get better performance by consolidating into themed groups and letting Google’s AI do the matching.
- Ad group / keyword ratio. If there are ad groups with one keyword and ad groups with 200 keywords, the structure is inconsistent. Aim for 5-15 keywords per ad group as a general guideline.
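The keyword-count check is equally scriptable. A sketch assuming you have already exported ad group names and active keyword counts (the thresholds mirror the 5-15 guideline above):

```python
def audit_ad_group_sizes(keyword_counts, lo=5, hi=15):
    """Flag ad groups outside the 5-15 keyword guideline.

    keyword_counts: dict mapping ad group name -> number of active keywords.
    Returns a dict of offenders with the suggested action.
    """
    return {
        name: ("split" if count > hi else "consolidate")
        for name, count in keyword_counts.items()
        if count > hi or count < lo
    }

counts = {"running shoes - exact": 8, "everything": 212, "blue widgets": 1}
print(audit_ad_group_sizes(counts))
# → {'everything': 'split', 'blue widgets': 'consolidate'}
```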
Conversion Tracking
This is the most important section of the audit. Everything else — bidding, budgets, targeting — depends on accurate conversion tracking. If conversions are wrong, every optimisation decision based on them is wrong.
Conversion Actions
- List every conversion action. Go to Tools → Conversions. List them all. You’ll often find duplicates, test conversions that were never deleted, and actions that track meaningless events (like page views being counted as conversions).
- Primary vs secondary. Only primary conversions are used for bidding. Check that the right actions are set to primary. A common mistake: having “Add to Cart” and “Purchase” both set to primary, which double-counts the funnel and confuses smart bidding.
- Attribution model. Google defaulted everything to data-driven attribution in 2023. Check that this hasn’t been changed to last-click on some conversions and data-driven on others — inconsistency here makes cross-campaign comparison unreliable.
- Conversion window. The default is 30 days for click-through, 1 day for view-through. This is reasonable for most businesses. If someone’s changed it to 90 days, ask why — it might be appropriate for long sales cycles, or it might be artificially inflating conversion numbers to make the account look better than it is.
- Count. “Every” vs “One.” For e-commerce (purchases), use “Every.” For lead gen (form submissions), use “One.” Getting this wrong means either undercounting revenue or overcounting leads.
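Several of these checks reduce to simple list scans once the conversion actions are exported. A hedged sketch (the dict shape and the list of “funnel stage” names are assumptions for illustration, not the Google Ads API schema):

```python
from collections import Counter

def audit_conversion_actions(actions):
    """Flag duplicate names and funnel stages wrongly set to primary.

    actions: list of dicts with 'name' and 'primary' keys — a simplified
    stand-in for an export of the account's conversion actions.
    """
    issues = []
    name_counts = Counter(a["name"] for a in actions)
    for name, n in name_counts.items():
        if n > 1:
            issues.append(f"duplicate conversion action: {name} (x{n})")
    # Micro-conversions like Add to Cart should usually be secondary.
    funnel_stages = {"add to cart", "begin checkout", "page view"}
    for a in actions:
        if a["primary"] and a["name"].lower() in funnel_stages:
            issues.append(f"funnel stage set to primary: {a['name']}")
    return issues

actions = [
    {"name": "Purchase", "primary": True},
    {"name": "Purchase", "primary": True},
    {"name": "Add to Cart", "primary": True},
]
for issue in audit_conversion_actions(actions):
    print(issue)
```

The example account above would produce two findings: the duplicate “Purchase” actions and the “Add to Cart” primary, which is exactly the double-counting scenario described earlier.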
Tag Verification
- Is the Google Ads tag firing correctly? Use Google Tag Assistant, the GTM preview mode, or your browser’s developer tools. Check that the tag fires on the correct pages and passes the right conversion value.
- Enhanced conversions. If the client has enhanced conversions enabled, verify that first-party data (email, phone, name, address) is being hashed and sent correctly. Enhanced conversions can improve attribution by 5-15%, but only if implemented properly. A botched implementation is worse than none at all. Also check your Consent Mode V2 setup — without proper consent signals, enhanced conversions won’t fire for EEA or POPIA-compliant traffic.
- Google Analytics 4 integration. Check whether conversions are being imported from GA4 into Google Ads. If they are, make sure they’re not also being tracked by a Google Ads tag — that’s how you get double-counted conversions.
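If you want to sanity-check the hashing side of enhanced conversions, the core of the spec is: trim whitespace, lowercase, then SHA-256. A minimal sketch (simplified; Google’s documentation also specifies extra normalisation rules, e.g. for gmail.com addresses, so treat this as a starting point):

```python
import hashlib

def normalize_and_hash(email):
    """Normalise an email roughly the way enhanced conversions expects:
    strip surrounding whitespace, lowercase, then SHA-256 hex digest."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Both inputs should produce the identical hash after normalisation.
print(normalize_and_hash("  Jane.Doe@Example.com "))
print(normalize_and_hash("jane.doe@example.com"))
```

If the site is sending raw or inconsistently normalised values, the hashes won’t match on Google’s side and the enhanced conversion data is silently useless.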
Bidding Strategy
Current Strategy Assessment
- What bid strategy is each campaign using? List them. You’ll often find a mix of manual CPC, maximise clicks, target CPA, target ROAS, and maximise conversions across the account — with no clear rationale for why each campaign uses a different strategy.
- Is the strategy appropriate for the volume? Target CPA and target ROAS need conversion volume to work — roughly 30 conversions per month per campaign is the minimum. If a campaign gets 5 conversions a month and is running target ROAS, the algorithm doesn’t have enough data to optimise and you’re essentially running random bids.
- Learning period. Check if any campaigns are stuck in “Learning” or “Learning (limited).” This usually means the target CPA/ROAS is unrealistic, the budget is too restrictive, or someone keeps making changes that reset the learning period.
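The volume check above can be expressed as a one-liner. A sketch using the ~30 conversions/month rule of thumb (a guideline from this checklist, not an official Google threshold):

```python
def smart_bidding_ready(monthly_conversions, strategy, min_conversions=30):
    """Check whether a campaign has enough conversion volume for
    target-based bidding. Strategy names here are illustrative labels."""
    needs_volume = strategy in {"TARGET_CPA", "TARGET_ROAS"}
    return (not needs_volume) or monthly_conversions >= min_conversions

print(smart_bidding_ready(5, "TARGET_ROAS"))  # → False: not enough data
print(smart_bidding_ready(80, "TARGET_CPA"))  # → True
```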
Bid Strategy Alignment
- Portfolio bid strategies vs campaign-level. If related campaigns share a business objective (e.g., all lead gen campaigns), a portfolio bid strategy can optimise across them more efficiently than individual campaign targets.
- Target CPA/ROAS values. Are they realistic? Check the actual CPA/ROAS over the last 90 days and compare against the target. If the target CPA is R50 but actual CPA is R200, the bid strategy is fighting reality and losing. Start targets at or slightly above the current actual, then gradually tighten.
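The “start at or slightly above actual, then tighten” advice can be sketched as a small helper (the 10% step size is an assumption; adjust it to the client’s risk tolerance):

```python
def suggest_target_cpa(actual_cpa, target_cpa, step=0.10):
    """If the configured target is far below reality, suggest starting
    near the actual CPA and tightening gradually, rather than letting the
    bid strategy fight a target it cannot hit."""
    if target_cpa >= actual_cpa:
        return target_cpa  # target is already realistic, leave it
    # First adjustment: one step below the actual, then iterate from there.
    return round(actual_cpa * (1 - step), 2)

# Target of R50 against an actual CPA of R200: start near R180, not R50.
print(suggest_target_cpa(actual_cpa=200, target_cpa=50))  # → 180.0
```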
Budget Management
- Budget capping. Check if campaigns are “Limited by budget.” This means Google has demand to show your ads to more relevant users but can’t because the daily budget is exhausted. It’s either an opportunity to increase spend profitably or a sign that bids are too high.
- Shared budgets. Shared budgets can simplify management but reduce control. For high-priority campaigns, dedicated budgets are better — you don’t want your brand campaign stealing budget from your best-performing generic campaign.
- Monthly pacing. Google Ads can spend up to 2x the daily budget on any given day (averaging out over the month). If the client has a strict monthly cap, ensure the daily budget accounts for this — set it to monthly budget ÷ 30.4, not monthly budget ÷ 30.
- Wasted budget. Calculate the percentage of budget going to search terms that resulted in zero conversions over 90 days. In a neglected account, this is often 25-40%. That’s the immediate savings opportunity you can present to the client.
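Both the pacing maths and the wasted-spend calculation are simple enough to script. A sketch, assuming you have exported search-term cost and conversion pairs from the report:

```python
def daily_budget_for_cap(monthly_cap):
    """Google averages spend over ~30.4 days per month, so a strict
    monthly cap needs monthly / 30.4, not monthly / 30."""
    return round(monthly_cap / 30.4, 2)

def wasted_spend_pct(search_terms):
    """Share of spend on search terms with zero conversions.

    search_terms: list of (cost, conversions) tuples over the window.
    """
    total = sum(cost for cost, _ in search_terms)
    wasted = sum(cost for cost, conv in search_terms if conv == 0)
    return 100 * wasted / total if total else 0.0

terms = [(5000, 12), (3000, 0), (1000, 0), (1000, 3)]
print(daily_budget_for_cap(80000))        # → 2631.58
print(round(wasted_spend_pct(terms), 1))  # → 40.0
```

The R80,000/month client from the introduction gets a daily budget of R2,631.58, and the example term data shows 40% of spend going to zero-conversion queries — the headline number for the quick wins table.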
Keywords and Search Terms
Keyword Health
- Match types. What’s the mix of broad, phrase, and exact match? In 2026, Google’s recommendation is broad match plus smart bidding. That works well for accounts with sufficient conversion volume. For smaller accounts or tight budgets, phrase and exact match give you more control.
- Quality Score distribution. Pull quality scores for all active keywords. The distribution tells you about the account’s health:
  - 7+ on most keywords: healthy account
  - 4-6 on most keywords: needs ad relevance and landing page work
  - Below 4 on many keywords: structural problems — poor ad group theming, irrelevant landing pages, or fundamentally wrong keywords
- Keyword conflicts. Search for cases where the same keyword exists in multiple campaigns or ad groups. Internal competition cannibalises performance and confuses Google’s auction signals.
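Conflict detection is a grouping exercise once keywords are exported. A sketch (the tuple shape is an assumption about your export format):

```python
from collections import defaultdict

def find_keyword_conflicts(keywords):
    """Find keyword/match-type combinations that live in more than one
    campaign or ad group.

    keywords: list of (campaign, ad_group, keyword_text, match_type) tuples.
    """
    locations = defaultdict(set)
    for campaign, ad_group, text, match_type in keywords:
        locations[(text.lower(), match_type)].add((campaign, ad_group))
    return {kw: locs for kw, locs in locations.items() if len(locs) > 1}

kws = [
    ("Generic - Search", "shoes", "running shoes", "EXACT"),
    ("Brand - Search", "all", "running shoes", "EXACT"),
    ("Generic - Search", "shoes", "trail shoes", "PHRASE"),
]
conflicts = find_keyword_conflicts(kws)
print(list(conflicts))  # → [('running shoes', 'EXACT')]
```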
Search Terms Review
This is where the money is. Open the search terms report for the last 90 days and look for:
- Irrelevant queries. Search terms that have nothing to do with the business. Add them as negative keywords immediately.
- High-spend, zero-conversion terms. Sort by cost descending, filter for zero conversions. These are your biggest waste opportunities.
- Competitors’ brand names. If the client is appearing for competitor searches, that might be intentional (conquest strategy) or accidental (broad match gone wild). Clarify with the client.
- Negative keyword lists. Check if any exist. In many neglected accounts, there are zero negative keywords — which means every irrelevant query Google matched has been slowly draining the budget for months.
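Surfacing the high-spend, zero-conversion terms is the first thing worth scripting. A sketch over an exported search terms report (the dict keys are an assumption about the export):

```python
def biggest_waste(search_terms, top_n=5):
    """High-spend, zero-conversion search terms, sorted by cost descending.

    search_terms: list of dicts with 'term', 'cost', 'conversions' keys.
    """
    wasters = [t for t in search_terms if t["conversions"] == 0]
    return sorted(wasters, key=lambda t: t["cost"], reverse=True)[:top_n]

terms = [
    {"term": "free running shoes", "cost": 4200, "conversions": 0},
    {"term": "running shoes sale", "cost": 6100, "conversions": 14},
    {"term": "shoe repair near me", "cost": 1800, "conversions": 0},
]
for t in biggest_waste(terms):
    print(t["term"], t["cost"])
```

The output is effectively a ready-made negative keyword candidate list, ordered by how much each term is costing the client.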
Ad Copy and Assets
- Responsive Search Ads. Every ad group should have at least one RSA. Check ad strength — “Poor” and “Average” ads need headline and description improvements. But don’t obsess over “Excellent” — ad strength is Google’s opinion of your ad diversity, not a performance metric.
- Pin usage. Are headlines pinned to specific positions? Light pinning (pinning your best headline to position 1) is fine. Heavy pinning (every headline pinned) defeats the purpose of RSAs and limits Google’s ability to test combinations.
- Ad extensions / assets. Check that sitelinks, callouts, structured snippets, call extensions, and location extensions are all set up. Each extension type gives your ad more real estate and can improve CTR by 10-15%. Missing extensions is free performance left on the table.
- Landing page relevance. For each ad group, check that the final URL matches the keyword intent. Sending specific product keywords to a generic homepage URL is a common lazy mistake that tanks quality score and conversion rate simultaneously.
Display and Video Campaigns
These are often the neglected corners of a Google Ads account:
- Placement reports. Check where Display ads are actually showing. Mobile app placements (especially accidental games placements) often burn budget with zero value. Exclude low-performing app categories in your campaign’s placement settings and remove any placements with high impressions but zero conversions.
- Audience targeting. Display campaigns running on “Optimised targeting” with no seed audience are essentially running blind. Check that there’s a defined audience strategy — remarketing lists, custom segments, or in-market audiences at minimum.
- YouTube campaigns. If running, check view rates and earned actions. YouTube campaigns with sub-10% view rates usually have a targeting or creative problem.
The Quick Wins Table
After completing the audit, I structure findings into a table the client can immediately understand:
| Finding | Impact | Effort | Priority |
|---|---|---|---|
| Add negative keywords from search term report | High — recover 20-30% wasted spend | Low — 1 hour | Do first |
| Fix geographic targeting | High — stop international spend | Low — 5 minutes | Do first |
| Consolidate duplicate conversion actions | High — fix bidding accuracy | Medium — 1 hour + verification | This week |
| Add missing ad extensions | Medium — improve CTR 10-15% | Low — 30 minutes | This week |
| Restructure ad groups for relevance | Medium — improve quality scores | High — ongoing | This month |
| Migrate bid strategies appropriately | Medium — improve efficiency | Medium — needs learning period | This month |
The audit isn’t the end — it’s the starting point for a roadmap. Present the findings, quantify the waste, and show the client exactly what you’re going to fix and in what order. That’s what turns an audit into a retained engagement.
Automating the Audit
A thorough Google Ads audit takes 4-8 hours per account when done manually. For an agency managing 20+ accounts, that’s a full work week just on audits — time that should be spent on optimisation.
This is where automated auditing tools earn their keep. An AI agent with API access to Google Ads can run through this entire checklist in minutes: pull the account structure, flag conversion tracking issues, identify wasted spend in search terms, check bid strategy alignment against conversion volume, and generate the quick wins table automatically.
The agent doesn’t replace the strategist — you still need human judgment to prioritise findings and communicate with the client. But it eliminates the 6 hours of manual data pulling and lets you focus on the analysis and recommendations that actually move the needle.
Whether you use Alethia, build your own scripts, or audit manually with this checklist — the important thing is that you audit systematically rather than spot-checking whatever catches your eye first. The biggest waste in a Google Ads account is almost never the thing that’s most visible. It’s the quiet, persistent bleed that’s been running for months because nobody looked at the search terms report or checked whether enhanced conversions were actually firing.