Google provides translated versions of the Advertising Policies Help Center, though they're not meant to change the content of the policies. English is the official language used to enforce Google Ads policies. To view this article in a different language, use the language dropdown at the bottom of the page.
Google prioritizes user experience across all its products, and a key part of this is fostering a safe and trustworthy environment within the Google ad network. The Destination requirements policy strives to ensure that when users click on an ad and are sent to a landing page, that website is functional, useful, and easy to navigate. This also creates an ads ecosystem that's supportive of both advertisers and the people who interact with their ads.
Learn more about the Destination requirements policy.
Google requires that your ad destination and its content are crawlable by Google AdsBot web crawlers, so that Google can verify users are led to an ad destination that's relevant to the ad they've clicked. Ensure that Google can crawl your site effectively by using a URL structure that follows URL Structure Best Practices for Google Search.
The following would lead to disapproval for Destination not crawlable:
Destinations that aren't crawlable by Google Ads
Examples (non-exhaustive):
- Using exclusion files like "robots.txt" to restrict access to the majority or entirety of a site
- Using site settings that allow less crawling than is required for the number of ads you're running
Note: Even if you're not blocking Google Ads from crawling your content, you might be unintentionally limiting efficient crawls. This is particularly likely if you have recently submitted a large volume of ads to Google. If you use a click tracker for your ads, check whether it might be affecting the crawl capacity. If your website doesn't have sufficient crawl capacity, consider breaking up your ad submissions into smaller batches spread across several days.
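If you manage ad submissions programmatically, a minimal Python sketch of that batching idea is shown below. The ad identifiers, batch size, and schedule are all placeholders for illustration, not Google-recommended values:

```python
from itertools import islice

def batches(items, size):
    """Yield successive lists of at most `size` items."""
    it = iter(items)
    while chunk := list(islice(it, size)):
        yield chunk

# Hypothetical example: instead of submitting 5,000 new ads at once,
# submit 1,000 per day over five days so each day's crawl load is smaller.
all_new_ads = [f"ad-{i}" for i in range(5000)]    # placeholder identifiers
daily_batches = list(batches(all_new_ads, 1000))  # 5 batches, one per day
```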
About robots.txt files and related disapprovals
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. Learn more about How Google Interprets the robots.txt Specification.
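As an illustration, here is a minimal robots.txt sketch. AdsBot-Google and AdsBot-Google-Mobile are Google's documented Ads crawler user agents; the /private/ path is a placeholder. Note that Google documents AdsBot as ignoring the global (*) group unless it is named explicitly, so rules addressed to AdsBot directly are what matter here:

```
# A robots.txt that allows the Google Ads crawlers while still
# restricting generic crawlers from a placeholder /private/ section:
User-agent: AdsBot-Google
User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: *
Disallow: /private/

# By contrast, a group like the following would block the Ads crawler
# from the whole site and trigger this disapproval:
#
#   User-agent: AdsBot-Google
#   Disallow: /
```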
Common reasons for "Destination not crawlable due to robots.txt" include:
- Server's robots.txt disallows access: Your robots.txt file doesn't allow crawl access, so Google Ads couldn't crawl your site.
- Server's robots.txt unreachable or Timeouts reading robots.txt: Google Ads couldn't crawl your site because it couldn't read your robots.txt file.
Learn more about how to Submit Updated Robots.txt.
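If you want a quick local check before resubmitting, the following Python sketch uses the standard library's urllib.robotparser to test whether a landing page is fetchable under the AdsBot user agents. The URLs are placeholders. One caveat: the stdlib parser applies the global (*) group to any agent that lacks its own group, whereas Google's AdsBot ignores the global group, so this check is stricter than Google's actual behavior:

```python
from urllib.robotparser import RobotFileParser

# User-agent tokens for the Google Ads crawlers.
ADSBOT_AGENTS = ["AdsBot-Google", "AdsBot-Google-Mobile"]

def adsbot_can_fetch(robots_url: str, page_url: str) -> dict[str, bool]:
    """Report whether each Ads crawler may fetch page_url per robots.txt.

    May raise urllib.error.URLError if robots.txt is unreachable at the
    network level, which mirrors the "robots.txt unreachable" reason above.
    """
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()
    return {agent: parser.can_fetch(agent, page_url) for agent in ADSBOT_AGENTS}

if __name__ == "__main__":
    # example.com is a placeholder domain.
    print(adsbot_can_fetch("https://example.com/robots.txt",
                           "https://example.com/landing-page"))
```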
Options to fix
If this policy is affecting your ad, review your options to fix below.
Allow Google AdsBot web crawlers to access your ad destinations
- Make sure you're not using exclusion files like "robots.txt" to restrict access to the majority or entirety of your site. You can use Google Search Console to verify that your pages are accessible, and to check for crawl errors or a low crawl rate.
- If you use a click tracker for your ads, make sure that it's not affecting the crawl capacity (see the sketch after this list).
- If you can't resolve the issue yourself, work with your web developer to make your app or website accessible to the Google AdsBot web crawlers.
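One way to sanity-check a click tracker is to follow its redirect chain and confirm it resolves quickly to the intended landing page. A minimal Python sketch with a placeholder tracking URL (this only exercises HTTP behavior from your machine, not Google's actual crawl):

```python
import urllib.request

def resolve_final_destination(tracking_url: str, timeout: float = 10.0):
    """Follow HTTP redirects from a click-tracking URL and return the
    final landing page URL and its HTTP status code."""
    req = urllib.request.Request(
        tracking_url,
        # Identify as the Ads crawler so agent-specific redirects surface.
        headers={"User-Agent": "AdsBot-Google"},
    )
    # urlopen follows redirects automatically; slow hops count against
    # the timeout, which is one way tracker latency can show up.
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.geturl(), resp.status

# Example with a placeholder URL:
# final_url, status = resolve_final_destination("https://tracker.example.com/click?id=123")
```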
Choose a different destination
Update your ads to use a different destination that is compliant with this policy.
Edit your ads to comply with this policy
- Go to Ads within the Campaigns menu.
- Hover over the ad or asset and select Edit.
- Edit the ad or asset so that it complies with the policy.
- Select Save.
Your ad will be automatically reviewed again. Check the ad's status on the "Ads & assets" page for updates.
Learn more about how to Fix an ad with policy violations.
Appeal policy decision
If you believe there's been an error and that you haven't violated Google Ads policies, appeal the policy decision directly from your Google Ads account to request a review. If the review determines that your ads are compliant, they can run again. If you can't fix these violations or choose not to, remove your ad to help prevent your account from being suspended for repeated policy violations.