Card validator: ERROR: Fetching the page failed because it's denied by robots.txt


#1

I'm using WordPress, and lately the Card validator shows the following error:

Fetching the page failed because it’s denied by robots.txt.

URL example: https://ethletic.com/2018/07/festivalglimmer-ohne-mikroplastik/

What could be wrong?


#2

The error message tells you what is wrong.

Fetching https://ethletic.com/robots.txt returns:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /kontakt/haftung-copyright-datenschutz/
Disallow: /category/
Disallow: /filter/
Disallow: /page/
Disallow: /tag/
Disallow: /201*

Sitemap: http://ethletic.com/sitemap.xml

This says that for User-agent: * (i.e. all crawlers), Disallow: /201* applies. The page URL starts with /2018/07…, which matches that pattern, so all crawlers, including Twitterbot, are blocked from fetching it.
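
You can verify the match yourself. Below is a minimal Python sketch of the wildcard matching that modern crawlers apply to Disallow rules; rule_matches is a hypothetical helper written for illustration, not part of any library (the standard library's urllib.robotparser follows the original spec and does not expand * inside paths, so a manual check is clearer here):

import re

def rule_matches(pattern, path):
    # Translate robots.txt wildcards: '*' matches any character sequence,
    # and a trailing '$' anchors the pattern to the end of the path.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The Disallow: /201* rule matches the blog post's path:
print(rule_matches("/201*", "/2018/07/festivalglimmer-ohne-mikroplastik/"))  # True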

This is covered in the troubleshooting documentation on the developer site (search for “robots.txt”), and in the pinned troubleshooting post that you should check before posting.

If you’re having issues with your site, you may need to ask your webmaster for help configuring the robots.txt file.
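
One possible fix, assuming you want Twitter's crawler to read the posts while the other rules stay in place, is to add a dedicated group for Twitterbot to robots.txt. A group that names a specific user agent overrides the * group, and an empty Disallow value allows everything:

User-agent: Twitterbot
Disallow:

Alternatively, removing the Disallow: /201* line would unblock the date-based post URLs for every crawler.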


closed #3

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.