"Fetching the page failed because it's denied by robots.txt." error message


When I try to validate my Twitter cards, I am met with this message. I use Tumblr with a custom domain for my website.

When I visit thegirlgaze.net/robots.txt, it shows me:

User-agent: *
Disallow: /

https://thegirlgaze.net/post/169846108148/new-year-not-a-new-me is the post I am trying to create a twitter card for.



Your robots.txt is set to block all crawlers. You can allow them by changing the file to:

User-agent: *
Allow: /
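If you want to confirm what each version of the file actually does before editing anything, here is a small sketch using Python's standard-library robots.txt parser. It checks whether Twitter's crawler (which identifies itself as Twitterbot) would be permitted to fetch the post URL from this thread under the blocking and the allowing rules; the `Twitterbot` user-agent string is an assumption based on Twitter's documented crawler name, not something stated in this thread.

```python
from urllib.robotparser import RobotFileParser

# The two robots.txt variants discussed in this thread.
blocking = ["User-agent: *", "Disallow: /"]
allowing = ["User-agent: *", "Allow: /"]

# The post the original poster is trying to create a Twitter card for.
post_url = "https://thegirlgaze.net/post/169846108148/new-year-not-a-new-me"

def can_fetch(rules, url, agent="Twitterbot"):
    """Return True if `agent` may fetch `url` under the given robots.txt lines."""
    parser = RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch(agent, url)

print(can_fetch(blocking, post_url))  # False: "Disallow: /" blocks all crawlers
print(can_fetch(allowing, post_url))  # True: "Allow: /" permits fetching
```

This only checks the rules locally; after editing the file you would still re-run the Twitter card validator against the live URL.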


Okay, thank you. Do you have any suggestions about how I can do this?


All you need to do is edit the robots.txt file located in the root directory of your site.


Okay, but I have no idea where to find the root folder of my site.


It sounds like you may need to ask your web host for assistance. We’re not able to walk you through these kinds of issues as they are specific to how your site is hosted.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.