Denied by robots.txt


The Card Validator now appears to obey robots.txt rules. This is an unhelpful change when working in a development environment to verify and validate changes. Is it possible to turn this off for card previews accessed through the developer tools?

Screenshot of what I see:


Our crawler respects robots.txt, as outlined in the “URL Crawling” section of [node:15977]. The Validator Tool validates that your content is configured correctly and that it is accessible by the crawler.


What needs to be added to robots.txt to permit Twitter’s crawler?


Never mind; I just realized it’s in the file you linked. Sorry!


Glad it helped!


What kind of help did you get? Please share it with others. I see many posts saying "it helped, it helped," but I never saw how. Please share.


Did you check the troubleshooting FAQ? Did you check your robots.txt file?
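For reference, a minimal robots.txt entry that permits Twitter's card crawler might look like this (assuming the crawler identifies itself with the standard `Twitterbot` user agent):

```
# Allow Twitter's card crawler to fetch everything
User-agent: Twitterbot
Disallow:
```

An empty `Disallow:` directive means nothing is blocked for that user agent, so the crawler can fetch the pages and images referenced by your card tags. If your robots.txt has a blanket `Disallow: /` rule, placing a more specific `Twitterbot` group like the one above lets the card crawler through while still blocking other bots.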