Card Validator


I previewed the following URL in the Card Validator, but it keeps showing the error “ERROR: Fetching the page failed because it’s denied by robots.txt.”


I already updated my robots.txt last Friday to allow the URL, and it works fine in other testers, but I still get the same error message. Is it a caching problem or something else?


The problem seems to be fixed after I added

User-agent: Twitterbot

to my robots.txt, but I still have no idea why it didn’t work before.
This is the previous version:

User-Agent: MJ12bot
Disallow: /
User-Agent: Ahrefs
Disallow: /
User-Agent: WebMeUp
Disallow: /
User-Agent: Yandex
Disallow: /
User-Agent: baidu
Disallow: /
User-Agent: SemrushBot
Disallow: /

User-Agent: *
Allow: /
Disallow: /api/
Disallow: /application/
Allow: /application/lp/step/ah/
Disallow: /search/
Disallow: /sp/
Disallow: /*/list$
Disallow: /*/list?
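For what it’s worth, you can sanity-check a robots.txt against a given user agent with Python’s standard-library urllib.robotparser. A quick sketch (example.com and the path are placeholders, and the file is abridged to the relevant groups): under this parser, Twitterbot falls through to the User-Agent: * group of the previous file and is already allowed, which is why the original error was so surprising.

```python
from urllib import robotparser

# Previous robots.txt from the post, abridged to the relevant groups
ROBOTS_TXT = """\
User-Agent: SemrushBot
Disallow: /

User-Agent: *
Allow: /
Disallow: /api/
Disallow: /search/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Twitterbot matches no named group, so the "*" group applies and allows it
print(rp.can_fetch("Twitterbot", "https://example.com/some-page"))  # True

# An explicitly blocked crawler is denied by its own group
print(rp.can_fetch("SemrushBot", "https://example.com/some-page"))  # False
```

Crawlers do differ in how they interpret robots.txt (for example, RFC 9309 specifies longest-match rule precedence, while some parsers apply rules in file order), so adding a group that names Twitterbot explicitly removes any ambiguity about which rules apply to it.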
