Card Validator


#1

I previewed the following URL in the Card Validator, but it keeps showing the error “ERROR: Fetching the page failed because it’s denied by robots.txt.”

URL: https://www-acc.hoikushibank.com/application/lp/step/ah/register

I already updated my robots.txt last Friday to allow the URL, and it works fine in other testers, but I still get the same error here. Is it a caching problem or something else?


#2

The problem seems to be fixed after I added

User-agent: Twitterbot
Disallow:

to my robots.txt, but I still have no idea why it didn’t work before.
This was the previous version:

User-Agent: MJ12bot
Disallow: /
User-Agent: Ahrefs
Disallow: /
User-Agent: WebMeUp
Disallow: /
User-Agent: Yandex
Disallow: /
User-Agent: baidu
Disallow: /
User-Agent: SemrushBot
Disallow: /

User-Agent: *
Allow: /
Disallow: /api/
Disallow: /application/
Allow: /application/lp/step/ah/
Disallow: /search/
Disallow: /sp/
Disallow: /*/list$
Disallow: /*/list?
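One possible explanation (an assumption, not confirmed by Twitter): Twitterbot may not honor the `Allow: /application/lp/step/ah/` override inside the `*` group, so the earlier `Disallow: /application/` blocked the URL until an explicit `Twitterbot` group was added. A quick way to sanity-check how a standard parser treats the rules is Python’s stdlib `urllib.robotparser`; note that real crawlers can implement precedence differently, so this is only a sketch:

```python
import urllib.robotparser

# Minimal excerpt of the fixed robots.txt: a blanket block for MJ12bot
# and an explicit "allow everything" group for Twitterbot.
ROBOTS = """\
User-agent: MJ12bot
Disallow: /

User-agent: Twitterbot
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

url = "https://www-acc.hoikushibank.com/application/lp/step/ah/register"
# An empty Disallow line means "nothing is disallowed" for that group.
print(rp.can_fetch("Twitterbot", url))  # True
print(rp.can_fetch("MJ12bot", url))     # False
```

Running the same check against the previous robots.txt (without the Twitterbot group) can show whether a given parser resolves the `Disallow: /application/` / `Allow: /application/lp/step/ah/` conflict the way you expect.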

#3