"ERROR: Fetching the page failed because it's denied by robots.txt."

whitelisting
robots

#1

We are absolutely sure that our website's robots.txt allows crawlers to fetch the URL:
http://rfbd.cm/rp88d507b8

However, Twitter says it is denied by robots.txt…

But last week it worked…

Can anyone at Twitter check what happened?


#2

The error message I’m seeing is that your site doesn’t have any cards markup / there are no metatags.


#3

Hi andypiper,

this is what I see:


#4

Weird, because I’m seeing:

INFO:  Page fetched successfully
WARN:  No metatags found
WARN:  this card is redirected to https://www.referboard.com/rp88d507b8

#5

That said, your robots.txt file is disallowing all robots from accessing /, which overrides all of the other rules in your file.


#6

We tried the following rules; are they compatible with Twitter?

Allow: /rp*
Allow: /robots.txt
Disallow: /dev/
Disallow: /


#7

The last line, Disallow: /, basically says: don't allow any robots to access anything below /, which is the root level of your site. So that stops any of the other rules from having any effect at all.
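As an aside, you can sanity-check rules like these locally before re-submitting to the validator. A minimal sketch using Python's standard-library `urllib.robotparser` (note this is only an approximation: different crawlers resolve Allow/Disallow conflicts differently, and Python's parser applies the first matching rule and does not understand `*` wildcards in paths, so the rule sets below are simplified, hypothetical examples rather than Twitterbot's exact logic):

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_lines, agent, url):
    """Parse robots.txt content and check whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(agent, url)

# A blanket "Disallow: /" blocks every path for this parser,
# regardless of any other rules.
broken = [
    "User-agent: *",
    "Disallow: /",
]

# Disallowing only specific directories leaves the short-link paths fetchable.
fixed = [
    "User-agent: *",
    "Disallow: /dev/",
]

print(allowed(broken, "Twitterbot", "http://rfbd.cm/rp88d507b8"))   # False
print(allowed(fixed, "Twitterbot", "http://rfbd.cm/rp88d507b8"))    # True
print(allowed(fixed, "Twitterbot", "http://rfbd.cm/dev/internal"))  # False
```

With no matching rule at all, the parser defaults to allowing the fetch, which is why dropping the blanket `Disallow: /` is enough to unblock the short links.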


#8

OK, but why do other robot engines accept ours?

We have already replaced "Disallow: /" with specific directories.

How long until Twitter re-caches the robots.txt?


#9

We don’t cache the robots file. I just checked your original shortened / redirected URL, and now that you’ve fixed the robots rules, it looks like the validator is showing a card.


#10

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.