WARN: Not whitelisted | Unable to render Card preview


#1

Hi,

My phpBB forum at www.todoperros.com is having problems with sharing on Twitter. Shared links look horrible, with cards not being rendered.
It used to work fine, and I can’t understand what is happening.

This is an example URL:

Thanks for your help!


#2

Did you already go through the steps in the troubleshooting FAQ?


#3

The only issue I can see with your site is that the front page has no twitter:image tag. Other pages render summary_large_image cards fine in the final Tweet. You won’t see the cards in the Tweet composer window; that has always been the case.
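If you want to check for yourself, a rough sketch using curl and grep (assuming a Unix-like shell) is:

# Fetch the front page and search it for a twitter:image tag;
# no output means the tag is missing
curl -s http://www.todoperros.com/ | grep -i 'twitter:image'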


#4

I’m trying to validate http://www.solacecrafting.com/
I’ve tried everything from minimalist to full-bore markup, checked robots.txt, .htaccess, etc… The validator sees the root as having five meta tags, but says it is not whitelisted. If I input /index.html directly, it says no meta tags were found…?


#5

Your robots.txt (and possibly also your .htaccess) is blocking all user-agents from accessing the root of your domain. When I try to fetch it using curl -A Twitterbot http://www.solacecrafting.com/ I get a 403 error. This will not work. See the troubleshooting post for more information.
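If you want a one-line check, a variant that prints just the HTTP status code (same request, with extra flags to discard the body) would be:

# Request the page as Twitterbot and print only the status code
curl -s -o /dev/null -w '%{http_code}\n' -A Twitterbot http://www.solacecrafting.com/
# Currently this prints 403; a crawlable page would return 200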


#6

Thanks for the reply. Where can we run the curl command to check for ourselves? That would be a great addition to the sticky FAQ. I have a robots.txt in my root, but looking through my whole server I just found another one in an etc folder that must be overriding the root file. I’ll try to change that.

Update: Googlebot is not having any trouble crawling my site.
My robots.txt is:
User-agent: *
Disallow:
My .htaccess only had
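# (the F flag below returns 403 Forbidden, but only to user-agents matching libwww-perl)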
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* ? [F,L]
as far as blocking anything goes, but removing those lines didn’t help.


For http://www.solacecrafting.com/ the validator says:
INFO: Page fetched successfully
INFO: 5 metatags were found
WARN: Not whitelisted

For http://www.solacecrafting.com(slash)index.html it says:
(appending index.html directly kept embedding a preview in the forum)
INFO: Page fetched successfully
WARN: No metatags found


#7

I ran the curl command at the command line myself. I’m on a Mac, so I don’t know whether curl is pre-installed on whatever platform you’re using.
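(For what it’s worth, curl ships with macOS and most Linux distributions, and I believe recent Windows 10 builds include it too.) A headers-only request is enough to see the status line; something like this, assuming curl is on your PATH:

# Print the response headers only; the first line carries the HTTP status
# (-I sends a HEAD request, which some servers may treat differently from GET)
curl -I -A Twitterbot http://www.solacecrafting.com/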


#8

I’m showing all agents allowed on Google and many other SEO tool sites, including Twitterbot:
URL: http://www.solacecrafting.com/
User Agent: Twitterbot
Result: Allowed
Why would curl show something different? I’m confused =o

Edit: this tool
https://technicalseo.com(slash)seo-tools/robots-txt/
shows the URL as allowed, then a 403…??

Edit: the Facebook OpenGraph debugger is working fine…?


#9

It is definitely something to do with the way your web server is responding to the Twitterbot user-agent. The robots.txt checkers only evaluate the rules inside the file; they can’t see a server-level block like this one. I don’t know why, but if I request the page without specifying a user-agent string, the server works fine; specifying Twitterbot results in a 403.
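For the record, the two requests differ only in the user-agent string; a minimal way to reproduce the comparison (status codes are what I saw just now and may change):

# Default curl user-agent: the server returned 200
curl -s -o /dev/null -w '%{http_code}\n' http://www.solacecrafting.com/
# Twitterbot user-agent: the server returned 403
curl -s -o /dev/null -w '%{http_code}\n' -A Twitterbot http://www.solacecrafting.com/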


#10

Hmmm, thank you for looking at it for me, Andy. I will contact my hosting provider and see if they know anything.

