WARN: Not whitelisted | Unable to render Card preview



My phpBB forum at www.todoperros.com is having problems when links are shared on Twitter: the cards are not being rendered, so shared links look terrible.
It used to work fine, and I can’t understand what is happening.

This is an example url:

Thanks for your help!


Did you already go through the steps in the troubleshooting FAQ?


The only issue I can see with your site is that the front page has no twitter:image tag. Other pages render summary large image cards fine in the final Tweet. You won’t see the cards in the Tweet composer window, and that has always been the case.
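For reference, a summary large image card generally needs at least these meta tags in the page’s `<head>` (the values below are placeholders, not taken from your site):

```html
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Page title here">
<meta name="twitter:description" content="Short description here">
<meta name="twitter:image" content="https://example.com/preview.jpg">
```

Without the `twitter:image` tag, the rest of the card can still validate but no image will appear.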


I’m trying to validate http://www.solacecrafting.com/
I’ve tried everything from minimalist to full-bore markup, checked robots, htaccess, etc… The validator sees the root as having five meta tags, but says not whitelisted. If I input /index.html directly it says no meta tags were found…?


Your robots.txt (and possibly also your htaccess) is blocking all user-agents from accessing the root of your domain. When I try to fetch it using curl -A Twitterbot http://www.solacecrafting.com/ I get a 403 error. This will not work. See the troubleshooting post for more information.
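If you want to reproduce that check without curl, here is a minimal Python 3 sketch (stdlib only) that fetches a URL with a chosen User-Agent header and reports the status code the server sends back. The commented-out calls just use the URL from this thread:

```python
# Sketch: reproduce `curl -A Twitterbot <url>` in Python 3.
import urllib.error
import urllib.request


def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for this user-agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the code is still what we want.
        return err.code


# Example (needs network access; results per this thread, not guaranteed):
# fetch_status("http://www.solacecrafting.com/", "Twitterbot")  # 403 here
# fetch_status("http://www.solacecrafting.com/", "curl/8.0")    # page loads
```

If the two calls return different codes, the server is treating the Twitterbot user-agent specially, which is exactly what the validator is running into.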


Thanks for the reply. Where can we run the curl command to check for ourselves? That would be a great addition to the sticky FAQ. I have a robots.txt at my root, but looking through my whole server I just found another one in an etc folder that must be overriding the root file. I’ll try to change that.

update: googlebot is not having any trouble crawling my site.
my robots.txt is:
User-agent: *
my .htaccess only had
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* ? [F,L]
as far as blocking anything goes, but removing it didn’t help.
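As an aside (probably not the cause here, since curl gets a 403 rather than a robots.txt block): a robots.txt record normally pairs each User-agent line with at least one Disallow line. A fully permissive file would look like:

```text
User-agent: *
Disallow:
```

An empty Disallow value means nothing is disallowed; most crawlers also treat a bare `User-agent: *` with no rules as allow-all.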

INFO: Page fetched successfully
INFO: 5 metatags were found
WARN: Not whitelisted

(appending index.html directly was posting a preview in the forums)
INFO: Page fetched successfully
WARN: No metatags found


I ran the curl command at the command-line myself - I’m on a Mac, so I don’t know whether curl is pre-installed on whatever platform you’re using.


Google’s robots.txt tester and many other SEO tool sites show all agents as allowed, including Twitterbot:
URL: http://www.solacecrafting.com/
User Agent: Twitterbot
Result: Allowed
Why would curl show different? I’m confused =o

edit: this tool shows approved, then a 403…??

edit: facebook opengraph is debugging fine…?


It is definitely something to do with the way your web server is responding to the Twitterbot user-agent. I don’t know why, but if I request the page without specifying a user-agent string, the server works fine; specifying Twitterbot results in a 403.
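For what it’s worth, a hypothetical Apache rule of the same shape as the libwww-perl one quoted earlier would reproduce exactly this symptom. I don’t know what your host actually has configured, but something like this could be sitting in a server-level config or a parent .htaccess outside your own:

```apache
# Hypothetical example only - a rule like this would return 403 to
# Twitterbot while letting every other user-agent through.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Twitterbot [NC]
RewriteRule .* - [F,L]
```

The `[F]` flag is what forces the 403 Forbidden response; `[NC]` makes the user-agent match case-insensitive.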


Hmmm, thank you for looking at it for me, Andy. I will contact my hosting provider and see if they know anything.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.