Twitter Card Error "ERROR: Fetching the page failed because other errors."



Bill, what changes did you make?


Hi. GoDaddy suggested that we add the plugin "Really Simple SSL". I did that, but as far as I can tell it did not make any difference. I just checked Twitter Card Validation on both of the problem sites and still get the same error. I had been waiting to see if it was a Twitter cache issue, but I suspect it is something to do with GoDaddy. The GoDaddy server where the two affected sites live is a Linux server. The SSL is: Encryption Strength - GoDaddy SHA-2
What is the host type and Encryption of yours?


I have GoDaddy’s Standard SSL (256-bit encryption) on Linux. I get a worse grade than your site, though, when I test at (you get an A and I get a C grade).

Looks like that’s because my server uses RC4 and doesn’t have Forward Secrecy, while yours doesn’t support RC4 and does have Forward Secrecy. This would only affect clients, though. Perhaps it depends on what Twitterbot demands; I’m not sure.

Do you have the Premium SSL plan? I don’t know if it matters, since so far neither works.

Looks like all Really Simple SSL does is redirect http to https, which can already be done in your .htaccess file.


I can connect using cURL (however this is using TLS 1.0).

curl -A "Twitterbot" -v -1

When I use openssl to connect, it works using TLS 1.0, but I get a handshake failure when using SSLv2 and SSLv3. Note that when using TLS 1.0, it negotiates the less secure RC4.

openssl s_client -connect -tls1


Wow. Thanks Ralph, good detective work. Have you talked to GoDaddy about this? I am thinking that we need Andy’s help on Monday to figure out what is going on from the Twitter perspective.
You are absolutely right: the Really Simple SSL plugin, while useful, did not do anything I had not already done manually when we set the site up about a year ago.
I did use it on a new site and was pleased with it.


Currently I’m not sure what else I can add beyond what I’ve mentioned, which is what the crawler’s internal log reports.


This worked for me. I added the following to my .htaccess file; it redirects http to https except for Twitterbot. The card URL should then use http instead of https.

RewriteEngine On
RewriteCond %{HTTPS} !^on$
RewriteCond %{HTTP_USER_AGENT} !Twitterbot [NC]
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R,L]


Thanks Ralph. Glad you got yours working. I will try this on our site.


I’m having the same problem, and the log only shows minimal info: “ERROR: Fetching the page failed because other errors.” I tried @ralphsnart’s .htaccess solution, but no success. The URL: @andypiper, can you take a look at the log and give me some more info?



Make sure you’re not blocking Twitterbot. Add this to the end of robots.txt…

User-agent: Twitterbot
Disallow:

Also, your validation URL and image meta URL need to be http, not https.
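For completeness, a sketch of what the card markup might look like with plain-http URLs (the domain, image path, and title here are placeholders, not values from this thread):

```html
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Example page title">
<meta name="twitter:image" content="http://example.com/images/card.jpg">
```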


I get “handshake alert: unrecognized_name” from the remote address, which almost certainly means your web server’s SSL settings are not correct. There are a few threads, like this one and this one, which describe ways to resolve SSL issues so that strictly-configured Java-based code can work with them.


Thanks @andypiper, that helped. I reconfigured my settings and resolved the issue.


SSL was not helpful for me. Please check my application, which is based on the Django (Python) framework.

@andypiper, please help with this.


I also get “handshake alert: unrecognized_name” from an SSL connection to your site, so I believe this is likely to be a similar issue. Your certificate has a wildcard name of subject=/CN=*, which is not an exact domain match. Per the other threads I linked above, try modifying your ServerName and ServerAlias appropriately.
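As a sketch of that change, assuming an Apache server (the .htaccess fixes earlier in this thread imply one; the domain and certificate paths below are placeholders):

```apache
<VirtualHost *:443>
    # Exact hostname the certificate is issued for -- a mismatch here is
    # what triggers the unrecognized_name SNI alert from strict clients.
    ServerName www.example.com
    ServerAlias example.com

    SSLEngine on
    SSLCertificateFile    /path/to/cert.pem
    SSLCertificateKeyFile /path/to/key.pem
</VirtualHost>
```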

Also, please don’t tag individual members of Twitter Staff for help, since we may be unavailable from time to time.

Unable to render Card preview

@andypiper, can you help check my domain as well:
I receive the same error message.


Same issue, so I suggest you check the other threads I linked above for ideas to help resolve the problem with your site or server’s SSL setup.


@andypiper I’ve got the same issue with our web app. Currently SSL/HTTPS is disabled. When I copy the content of our crawler page onto another web server and serve it as a static file, it works fine.

When I run the Card validator, no request is logged.
The page is running on port 8080; maybe that’s the reason, I don’t know.
The server listens for a user agent matching /twitterbot/gi and serves the related static crawler page.


If you’re able to run it without an explicit port I suspect that may help.


@andypiper thanks for the fast reply! works fine :slight_smile: