I am really interested in getting a response to this as well. I looked at the metadata of @davedawson, who started this thread, and I don’t see anything wrong with it. I read many of the other responses to the same problem from the last few hours and nothing seems to apply to me.
The page that I am trying to validate ( duo.pds1.lvlpurple.com:6001/episodeGuide/tv5:100318984 ) is a development site running on a Node.js web server.
I read this post: https://dev.twitter.com/docs/cards/troubleshooting#When_I_validate_I_get_the_message_exceeded_15.seconds_to_pink-floyd_while_waiting_for_a_response_for_the_request_including_retries_if_applicable
- I don’t have any robots.txt, so nothing should be disallowed.
- I don’t run Apache so I don’t have a .htaccess.
- I even removed the image from the metadata ( since it was not a ‘required’ field, I thought I could do without it ), so I am certainly not reaching the 2 MB limit.
- My server is hosted on Amazon so it should be pretty easy to access and if the Twitter crawler is in the US it should be pretty quick too.
- If Amazon was blocking access by Twitter, I am certain I would have found a blog post about it.
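On the robots.txt point above: since there is no robots.txt at all, crawling should be allowed by default, but for reference, if one were added later, an entry like this would explicitly permit Twitter’s crawler (Twitterbot is its documented user agent):

```
User-agent: Twitterbot
Disallow:
```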
So, having worked through the list of possible explanations in the above-mentioned post, I am stuck and not sure what to do.
The Facebook crawler has no trouble accessing the page, so I wouldn’t expect the Twitter crawler to have a problem either. In fact, as soon as I enter my URL in the Facebook developer tool, I see their crawler’s request hit my web server… but I see nothing when I use the Twitter validation tool. So I suspect the Twitter crawler never reaches my server at all, which would explain the 15-second timeout… but that does not help me solve the problem.
@davedawson: Hopefully Twitter will tell us there was a service outage in the last few hours… and I hope that will solve both our problems.