We have researched the timeout error and found that it is reproduced only for the linkis.com domain, while our other domain does not receive this error.
We tested both domains with links that refer to the same file on the same server:
http://check.ln.is/check.html - processed by the Crawler correctly
http://check.linkis.com/check.html - processed by the Crawler with a timeout error.
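For reference, this is roughly how we compared the two URLs from an outside host. It is only a minimal sketch using Python's urllib; the "Twitterbot/1.0" user-agent string and the 10-second timeout are our assumptions, since we do not know the validator's exact settings.

# Minimal sketch: fetch the same file via both domains and report status and latency.
# Assumption: "Twitterbot/1.0" user agent and a 10 s timeout approximate the validator.
import time
import urllib.request

URLS = [
    "http://check.ln.is/check.html",
    "http://check.linkis.com/check.html",
]

for url in URLS:
    req = urllib.request.Request(url, headers={"User-Agent": "Twitterbot/1.0"})
    start = time.time()
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{url} -> HTTP {resp.status} in {time.time() - start:.2f}s")
    except Exception as exc:
        print(f"{url} -> failed after {time.time() - start:.2f}s: {exc}")

Run this way, both URLs respond identically for us, which is why we suspect the difference lies between the validator and our network rather than on the server itself.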
When testing linkis.com, we found that the request from the validator does not reach our servers at all: no trace of it appears either in the web server's access log or in a tcpdump capture.
What are the possible reasons for this?
Currently, TwitterBot visits a large number of pages on our linkis.com domain (about 1500 per minute), and each request from the bot is processed very quickly (within about 1 second of the tweet being posted on twitter.com). Could this volume of visits from the bot be the reason for the timeout error?
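For context, this is roughly how we estimated the per-minute rate from the bot; a minimal sketch over a combined-format access log, where the log path and the "Twitterbot" user-agent substring are assumptions that would need adjusting to a particular setup.

# Minimal sketch: count TwitterBot hits per minute in a combined-format access log.
# Assumptions: the log path and the "Twitterbot" user-agent substring are illustrative.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
TS_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2})")  # day/mon/year:HH:MM

per_minute = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Twitterbot" not in line:
            continue
        match = TS_RE.search(line)
        if match:
            per_minute[match.group(1)] += 1

# Show the busiest minutes to see how close the rate gets to ~1500/min.
for minute, hits in per_minute.most_common(5):
    print(f"{minute}  {hits} requests")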