I am one of the developers working on this with Amy and Cassie. The key point about the images not showing is that we have NOT changed anything in our robots.txt before the issue started. Twitter previously read the share images and displayed them just fine with the same robots.txt we have now, and we do not have similar issues with any other share service. Online robots.txt validators, such as Google's, confirm that our rules permit the images we are trying to share.
So the question is: why does Twitter not accept the robots.txt rules we have? We have not gotten an answer to that here, so we are wondering how to escalate this issue.
To summarize:
As we have been told that Twitter supports the Allow directive in robots.txt, our current set of directives relevant here should work:
User-agent: *
Allow: /sites/default/files/styles/*
Disallow: /sites/
For an example URL such as https://www.commonsensemedia.org/sites/default/files/styles/share_link_image_large/public/blog/csm-blog/2016-08-10-10tvkids-blog-1138x658.jpg
the above rules should work.
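For what it's worth, we can check this with the longest-match precedence that Google documents (and that we are assuming Twitterbot also applies). The matcher below is my own sketch of that precedence, not Twitterbot's actual code:

```python
import re

def rule_matches(pattern, path):
    # Translate a robots.txt pattern into a regex: '*' matches any
    # sequence of characters, '$' anchors the end of the URL path.
    regex = ''.join(
        '.*' if ch == '*' else '$' if ch == '$' else re.escape(ch)
        for ch in pattern
    )
    return re.match(regex, path) is not None

def is_allowed(rules, path):
    # Longest matching pattern wins; Allow wins ties (Google-style
    # precedence; an assumption about how Twitterbot evaluates rules).
    winner = ('allow', '')  # no matching rule means the path is allowed
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            if len(pattern) > len(winner[1]) or (
                    len(pattern) == len(winner[1]) and directive == 'allow'):
                winner = (directive, pattern)
    return winner[0] == 'allow'

rules = [('allow', '/sites/default/files/styles/*'),
         ('disallow', '/sites/')]
path = ('/sites/default/files/styles/share_link_image_large/public/'
        'blog/csm-blog/2016-08-10-10tvkids-blog-1138x658.jpg')
print(is_allowed(rules, path))  # the Allow pattern is longer, so: True
```

Under those assumptions the Allow rule (29 characters) outranks the Disallow rule (7 characters) for this image path, which matches what the online validators tell us.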
In addition, last Friday I made a change to test the same rules as above, but this time specifically for Twitterbot:
User-agent: Twitterbot
Allow: /sites/default/files/styles/*
Disallow: /sites/
User-agent: *
Disallow: /
Using the Twitter Card validator, the image still does not show (tested on the dev.commonsensemedia.org domain).
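For reference, here is the group selection I am assuming Twitterbot performs on that file (preferring its named group over the `*` group, per the de-facto robots.txt standard). This is hypothetical code of my own, not anything from Twitter:

```python
def select_group(robots_txt, agent):
    # Parse robots.txt into (user-agents, rules) groups, then pick the
    # group naming this agent, falling back to the '*' group.
    groups, agents, rules = [], [], []
    for line in robots_txt.splitlines():
        line = line.split('#')[0].strip()
        if not line:
            continue
        key, _, value = line.partition(':')
        key, value = key.strip().lower(), value.strip()
        if key == 'user-agent':
            if rules:  # a new group starts; flush the previous one
                groups.append((agents, rules))
                agents, rules = [], []
            agents.append(value.lower())
        elif key in ('allow', 'disallow'):
            rules.append((key, value))
    if agents:
        groups.append((agents, rules))
    for group_agents, group_rules in groups:
        if agent.lower() in group_agents:
            return group_rules
    for group_agents, group_rules in groups:
        if '*' in group_agents:
            return group_rules
    return []

robots_txt = """\
User-agent: Twitterbot
Allow: /sites/default/files/styles/*
Disallow: /sites/

User-agent: *
Disallow: /
"""

print(select_group(robots_txt, 'Twitterbot'))
# [('allow', '/sites/default/files/styles/*'), ('disallow', '/sites/')]
```

Under that reading, Twitterbot should use only the Allow/Disallow pair in its own group and ignore the blanket `Disallow: /`, so the image should still be fetchable.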
So we are still where we started with this ticket: waiting on Twitter to tell us why these rules block Twitterbot from fetching the images, even though by inspection of the rules themselves the images should NOT be blocked.