Could you pass this question along to your developers?
This appears NOT to work, despite being a standard set of robots.txt directives:
User-agent: Twitterbot
Allow: /sites/default/files/styles/*
Disallow: /sites/

User-agent: *
Disallow: /
This DOES appear to work, despite allowing Twitterbot access to everything on our site, which we clearly do not want:
User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /
The above are test robots.txt files, not our production file; the production robots.txt has many more Disallow directives marking a number of paths as disallowed. We would like to keep our current Allow path followed by Disallow path, as in the first entry above, since it keeps a single block per robot AND since it is what we've had all along. That setup USED to work just fine with Twitter, and it still works fine with every other popular share tool (Facebook, G+, Pinterest, etc.).
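For what it's worth, the difference between the two test files can be reproduced with Python's standard urllib.robotparser, which, like many simple parsers, matches rule paths by literal prefix and does not treat * as a wildcard. We obviously can't know exactly how Twitterbot parses robots.txt, so this is only a sketch of one plausible explanation (example.com and the image path below are placeholders):

```python
import urllib.robotparser

# Our first (failing) test robots.txt: Allow with a trailing wildcard.
WITH_WILDCARD = """\
User-agent: Twitterbot
Allow: /sites/default/files/styles/*
Disallow: /sites/

User-agent: *
Disallow: /
"""

# The second (working, but far too permissive) test robots.txt.
ALLOW_ALL = """\
User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /
"""

def can_fetch(robots_txt, agent, url):
    """Parse a robots.txt string and ask whether agent may fetch url."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

img = "https://example.com/sites/default/files/styles/large/public/photo.jpg"

# A wildcard-unaware parser compares paths by literal prefix, so the
# trailing "*" keeps the Allow line from ever matching, and
# "Disallow: /sites/" wins:
print(can_fetch(WITH_WILDCARD, "Twitterbot", img))   # False

# The blanket "Disallow:" record allows Twitterbot everything:
print(can_fetch(ALLOW_ALL, "Twitterbot", img))       # True

# Dropping the "*" turns the Allow line into a plain prefix match,
# which such parsers DO honor:
print(can_fetch(WITH_WILDCARD.replace("styles/*", "styles/"),
                "Twitterbot", img))                  # True
```

If Twitter's fetcher behaves like this, then removing the trailing * from the Allow line (leaving a bare path prefix, which wildcard-aware crawlers such as Google's treat as equivalent anyway) might be worth testing, since it would let us keep the Allow/Disallow structure we have now.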