Hello,
I have a URL that’s whitelisted for Twitter Cards. All URLs display the card on Twitter unless they end with ?__twitter_impression=true
If a URL ends in /?__twitter_impression=true, I get the following error message:
ERROR: Fetching the page failed because it’s denied by robots.txt.
My robots.txt has the following:
User-agent: Twitterbot
Disallow:
Allow: /?
But it still doesn’t accept any URLs with an extra query parameter.
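For what it’s worth, here’s a quick sanity check of how a standard parser reads those exact rules, sketched with Python’s urllib.robotparser (example.com is just a placeholder host, and Twitterbot’s own matcher may of course behave differently):

```python
from urllib.robotparser import RobotFileParser

# Feed the exact rules from my robots.txt to Python's reference parser
rp = RobotFileParser()
rp.parse([
    "User-agent: Twitterbot",
    "Disallow:",   # an empty Disallow means nothing is blocked
    "Allow: /?",
])

# Check the URL form that Twitter appends its impression parameter to
print(rp.can_fetch("Twitterbot", "https://example.com/?__twitter_impression=true"))
```

This parser reports the URL as allowed, so I don’t see how the rules themselves could be blocking it, unless Twitterbot matches the query string differently or is caching an older robots.txt.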
Do you know why this is?
Cheers!