So I just checked again, and this doesn’t seem to be related to the user-agent string - it just doesn’t like responding to curl (and our internal crawler is doing basically the same thing and getting the same response). If it helps, the page returned contains references to cdn.distilnetworks.com, which I assume is a CDN fronting your site, along with a form to fill in to request whitelisting.
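In case it's useful for reproducing this on your end, here's a minimal sketch of that check (Python with the requests library; the URL is a placeholder for the affected page, and the user-agent strings are just examples to compare against):

```python
# Sketch: fetch the same URL with a few different User-Agent strings and see
# whether the Distil interstitial comes back regardless of UA.
import requests

URL = "https://example.com/affected-page"  # placeholder - substitute the real URL

USER_AGENTS = [
    "curl/7.68.0",                                # plain curl-style UA
    "Twitterbot/1.0",                             # crawler-style UA
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # browser-like UA for comparison
]

for ua in USER_AGENTS:
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    blocked = "distilnetworks.com" in resp.text   # interstitial references Distil's CDN
    print(f"{ua}: HTTP {resp.status_code}, Distil block page: {blocked}")
```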
My hypothesis is that the working URL was cached at some point in the past week, potentially before this anti-robot measure (or whatever is now in place) went into effect?
You may also want to check that Twitter’s IP ranges are not blacklisted somehow. We recently changed these - see Update to Twitter outbound IP configuration (may affect Cards).
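If it helps with that check, here's a rough sketch using Python's standard ipaddress module - the CIDR ranges below are placeholders (TEST-NET blocks), not real Twitter ranges, so please take the actual ones from the linked post:

```python
# Sketch: check whether a source IP seen in your logs (or in your firewall /
# blacklist rules) falls inside a given set of CIDR ranges.
import ipaddress

OUTBOUND_RANGES = [
    "192.0.2.0/24",     # placeholder (TEST-NET-1) - NOT a real Twitter range
    "198.51.100.0/24",  # placeholder (TEST-NET-2) - NOT a real Twitter range
]

def in_ranges(ip: str, cidrs: list[str]) -> bool:
    """Return True if the IP address falls inside any of the CIDR ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in cidrs)

# Example: an IP from your access logs for a failed Card fetch
print(in_ranges("198.51.100.23", OUTBOUND_RANGES))  # True (within the placeholder range)
```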