Calculating the signature (possible Twitter mistake?)


#1

I am trying to create a signature and have been following the "Calculating a Signature" section of the docs: https://dev.twitter.com/docs/auth/creating-signature

I used the given signing key and the given signature base string, fed them into my HMAC-SHA1 function, and got back the same digest shown in the example: ‘B6 79 C0 AF 18 F4 E9 C5 87 AB 8E 20 0A CD 4E 48 A9 3F 8C B6’ (excluding the spaces).

I then Base64-encoded that digest with a simple online encoder, since I am double-checking my code. I used http://ostermiller.org/calc/encode.html

Signatures after hashing and Base64 encoding:
online base64: YjY3OWMwYWYxOGY0ZTljNTg3YWI4ZTIwMGFjZDRlNDhhOTNmOGNiNg==
My base64: YjY3OWMwYWYxOGY0ZTljNTg3YWI4ZTIwMGFjZDRlNDhhOTNmOGNiNg==
Twitter’s base64: tnnArxj06cWHq44gCs1OSKk/jLY=

I was wondering how Twitter is arriving at that value?
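For reference, the comparison above can be reproduced in a few lines. This sketch (Python, standard library only) Base64-encodes the documented hex digest two different ways: once as the 40-character ASCII hex string, and once as the raw 20 bytes that hex represents:

```python
import base64
import binascii

# HMAC-SHA1 output from the docs, as hex with the spaces removed.
digest_hex = "b679c0af18f4e9c587ab8e200acd4e48a93f8cb6"

# Base64 of the hex *string* (what an online text encoder computes):
print(base64.b64encode(digest_hex.encode("ascii")).decode("ascii"))
# → YjY3OWMwYWYxOGY0ZTljNTg3YWI4ZTIwMGFjZDRlNDhhOTNmOGNiNg==

# Base64 of the raw 20 digest *bytes*:
print(base64.b64encode(binascii.unhexlify(digest_hex)).decode("ascii"))
# → tnnArxj06cWHq44gCs1OSKk/jLY=
```

The second value matches Twitter's, so the difference comes down to what is fed into the Base64 step: the ASCII hex rendering versus the underlying binary digest.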


#2

+1, same question here.


#3

Hi there, I just found the solution:
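For anyone landing on this thread later: the fix that usually resolves this exact discrepancy is to pass the raw binary HMAC-SHA1 digest to the Base64 encoder (`digest()`), not its hex rendering (`hexdigest()`). A minimal Python sketch; the signing key and base string below are placeholders for illustration, not the values from Twitter's docs:

```python
import base64
import hashlib
import hmac

def oauth_signature(signing_key: str, base_string: str) -> str:
    """Base64-encode the *raw* HMAC-SHA1 digest, not its hex string."""
    digest = hmac.new(signing_key.encode("utf-8"),
                      base_string.encode("utf-8"),
                      hashlib.sha1).digest()  # 20 raw bytes, not 40 hex chars
    return base64.b64encode(digest).decode("ascii")

# Placeholder inputs, for illustration only.
sig = oauth_signature("consumer_secret&token_secret", "POST&...&...")
print(sig)  # a 28-character Base64 string ending in '='
```

Feeding `hexdigest()` into the Base64 step instead would reproduce the 56-character value the online encoder gave above.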