Calculating the signature (possible Twitter docs mistake?)


I am trying to create a signature and have been working through the "Calculating a signature" portion of the documentation.

I used the given signing key and the given signature base string. I fed them into my HMAC-SHA1 function and got the same digest shown in the example: B6 79 C0 AF 18 F4 E9 C5 87 AB 8E 20 0A CD 4E 48 A9 3F 8C B6 (excluding the spaces).

I then Base64-encoded that digest using a simple online encoder, since I am double-checking my code, but I could not reproduce the value shown under "Signatures after encryption and encoding":

Twitter's base64: tnnArxj06cWHq44gCs1OSKk/jLY=

I was wondering how Twitter arrives at that value?
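In case it helps: HMAC-SHA1 produces 20 raw bytes, and the hex string in the docs is just a human-readable display of those bytes. Twitter base64-encodes the raw bytes, whereas most online encoders base64-encode whatever text you paste in, i.e. the 40-character hex string itself, which gives a completely different result. A quick Python sketch using the digest from the question (the `signing_key`/`base_string` placeholders at the end are hypothetical, not Twitter's real example values):

```python
import base64
import binascii
import hashlib
import hmac

# Hex digest from the question, spaces removed.
hex_digest = "B679C0AF18F4E9C587AB8E200ACD4E48A93F8CB6"

# Base64 of the raw 20 bytes -> Twitter's value.
raw = binascii.unhexlify(hex_digest)
print(base64.b64encode(raw).decode())          # tnnArxj06cWHq44gCs1OSKk/jLY=

# Base64 of the hex *text* (what an online encoder does) -> something
# entirely different, and much longer (40 chars in -> ~54 chars out).
print(base64.b64encode(hex_digest.encode()).decode())

# In real code, skip the hex round-trip entirely and encode the digest
# directly. The key and base string here are hypothetical placeholders:
signing_key = b"consumer_secret&token_secret"
base_string = b"POST&https%3A%2F%2Fexample.com&..."
signature = base64.b64encode(
    hmac.new(signing_key, base_string, hashlib.sha1).digest()
).decode()
```

So if your hex digest already matches the docs, your HMAC is fine; the mismatch comes purely from encoding the hex text instead of the raw bytes.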


+1, I have the same question.


Hi there, I just found the solution: