Not sure you are doing anything wrong. As I mentioned in my post, on my Generation 1 and 1.53, calling analogReadResolution(12) in the sketch gives me the full 12-bit resolution of the A2D. I have tested this with a variety of sensors and am absolutely convinced that is the case. If you don't set it to 12 bits, you get the default of 10 bits, and, as I wrote in my previous post, setting it to 16 bits does not get you true 16-bit resolution; it just pads the 4 least-significant bits with zeros.
Changing the code to Serial.print(analogRead(Pin), BIN); makes it easy to see this. The screenshot below was taken a few minutes ago after connecting +5V to A0.
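In case it helps, here is the kind of test sketch I'm using. The pin choice and baud rate are just examples, and analogReadResolution() is only available on boards whose core supports it:

const int sensorPin = A0;        // example pin; use whichever analog pin you wired

void setup() {
  Serial.begin(115200);          // example baud rate
  analogReadResolution(12);      // request full 12-bit readings from the A2D
}

void loop() {
  int raw = analogRead(sensorPin);
  Serial.print(raw, BIN);        // print in binary so the bit pattern is visible
  Serial.print("  = ");
  Serial.println(raw);           // decimal value for comparison
  delay(500);
}

With +5 on the pin you should see all 12 bits set; swap the 12 for 16 and you can watch the 4 trailing zeros appear.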
Looks like you are getting 10-bit resolution and zero padding for the unused bits. Could this be a Gen 1 / Gen 2 difference? I don't know, sorry.