After doing some testing, there seems to be an issue as soon as the highest bit of the byte sent is set. I.e. values between
0 (decimal; 00000000 as 8 bit binary value)
127 (decimal; 01111111 as 8 bit binary value)
are sent correctly, but as soon as the eighth bit is set, i.e. from
128 (decimal; 10000000 as 8 bit binary value)
on, things get weird, although I'm not yet sure why. Would that correspond to what you're seeing or are you getting something else?
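A quick way to pin down whether the cutoff really is the most significant bit is to check each value against the 0x80 mask before sending it. This is just a diagnostic sketch, assuming the corruption tracks the MSB as described above:

```python
# Check which byte values have the most significant bit (0x80) set.
# Values 0-127 should be safe; 128-255 are the suspect range.
for v in (0, 127, 128, 255):
    msb_set = bool(v & 0x80)
    print(f"{v:3d} -> {v:08b}  MSB set: {msb_set}")
```

If the boundary in your tests lines up exactly with `v & 0x80`, that usually points at something in the chain treating the data as 7-bit (e.g. a serial link configured for 7 data bits, or a signed-char conversion) rather than random corruption.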