c# - Bit order in a byte


Hello, I'm trying to understand how to get or set a bit, and I'm stuck on the bit order. Let's say I have the number 70, which is 01000110 in binary. I want to change the first bit to true so it becomes 11000110, which is 198. I don't understand, or I'm confused by, the methods I found:

public static void set(ref byte abyte, int pos, bool value)
{
    if (value)
    {
        // left-shift 1, bitwise OR
        abyte = (byte)(abyte | (1 << pos));
    }
    else
    {
        // left-shift 1, take complement, bitwise AND
        abyte = (byte)(abyte & ~(1 << pos));
    }
}

public static bool get(byte abyte, int pos)
{
    // left-shift 1, bitwise AND, check non-zero
    return (abyte & (1 << pos)) != 0;
}
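To see the behavior in question concretely, here is a minimal sketch that wraps those same set/get methods in a small program (the class name Bits and the Main driver are my own additions so the snippet compiles on its own):

```csharp
using System;

class Bits
{
    // The set/get methods from the question, unchanged in behavior.
    public static void set(ref byte abyte, int pos, bool value)
    {
        if (value)
            abyte = (byte)(abyte | (1 << pos));   // left-shift 1, bitwise OR
        else
            abyte = (byte)(abyte & ~(1 << pos));  // left-shift 1, complement, bitwise AND
    }

    public static bool get(byte abyte, int pos)
    {
        return (abyte & (1 << pos)) != 0;         // left-shift 1, bitwise AND, check non-zero
    }

    static void Main()
    {
        byte b = 70;                   // 01000110
        set(ref b, 7, true);           // position 7 sets the most significant bit
        Console.WriteLine(b);          // prints 198 (11000110)
        Console.WriteLine(get(b, 7));  // prints True
    }
}
```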

In these methods, when I want to change the first bit I have to pass position 7, which I guess is the index of the last of the 8 bits. Why is that? Why is the first bit in the byte changed by the index of the last?

Why is the first bit in the byte changed by the index of the last?

Basically, bits are referred to such that the least-significant bit is bit 0, the next is bit 1, and so on. For example:

bit:   76543210
value: 01000110

So the byte value 70 (decimal) has bits 1, 2 and 6 set. Just because we write down a byte with the most significant bit first doesn't mean we should regard that as the "first bit". (Indeed, I'd talk about it as the "most significant bit" or "high bit" instead of using "first" at all.)

The nice thing is that this scheme means each bit keeps the same value however long the value is - bit 0 is always "worth" 1, bit 1 is always worth 2, bit 7 is always worth 128, and so on.
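As a quick sanity check of those weights for the value from the question: 70 = 64 + 4 + 2, i.e. exactly bits 6, 2 and 1. A small sketch (the loop and printing are my own illustration):

```csharp
using System;

class BitWeights
{
    static void Main()
    {
        int value = 70; // 01000110
        int total = 0;
        for (int pos = 7; pos >= 0; pos--)
        {
            if ((value & (1 << pos)) != 0)
            {
                // Each set bit contributes 2^pos to the total.
                Console.WriteLine($"bit {pos} is set, worth {1 << pos}");
                total += 1 << pos;
            }
        }
        Console.WriteLine(total); // prints 70 (= 64 + 4 + 2)
    }
}
```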

Now, none of this affects the code, which doesn't care what we call things - it only cares about values. Fortunately, the naming convention helps here, too. You need to shift the value 1 (which is "just bit 0 set") left by pos bits to get a value with just bit pos set. For example, for bit 5, you shift 1 left by 5 bits to get 100000.

If you think of the value 1 as a full byte (00000001), it may become clearer:

00000001 << 0 = 00000001
00000001 << 1 = 00000010
00000001 << 2 = 00000100
00000001 << 3 = 00001000
00000001 << 4 = 00010000
00000001 << 5 = 00100000
00000001 << 6 = 01000000
00000001 << 7 = 10000000
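Putting the pieces together for the original example: shifting 1 left by 7 gives 10000000 (decimal 128), and OR-ing that mask into 70 sets the high bit, giving 198. A minimal sketch of that single step:

```csharp
using System;

class SetHighBit
{
    static void Main()
    {
        byte b = 70;                 // 01000110
        byte mask = (byte)(1 << 7);  // 10000000, i.e. 128
        b = (byte)(b | mask);        // 11000110
        Console.WriteLine(b);        // prints 198
    }
}
```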
