ABSTRACT

The electronic components of modern computers operate in only two possible states, high or low voltage. Denoting a high voltage by 1 and a low voltage by 0 gives the two binary digits and, as a result, the binary number system was adopted long ago for nearly every aspect of computer hardware design and software development. These binary digits are called bits and can be grouped into units of 4 (nibble), 8 (byte), 16 (word), 32 (dword), or even 64 (qword) bits. Generally speaking, a word is the natural unit of data used by a particular computer design: a fixed-size group of bits that the machine handles together. The number of bits in a word (the word size or word length) is an important characteristic of a computer architecture, and modern computers usually have a word size of 16, 32, or 64 bits. Because each nibble corresponds to exactly one hexadecimal digit, hexadecimal (and, historically, octal) notation is widely used as a compact shorthand for binary values. It would therefore be useful to develop some routines in C# to convert numerical values back and forth between these different numerical bases.
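As a point of reference, and not the routines developed in this article, the following minimal sketch shows the kind of round trip the article is concerned with, using the .NET Convert class's built-in support for bases 2, 8, 10, and 16.

```csharp
using System;

// Minimal sketch: round-trip an integer between decimal, binary, octal, and
// hexadecimal text using the standard Convert class.
class BaseConversionDemo
{
    static void Main()
    {
        int value = 202;

        // Convert.ToString(value, toBase) accepts toBase = 2, 8, 10, or 16.
        string binary = Convert.ToString(value, 2);   // "11001010"
        string octal  = Convert.ToString(value, 8);   // "312"
        string hex    = Convert.ToString(value, 16);  // "ca"

        // Convert.ToInt32(text, fromBase) parses the text back into an integer.
        int fromBinary = Convert.ToInt32(binary, 2);
        int fromHex    = Convert.ToInt32(hex, 16);

        Console.WriteLine($"{value} -> bin {binary}, oct {octal}, hex {hex}");
        Console.WriteLine($"round-trip: {fromBinary}, {fromHex}");
    }
}
```

Custom routines become useful where these built-ins stop, for example for arbitrary bases, fractional values, or fixed-width output, which is the territory the rest of the article explores.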