ABSTRACT

Several bounds can be placed on the size of an error-correcting code. Some of the best-known bounds are discussed in the following sections.

6.1 Definitions

Let n be fixed, let M denote the size of a code C (i.e., the number of codewords in C), and let d denote the minimum distance between the codewords. For example, (i) for the code C = {00, 01, 10, 11}: n = 2, M = 4, d = 1; (ii) for the code C = {000, 011, 101, 110}: n = 3, M = 4, d = 2; and (iii) for the code C = {00000, 00111, 11111}: n = 5, M = 3, d = 2. Then for a q-ary (n, M, d) code, the transmission rate (also known as the information rate) is defined by

R(C) = (log_q M) / n,

and the relative minimum distance by

δ(C) = (d − 1)/n.

Note that this distance is also sometimes defined as d/n, but neater formulas are obtained by defining it as (d − 1)/n. For the binary repetition code C = {00…0, 11…1}, in which each of the two codewords has length n, and which is therefore an (n, 2, n) code, we find that

R(C) = (log_2 2)/n = 1/n → 0 and δ(C) = (n − 1)/n → 1 as n → ∞.
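The quantities defined above are easy to compute directly. The following sketch (the helper names min_distance, rate, and rel_min_distance are our own, not from the text) checks example (ii) and illustrates the limiting behavior of the binary repetition code:

```python
from itertools import combinations
from math import log

def min_distance(code):
    """Minimum Hamming distance d between distinct codewords."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(code, 2))

def rate(code, q=2):
    """Transmission rate R(C) = (log_q M) / n."""
    n, M = len(next(iter(code))), len(code)
    return log(M, q) / n

def rel_min_distance(code):
    """Relative minimum distance δ(C) = (d - 1) / n."""
    n = len(next(iter(code)))
    return (min_distance(code) - 1) / n

# Example (ii) from the text: n = 3, M = 4, d = 2
C = {"000", "011", "101", "110"}
print(min_distance(C))       # 2
print(rate(C))               # log_2(4)/3 ≈ 0.667
print(rel_min_distance(C))   # (2 - 1)/3 ≈ 0.333

# Binary repetition code of length n: R(C) = 1/n → 0, δ(C) = (n-1)/n → 1
for n in (5, 50, 500):
    rep = {"0" * n, "1" * n}
    print(n, rate(rep), rel_min_distance(rep))
```

Growing n in the final loop makes the trade-off concrete: the rate shrinks toward 0 while the relative minimum distance approaches 1.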