Consider the situation of a person recently infected with human immunodeficiency virus (HIV). On average, 7 to 10 years will elapse before the infection progresses to AIDS. Thus, if we could treat infected individuals, as soon as they seroconvert, with a drug that gives 90% inhibition of viral replication, then the disease, while still incurable, should be staved off for at least 70 years. A number of drugs are available that reach and maintain an average IC90 concentration in the blood. Since, unfortunately, these drugs do not sustain 70-year responses, we must ask where the fallacy lies in our original argument.

In fact, control of AIDS (and of many other hard-to-treat infections and tumors) is limited by two factors: cumulative drug toxicity and acquired drug resistance. The first of these factors means that there is a limit to how much drug we can administer, and for how long; the second means that the disease eventually ceases to respond in any case.

Drug companies in the business of developing antibiotics are familiar with the resistance problem, and one of their rules of thumb (heuristics) says that you have, on average, 8 to 10 years to recoup development costs before drug resistance markedly limits the usefulness of your new antibiotic (at least against serious hospital-transmitted infections). When we now consider that the mutation rate of HIV is about 100-fold greater than that of Staphylococcus aureus, this suggests that an AIDS drug will have a useful life of about 5 weeks (assuming the relationship is a linear one). No company is likely to spend $100 million developing a drug with such a short product cycle, and some companies have already allowed this argument to keep them out of the AIDS drug development arena. The companies that have remained in the area presumably do not buy this argument, but worry about it nevertheless. This argument is revisited (with nonlinear mathematics) below.
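The two back-of-envelope calculations in this argument can be made explicit. The sketch below simply restates the arithmetic using the figures quoted in the text (7 to 10 years to AIDS, 90% inhibition, an 8- to 10-year antibiotic resistance window, and a 100-fold higher mutation rate); the variable names are illustrative, not from any source.

```python
# Back-of-envelope arithmetic behind the two claims in the text.
# All figures are the ones quoted in the passage itself.

YEARS_TO_AIDS = (7, 10)     # untreated progression time, years
INHIBITION = 0.90           # fraction of viral replication blocked

# 90% inhibition leaves 10% of replication intact, i.e. a 10x
# slowdown, so progression time stretches by 1 / (1 - INHIBITION).
slowdown = 1 / (1 - INHIBITION)
treated_years = tuple(y * slowdown for y in YEARS_TO_AIDS)
print(treated_years)        # roughly 70 to 100 years

# Antibiotic rule of thumb: 8-10 years before resistance limits use.
# HIV mutates ~100-fold faster, so (assuming the relationship is
# linear) divide the window by 100 and convert to weeks.
ANTIBIOTIC_WINDOW_YEARS = (8, 10)
MUTATION_RATE_RATIO = 100
hiv_window_weeks = tuple(
    y / MUTATION_RATE_RATIO * 365.25 / 7 for y in ANTIBIOTIC_WINDOW_YEARS
)
print(hiv_window_weeks)     # roughly 4 to 5 weeks
```

The linearity assumption in the last step is the weak point the text flags, which is why the argument is revisited with nonlinear mathematics below.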