Pharmaceutical Century Part Two


Chemistry, cancer & ecology
Introduction
As the 1970s opened, new chemistries and the war on cancer seized center stage. U.S. President Richard Nixon (taking a moment off from his pursuit of the Vietnam War) established the National Cancer Program, popularly known as the war on cancer, with an initial half-billion dollars of new funding. Carcinogens were one of the concerns in the controversy surrounding the polluted Love Canal. And cancer was especially prominent in the emotional issue of the “DES daughters”—women at risk for cancer solely because of diethylstilbestrol (DES), the medication prescribed to their mothers during pregnancy. New cancer treatments were developed; chemotherapy joined the ranks of routine treatments, especially for breast cancer.

New drugs appeared. Cyclosporin provided a long-sought breakthrough with its ability to prevent immune rejection of tissue grafts and organ transplants. Rifampicin proved its worth for treating tuberculosis; cimetidine (Tagamet), the first histamine H2-receptor blocker, became available for treating peptic ulcers. Throughout the decade, improvements in analytical instrumentation, including high-pressure liquid chromatography (HPLC) and mass spectrometry, made drug purification and analysis easier than ever before. In this period, NMR was transformed into the medical imaging system MRI.

The popular environmental movement that took root in the ideology of the previous decade blossomed politically in 1970 as the first Earth Day was celebrated, the U.S. Clean Air Act was passed, and the U.S. Environmental Protection Agency (EPA) was established. Some of the optimism of the Sixties faded as emerging plagues, such as Lyme disease and Legionnaires’ disease in the United States and Ebola and Lassa fever in Africa, reopened the book on infectious diseases. The World Health Organization continued its smallpox eradication campaign, but as DDT was gradually withdrawn because of its detrimental effect on the environment, efforts to eradicate malaria and sleeping sickness were imperiled.

Ultimately, the 1970s saw the start of another kind of infection—genetic engineering fever—as recombinant DNA chemistry dawned. In 1976, in a move to capitalize on the new discoveries, Genentech Inc. (San Francisco) was founded and became the prototypical entrepreneurial biotech company. The company’s very existence forever transformed the nature of technology investments and the pharmaceutical industry.

Cancer wars
The first major salvo against cancer in the United States was fired in 1937, when the National Cancer Institute (NCI) was created by congressional mandate. A sign of the times, one of the key provisions of the act was to enable the institute to “procure, use, and lend radium.” Such an early interest in cancer was by no means unique to the United States. Throughout the period, Nazi Germany led the world in cancer research, including early demonstrations of the carcinogenic effects of smoking tobacco. Cancer remained of interest to researchers throughout the world in the decades that followed. In the 1950s, a major move was made to develop chemotherapies for various cancers. By 1965, the NCI had instituted a program specifically for drug development with participation from the NIH, industry, and universities. The program screened 15,000 new chemicals and natural products each year for potential effectiveness.

Still, by the 1970s, there was a harsh contrast between medical successes against infectious diseases and the lack of comparable progress against cancer. The 1971 report of the National Panel of Consultants on the Conquest of Cancer (called the Yarborough Commission) formed the basis of the 1971 National Cancer Act signed by President Nixon. The aim of the act was to make “the conquest of cancer a national crusade” with an initial financial boost of $500 million (allocated under the direction of the long-standing NCI). The Biomedical Research and Research Training Amendment of 1978 added basic research and prevention to the mandate of the continuing program.

Daughters and sons
The story of the synthetic estrogen DES framed cancer as a complex societal problem and not “just” an issue of a particular new source of cancer. The plight of the DES daughters not only crossed generations but also involved interactions among patients, the drug industry, uninformed or wrongly informed physicians, and the political interests of Congress and the FDA. DES was prescribed from the early 1940s until 1971 to help prevent certain complications of pregnancy, especially those that led to miscarriages. By the 1960s, DES use was decreasing because of evidence that the drug lacked effectiveness and might indeed have damaging side effects, although no ban or general warning to physicians was issued. According to the University of Pennsylvania Cancer Center (Philadelphia), there are few reliable estimates of the number of women who took DES, although one source estimates that 5–10 million women either took the drug during pregnancy or were exposed to it in utero.

In 1970, a study in the journal Cancer described a rare form of vaginal cancer, clear cell adenocarcinoma (CCAC). The following year, a study in The New England Journal of Medicine documented the association between in utero DES exposure and the development of CCAC. By the end of that year, the FDA issued a drug bulletin warning of potential problems with DES and advised against its use during pregnancy. So-called DES daughters experienced a wide variety of effects including infertility, reproductive tract abnormalities, and increased risks of vaginal cancer. More recently, a number of DES sons were also found to have increased levels of reproductive tract abnormalities. In 1977, inspired by the tragedies caused by giving thalidomide and DES to pregnant women, the FDA recommended against including women of child-bearing potential in the early phases of drug testing except for life-threatening illnesses. The discovery of DES in beef from hormone-treated cattle after the FDA drug warning led in 1979 to what many complained was a long-delayed ban on its use by farmers. The DES issue was one of several that helped focus part of the war against cancer as a fight against environmental carcinogens (see below).

Cancer research/cancer “cures”

New evidence at the beginning of the decade seemed to promote an infectious model of cancer development. In 1970, Howard Martin Temin (at the University of Wisconsin–Madison) and David Baltimore (at the Massachusetts Institute of Technology; MIT) independently discovered viral reverse transcriptase, showing that some RNA viruses (the retroviruses), in their own version of genetic engineering, were capable of creating DNA copies of themselves. The viral DNA was able to integrate into the genome of the infected host cell, which then transformed into a cancer cell. Reverse transcriptase eventually became critical to the study of the AIDS virus in the following decades. Temin and Baltimore shared the 1975 Nobel Prize in Physiology or Medicine with Renato Dulbecco of the Salk Institute.

Many claims of cancer virus discoveries based on animal studies came and went early in the decade, but they proved inapplicable to humans. Hopes of treating the class of diseases known as cancers with traditional vaccination techniques declined. Other research developments helped expand knowledge of the mechanics and causes of cancer. In 1978, for example, the tumor suppressor gene p53 was first observed by David Lane at the University of Dundee. By 1979, it was possible to use DNA from malignant cells to transform cultured mouse cells into tumors—creating an entirely new tool for cancer study.

Although many treatments for cancer existed at the beginning of the 1970s, few real cures were available. Surgical intervention was the treatment of choice for apparently defined tumors in readily accessible locations. In other cases, surgery was combined with or replaced by chemotherapy and/or radiation therapy. Oncology remained, however, as much an art as a science in terms of actual cures. Too much depended on too many variables for treatments to be uniformly applicable or uniformly beneficial. There were, however, some obvious successes in the 1970s. Donald Pinkel of St. Jude’s Hospital (Memphis) developed the first cure for acute lymphoblastic leukemia, a childhood cancer, by combining chemotherapy with radiotherapy.

BIOGRAPHY: Paul Berg and Asilomar
In 1971, Janet Mertz, a colleague of Paul Berg at Stanford University, proposed inserting DNA from SV40 (a monkey tumor virus) into Escherichia coli. Robert Pollack of Cold Spring Harbor Laboratory phoned Berg about his concerns over the safety of the experiment, and it was postponed. In 1972, Berg used the new restriction enzymes and spliced together fragments of DNA from SV40 and E. coli in the first successful recombinant DNA experiment. By June 1973, the safety issues of experiments involving animal viruses were discussed at a Gordon Research Conference. In a unique moment of scientific self-questioning, conference co-chairs Maxine Singer of the National Institutes of Health (NIH) and Dieter Söll of Yale University sent a letter to the National Academy of Sciences requesting the appointment of a study committee on rDNA issues. In 1974, as chair of the NAS committee, Paul Berg wrote letters to Science and Nature asking for a temporary worldwide moratorium on certain types of research and calling for an international conference to discuss the potential problems. In 1975, the conference was held at Asilomar, CA, where it was recommended that most research should go on with appropriate physical and biological containment. The Asilomar conference has been held up by some as a model of scientific self-policing and by others as a self-serving attempt to avoid excessive regulatory intervention by striking first. Asilomar was one of the most profound influences on the development of the NIH’s 1976 guidelines on rDNA. These guidelines, although continuing to evolve—generally toward less and less restrictiveness—have remained the touchstone for the field.

The advent of allogeneic (foreign donor) bone marrow transplants in 1968 made such treatments possible, but the real breakthrough in using powerful radiation and chemotherapy came with the development of autologous marrow transplantation. The method was first used in 1977 to cure patients with lymphoma. Autologous transplantation involves removing and usually cryopreserving a patient’s own marrow and reinfusing that marrow after the administration of high-dose drug or radiation therapy. Because autologous marrow can contain contaminating tumor cells, a variety of methods have been established to attempt to remove or deactivate them, including antibodies, toxins, and even in vitro chemotherapy. E. Donnall Thomas of the Fred Hutchinson Cancer Research Center (Seattle) was instrumental in developing bone marrow transplants and received the 1990 Nobel Prize in Physiology or Medicine for his work. Although bone marrow transplants were originally used primarily to treat leukemias, by the end of the century they were used successfully as part of high-dose chemotherapy regimens for Hodgkin’s disease, multiple myeloma, neuroblastoma, testicular cancer, and some breast cancers.

In 1975, a WHO survey showed that death rates from breast cancer had not declined since 1900. Radical mastectomy was ineffective in many cases because of late diagnosis and the prevalence of undetected metastases. The search for alternative and supplemental treatments became a high research priority. In 1975, a large cooperative American study demonstrated the benefits of using phenylalanine mustard following surgical removal of the cancerous breast. Combination therapies rapidly proved even more effective, and by 1976, CMF (cyclophosphamide, methotrexate, and 5-fluorouracil) therapy had been developed at the Istituto Nazionale Tumori in Milan, Italy. It proved to be a radical improvement over surgery alone and rapidly became the chemotherapy of choice for this disease.


A new environment

Launched in part by Rachel Carson’s book Silent Spring in the previous decade, the environmental movement in the West became ever more prominent. The first Earth Day was held on April 22, 1970, to raise environmental awareness. The EPA was launched on December 2, and Nixon closed out the year by signing the Clean Air Act on December 31. The concept of carcinogens entered the popular consciousness.

Ultimately, the combination of government regulations and public fears of toxic pollutants in food, air, and water inspired improved technologies for monitoring extremely small amounts of chemical contaminants. Advances were made in gas chromatography, ion chromatography, and especially the EPA-approved combination of GC/MS. The 1972 Clean Water Act and the Federal Insecticide, Fungicide, and Rodenticide Act added impetus to the need for instrumentation and analysis standards. By the 1980s, many of these improvements in analytical instrumentation had profound effects on the scientific capabilities of the pharmaceutical industry. One example is atomic absorption spectroscopy, which in the 1970s made it possible to assay trace metals in foods to the parts-per-billion range. The new power of such technologies enabled nutritional researchers to determine, for the first time, that several trace elements (most usually considered pollutants) were actually necessary to human health. These included tin (1970), vanadium (1971), and nickel (1973).

In 1974, the issue of chemical-induced cancer became even broader when F. Sherwood Rowland of the University of California–Irvine and Mario Molina of MIT demonstrated that chlorofluorocarbons (CFCs) such as Freon could erode the UV-absorbing ozone layer. The predicted results were increased skin cancer and cataracts, along with a host of adverse effects on the environment. This research led to a ban of CFCs in aerosol spray cans in the United States. Rowland and Molina shared the 1995 Nobel Prize in Chemistry for their ozone work with Paul Crutzen of the Max Planck Institute for Chemistry (Mainz, Germany).

By 1977, asbestos toxicity had become a critical issue. Researchers at the Mount Sinai School of Medicine (New York) discovered that asbestos inhalation could cause cancer after a latency period of 20 years or more. This discovery helped lead to the passage of the Toxic Substances Control Act, which mandated that the EPA inventory the safety of all chemicals marketed in the United States before July 1977 and required manufacturers to provide safety data 90 days before marketing any chemicals produced after that date. Animal testing increased where questions existed, and the issue of chemical carcinogenicity became prominent in the public mind and in the commercial sector.

Also in the 1970s, DDT was gradually withdrawn from vector eradication programs around the world because of the growing environmental movement, which resulted in an outright ban on the product in the United States in 1972. This created a continuing controversy, especially with regard to WHO attempts to eliminate malaria and sleeping sickness in the developing world. In many areas, however, the emergence of DDT-resistant insects already pointed to the eventual futility of such efforts. Although DDT was never banned completely except by industrialized nations, its use declined dramatically for these reasons.

SOCIETY: Good practices
In the 1970s, numerous problems in toxicity testing of food and drugs surfaced as an issue of public safety. From 1975 to 1977, the U.S. Senate subcommittee on health investigated Aldactone, a Searle product. Rats given the drug in long-term studies had not been examined microscopically after death, contrary to FDA requirements, despite the fact that some had lesions visible to the naked eye. In addition, the FDA contended that poor records had been maintained during tests of the sweetener aspartame and the antibacterial drug Flagyl. The FDA contacted the Justice Department over what it saw as a pattern of bad laboratory practices, but the case for prosecution was considered insufficient. Other cases contributed to the need for reform. From the 1950s through the 1970s, Industrial Bio-Test Laboratories (IBT) performed about 35–40% of U.S. toxicology testing. Of the 867 audits of IBT performed by the FDA, 618 found discrepancies between the conduct of the study and the data presented. Four IBT managers were found guilty of fraud. As a result of such incidents, in 1976 the FDA Good Laboratory Practice guidelines were proposed to provide for inspections and data audits of nonclinical labs (toxicology labs) conducting animal safety testing of FDA-regulated commodities. They were finalized in 1978 and became effective in 1979. Good Manufacturing Practices (GMPs) were also authorized in 1976 to ensure quality standards in the commercial production of FDA-regulated products.

Recombinant DNA and more
In 1970, two years before the birth of recombinant DNA (rDNA) technology, cytogeneticist Robert John Cecil Harris coined the term “genetic engineering.”

But more importantly, in 1970, Werner Arber of the Biozentrum der Universität Basel (Switzerland) discovered restriction enzymes. Hamilton O. Smith at Johns Hopkins University (Baltimore) verified Arber’s hypothesis with a purified bacterial restriction enzyme and showed that this enzyme cuts DNA in the middle of a specific symmetrical sequence. Daniel Nathans, also at Johns Hopkins, demonstrated the use of restriction enzymes in the construction of genetic maps. He also developed and applied new methods of using restriction enzymes to solve various problems in genetics. The three scientists shared the 1978 Nobel Prize in Physiology or Medicine for their work in producing the first genetic map (of the SV40 virus).

In 1972, rDNA was born when Paul Berg of Stanford University demonstrated the ability to splice together blunt-end fragments of DNA from widely disparate sources. That same year, Stanley Cohen from Stanford and Herbert Boyer from the University of California, San Francisco, met at a Waikiki Beach delicatessen, where they discussed ways to combine plasmid isolation with DNA splicing. They had the idea to combine the use of the restriction enzyme EcoRI (which Boyer had discovered in 1970 and found capable of creating “sticky ends”) with DNA ligase (discovered in the late 1960s) to form engineered plasmids capable of producing foreign proteins in bacteria—the basis for the modern biotechnology industry. By 1973, Cohen and Boyer had produced their first recombinant plasmids. They received a patent on this technology for Stanford and UCSF that would become one of the biggest money-makers in pharmaceutical history.

The year 1975 was the year of DNA sequencing. Walter Gilbert and Allan Maxam of Harvard University and Fred Sanger of Cambridge University simultaneously developed different methods for determining the sequence of bases in DNA with relative ease and efficiency. For this accomplishment, Gilbert and Sanger shared the 1980 Nobel Prize in Chemistry.

In 1976, Silicon Valley venture capitalist Robert Swanson teamed up with Herbert Boyer to form Genentech Inc. (short for genetic engineering technology). It was the harbinger of a wild proliferation of biotechnology companies over the next decades. Genentech’s goal of cloning human insulin in Escherichia coli was achieved in 1978, and the technology was licensed to Eli Lilly. In 1977, the first mammalian gene (the rat insulin gene) was cloned into a bacterial plasmid by Axel Ullrich of the Max Planck Institute. In 1978, somatostatin was produced using rDNA techniques. The recombinant DNA era grew from these beginnings and had a major impact on pharmaceutical production and research in the 1980s and 1990s.
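As a rough illustration of the chemistry described above, the following minimal Python sketch treats DNA as a plain string: it checks that the EcoRI recognition site is its own reverse complement (the “specific symmetrical sequence” that Smith described) and cuts a toy sequence off-center at each site, mimicking the cohesive “sticky ends” that DNA ligase can later reseal. The example sequence and the single-strand bookkeeping are simplified assumptions, not a laboratory protocol.

```python
# Illustrative sketch only: a single-strand, string-based model of how a
# restriction enzyme such as EcoRI recognizes a palindromic site and cuts
# off-center, leaving cohesive ends. Toy values, not real lab bookkeeping.

ECORI_SITE = "GAATTC"   # EcoRI recognition sequence
CUT_OFFSET = 1          # EcoRI cuts between G and A: G^AATTC

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(seq))

def ecori_digest(seq: str):
    """Cut a linear DNA string at every EcoRI site and return the fragments."""
    fragments, start = [], 0
    pos = seq.find(ECORI_SITE)
    while pos != -1:
        fragments.append(seq[start:pos + CUT_OFFSET])  # fragment ending ...G
        start = pos + CUT_OFFSET                        # next fragment begins AATTC...
        pos = seq.find(ECORI_SITE, pos + 1)
    fragments.append(seq[start:])
    return fragments

if __name__ == "__main__":
    toy_plasmid = "ATTCGGAATTCTTAAGCCGAATTCGGA"   # hypothetical sequence
    # The site reads the same on both strands, i.e., it is "symmetrical".
    print("Palindromic site:", reverse_complement(ECORI_SITE) == ECORI_SITE)
    print("Fragments after digestion:", ecori_digest(toy_plasmid))
```

Because the cut is off-center, each fragment ends in a short single-stranded overhang that can base-pair with any other EcoRI-cut fragment, which is why DNA from “widely disparate sources” could be joined into one plasmid.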


High tech/new mech

In June 1970, Raymond V. Damadian at the State University of New York discovered that cancerous tissue in rats exhibited dramatically prolonged NMR relaxation times. He also found that the relaxation times of normal tissues vary significantly, although less dramatically than those of cancerous tissue. Damadian’s March 1971 Science article, “Tumor Detection by Nuclear Magnetic Resonance,” became the basis for the pioneer patent on magnetic resonance imaging (MRI), issued to him in 1974, which included a description of a three-dimensional in vivo method for obtaining diagnostic NMR signals from humans. In 1977, Damadian and colleagues achieved the first NMR image of a human in a whole-body MRI scanner. In 1988, Damadian was awarded the National Medal of Technology. MRI became one of the most sensitive and useful tools for disease diagnosis, and the basis of MR spectroscopy, itself one of the most sophisticated in vivo physiological research tools available by the end of the century.

In 1971, the first computerized axial tomography (CAT) scanner was installed in England. By 1972, the first whole-body computed tomography (CT) scanner was marketed by Pfizer. That same year, the Brookhaven Linac Isotope Producer went on line, helping to increase the availability of isotopes for medical purposes and basic research. In 1977, the first use of positron emission tomography (PET) for obtaining brain images was demonstrated.

The 1970s saw a revolution in computing with regard to speed, size, and availability. In 1970, Ted Hoff at Intel invented the first microprocessor. In 1975, the first personal computer, the Altair, was put on the market by American inventor Ed Roberts. Also in 1975, William Henry Gates III and Paul Gardner Allen founded Microsoft. And in 1976, the prototype for the first Apple computer (marketed as the Apple II in 1977) was developed by Stephen Wozniak and Steven Jobs. It signaled the movement of personal computing from the hobbyist to the general public and, more importantly, into pharmaceutical laboratories, where scientists used PCs to augment their research instruments.

In 1975, Edwin Mellor Southern of the University of Oxford invented a blotting technique for analyzing restriction enzyme digest fragments of DNA separated by electrophoresis. This technique became one of the most powerful technologies for DNA analysis and manipulation and was the conceptual template for the development of northern blotting (for RNA analysis) and western blotting (for proteins).

Although HPLC systems had been commercially available from ISCO since 1963, they were not widely used until the 1970s, when—under license to Waters Associates and Varian Associates—the demands of biotechnology and clinical practice made such systems seem a vibrant new technology. By 1979, Hewlett-Packard was offering the first microprocessor-controlled HPLC, a technology that represented the move to computerized systems throughout the life sciences and instrumentation in general. Throughout the decade, GC and MS became routine parts of life science research, and the first linkage of LC/MS was offered by Finnigan. These instruments would have increasing impact throughout the rest of the century.

TECHNOLOGY: Monoclonal antibodies
What would eventually prove to be one of the most powerful tools in applied immunology, basic research, and therapeutics was developed in 1975 by César Milstein of the MRC Laboratory of Molecular Biology (Cambridge, U.K.) and by Georges Köhler and Niels Jerne of the Basel Institute for Immunology (Basel, Switzerland). Monoclonal antibodies were produced by fusing antibody-producing B lymphocyte cells from mice with tumor cells to produce “hybridomas”—cell lines that combined the ability to synthesize a unique antibody from the lymphocytes (hence the name monoclonal) with the immortal growth potential of the tumor. For the first time, a permanent, self-replicating, in vitro source of individualized antibodies was available for diagnostics and therapeutics. One of the earliest triumphs of this technology (in combination with fluorescent labeling) was the demonstration of the existence of the cytoskeleton (the structural framework of cellular cytoplasm). The three researchers were awarded the 1984 Nobel Prize in Physiology or Medicine for their achievement.

(Re)emerging diseases
In 1969, U.S. Surgeon General William Stewart, testifying before Congress, claimed that “the time has come to close the book on infectious diseases.” He believed that it was especially time to reinvest money to treat killers such as heart disease and cancer, since, in his opinion, it was only a matter of time before the war against infection would be won by a combination of antibiotics and vaccines. The traditional diseases were indeed on the run before the onslaught of modern medicines. What he could not foresee was the emergence of new diseases and the reemergence of old plagues in the guise of drug-resistant strains. The 1970s helped throw cold water on this kind of optimism, perhaps signaled by the shift in emphasis implied by the 1970 renaming of the Communicable Disease Center as the Center for Disease Control.

There were certainly enough new diseases and “old friends” to control. For example, in 1972, the first cases of recurrent polyarthritis (Lyme disease) were recorded in Old Lyme and Lyme, CT, ultimately resulting in the spread of the tick-borne disease throughout the hemisphere. The rodent-borne arenaviruses were identified in the 1960s and shown to be the causes of numerous diseases seen since 1934 in both developed and less-developed countries. Particularly deadly was the newly discovered Lassa fever virus, first identified in Africa in 1969 and responsible for small outbreaks in 1970 and 1972 in Nigeria, Liberia, and Sierra Leone, with a mortality rate of some 36–38%.

Then, in 1976, epidemics of a different hemorrhagic fever occurred simultaneously in Zaire and Sudan. Fatality rates reached 88% in Zaire (now known as the Democratic Republic of the Congo) and 53% in Sudan, resulting in a total of 430 deaths. Ebola virus, named after a small river in northwest Zaire, was isolated from both epidemics. In 1977, a fatality was attributed to Ebola in a different area of Zaire. The investigation of this death led to the discovery that there were probably two previous fatal cases. A missionary physician had contracted the disease in 1972 while conducting an autopsy on a patient thought to have died of yellow fever. In 1979, the original outbreak site in Sudan generated a new episode of Ebola hemorrhagic fever that resulted in 34 cases with 22 fatalities. Investigators were unable to discover the source of the initial infections. The dreaded nature of the disease—the copious bleeding, the pain, and the lack of a cure—sent ripples of concern throughout the world’s medical community.

In 1976, the then-unknown “Legionnaires’ disease” appeared at an American Legion convention in Philadelphia, killing 29 attendees. The cause was identified as the newly discovered bacterium Legionella. Also in 1976, a Nobel Prize in Physiology or Medicine was awarded to Baruch Blumberg (originally at the NIH, then at the University of Pennsylvania) for his 1963 discovery of a new disease agent—hepatitis B—for which he helped to develop a blood test in 1971.

Nonetheless, one bit of excellent news at the end of the decade made such “minor” outbreaks of new diseases seem trivial in the scope of human history. From 1350 B.C., when the first recorded smallpox epidemic occurred during the Egyptian–Hittite war, to A.D. 180, when a large-scale epidemic killed between 3.5 and 7 million people (coinciding with the first stages of the decline of the Roman Empire), through the millions of Native Americans killed in the 16th century, smallpox was a quintessential scourge. But in 1979, a WHO global commission was able to certify the worldwide eradication of smallpox, achieved by a combination of quarantines and vaccination. The last known natural case of the disease occurred in 1977 in Somalia. Government stocks of the virus remain a biological warfare threat, but the achievement may still, perhaps, be considered the singular event of the Pharmaceutical Century—the disease-control equivalent of landing a man on the moon. By 1982, vaccine production had ceased. By the 1990s, a controversy arose between those who wanted to maintain stocks in laboratories for medical and genetic research purposes (and possibly as protection against clandestine biowarfare) and those who hoped to destroy the virus forever.
On a lesser but still important note, the first leprosy vaccine using the nine-banded armadillo as a source was developed in 1979 by British physician Richard Rees at the National Institute for Medical Research (Mill Hill, London).

Toward a healthier world
The elimination of smallpox was just part of a large-scale move in the 1970s to deal with the issue of global health. In 1974, the WHO launched an ambitious Expanded Program on Immunization to protect children from poliomyelitis, measles, diphtheria, whooping cough, tetanus, and tuberculosis. Throughout the 1970s and the rest of the century, the role of DDT in vector control continued to be a controversial issue, especially for the eradication or control of malaria and sleeping sickness. The WHO would prove to be an inconsistent ally of environmental groups that urged a ban of the pesticide. The organization’s recommendations deemphasized the use of the compound at the same time that its reports emphasized its profound effectiveness. Of direct interest to the world pharmaceutical industry, in 1977 the WHO published the first Model List of Essential Drugs—“208 individual drugs which could together provide safe, effective treatment for the majority of communicable and noncommunicable diseases.” This formed the basis of a global movement toward improved health provision as individual nations adapted and adopted this list of drugs as part of a program for obtaining these universal pharmacological desiderata.


That ’70s showdown

The decade lurched from the OPEC oil embargoes to Watergate, through stagflation, world famine, and hostage crises to a final realization, in 1979, that it all might have started with a comet or asteroid that crashed into the earth 65 million years before, killing off the dinosaurs. And that perhaps it might end the same way.

Medical marvels and new technologies promised excitement at the same time that they revealed more problems to be solved. A new wave of computing arose in Silicon Valley. Oak Ridge National Laboratory detected a single atom for the first time—one atom of cesium in the presence of 10^19 argon atoms and 10^18 methane molecules—using lasers. And biotechnology was born as both a research program and a big business. The environmental movement contributed not only a new awareness of the dangers of carcinogens, but a demand for more and better analytical instruments capable of extending the range of chemical monitoring. These would make their way into the biomedical field with the demands of the new biotechnologies.

The pharmaceutical industry would enter the 1980s with one eye on its pocketbook, to be sure, feeling harried by economics and a host of new regulations, both safety and environmental. But the other eye looked to a world of unimagined possibilities transformed by the new DNA chemistries and by new technologies for analysis and computation.

Suggested reading
• Dates in Medicine: A Chronological Record of Medical Progress over Three Millennia, Sebastian, A., Ed. (Parthenon Publishing Group: New York, 2000)
• The DNA Story: A Documentary History of Gene Cloning, Watson, J. D.; Tooze, J. (W. H. Freeman and Co.: San Francisco, 1981)
• Readings in American Health Care: Current Issues in Socio-Historical Perspective, Rothstein, W. G., Ed. (University of Wisconsin Press: Madison, 1995)
• The Social Transformation of American Medicine, Starr, P. (Basic Books Inc.: New York, 1982)
• Two Centuries of American Medicine: 1776–1976, Bordley, J. III; Harvey, A. M. (W. B. Saunders Co.: Philadelphia, 1976)

Arteries, AIDS, and Engineering
Introduction
Whether the changes to the pharmaceutical industry and the world in the 1980s will prove most notable for the rise of and reaction to a new disease, AIDS, or for the flowering of entrepreneurial biotechnology and genetic engineering, it is too soon to say. These changes—along with advances in immunology, automation, and computers, the development of new paradigms of cardiovascular and other diseases, and restructured social mores—all competed for attention in the transformational 1980s.

AIDS: A new plague
It struck the big cities first, and within those cities, at first, it affected only certain segments of the population, primarily homosexual men. The first published reports of the new disease seemed like no more than medical curiosities. On June 5, 1981, the Atlanta-based Centers for Disease Control (CDC), a federal agency charged with keeping tabs on disease, published an unusual notice in its Morbidity and Mortality Weekly Report: the occurrence of Pneumocystis carinii pneumonia (PCP) among gay men. In New York, a dermatologist encountered cases of a rare cancer, Kaposi’s sarcoma (KS), a disease so obscure he recognized it only from descriptions in antiquated textbooks. By the end of 1981, PCP and KS were recognized as harbingers of a new and deadly disease.

The disease was initially called Gay-Related Immune Deficiency. Within a year, similar symptoms appeared in other demographic groups, primarily hemophiliacs and users of intravenous drugs. The CDC renamed the disease Acquired Immune Deficiency Syndrome (AIDS). By the end of 1983, the CDC had recorded some 3000 cases of this new plague. The prospects for AIDS patients were not good: almost half had already died.

AIDS did not follow normal patterns of disease and infection. It produced no visible symptoms—at least not until the advanced stages of infection. Instead of triggering an immune response, it insidiously destroyed the body’s own mechanisms for fighting off infection. People stricken with the syndrome died of a host of opportunistic infections such as rare viruses, fungal infections, and cancers. When the disease began to appear among heterosexuals, panic and fear increased. Physicians and scientists eventually mitigated some of the hysteria when they were able to explain the methods of transmission. As AIDS was studied, it became clear that the disease was spread through intimate contact such as sex and the sharing of syringes, as well as through transfusions and other exposure to contaminated blood. It could not be spread through casual contact such as shaking hands, coughing, or sneezing.

In 1984, Robert Gallo of the National Cancer Institute (NCI) and Luc Montagnier of the Institut Pasteur proved that AIDS was caused by a virus. (There is still controversy over priority of discovery.) However, knowledge of the disease’s cause did not mean readiness to combat the plague. Homophobia and racism, combined with nationalism and fears of creating panic in the blood supply market, contributed to deadly delays before any government took action. The initially small number of sufferers skyrocketed around the world and created an uncontrollable epidemic.

Immunology comes of age
The 1980s were a decade of worldwide interest in immunology, an interest unmatched since the development of the vaccine era at the beginning of the century. In 1980, the Nobel Prize in Physiology or Medicine went to three scientists for their work in elucidating the genetic basis of the immune system. Baruj Benacerraf at Harvard University (Cambridge, MA), George Snell of the Jackson Laboratory (Bar Harbor, ME), and Jean Dausset of the University of Paris explored the genetic basis of the immune response. Their work demonstrated that histocompatibility antigens (called H-factors or H-antigens) determined the interaction of the myriad cells responsible for an immunological response.

Early efforts to study immunology were aimed at understanding the structure and function of the immune system, but some scientists looked to immunology to try to understand diseases that lacked a clear outside agent. In some diseases, some part of the body appears to be attacked not by an infectious agent but by the immune system itself. Physicians and researchers wondered whether the immune system could cause, as well as defend against, disease. By the mid-1980s, it was clear that numerous diseases, including lupus and rheumatoid arthritis, were connected to an immune system malfunction. These were called “autoimmune diseases” because they were caused by a patient’s own immune system. Late in 1983, juvenile-onset diabetes was shown to be an autoimmune disease in which the body’s immune system attacks insulin-producing cells in the pancreas. Allergies were also linked to overreactions of the immune system.

By 1984, researchers had discovered an important piece of the puzzle of immune system functioning. Professor Susumu Tonegawa and his colleagues discovered how the immune system recognizes “self” versus “not-self”—a key to immune system function. Tonegawa elucidated the complete structure of the cellular T cell receptor and the genetics governing its production. It was already known that T cells were the keystone of the entire immune system. Not only do they recognize self and not-self and so determine what the immune system will attack, they also regulate the production of B cells, which produce antibodies. Immunologists regarded this as a major breakthrough, in large part because the human immunodeficiency virus (HIV) that causes AIDS was known to attack T cells. The T cell response is also implicated in other autoimmune diseases and in many cancers in which T cells fail to recognize not-self cells. The question remained, however, how the body could possibly contain enough genes to account for the bewildering number of immune responses.

In 1987, for the third time in the decade, the Nobel Prize in Physiology or Medicine went to work on the immune system. As Tonegawa had demonstrated in 1976, the immune system can produce an almost infinite number of responses, each of which is tailored to suit a specific invader. Tonegawa showed that rather than containing a vast array of genes for every possible pathogen, a few genetic elements reshuffle themselves. Thus a small amount of genetic information could account for many antibodies.

The immune system relies on the interaction of numerous kinds of cells circulating throughout the body. Unfortunately, AIDS was known to target those very cells. There are two principal types of cells, B cells and T cells. Helper T cells direct the production of antibodies by B cells, an immune response targeted to a single type of biological or chemical invader. There are also “suppressor” cells that keep the immune response in check. In healthy individuals, helpers outnumber suppressors by about two to one. In immunocompromised individuals, however, helper T cell counts are exceedingly low and, accordingly, the relative number of suppressors is extremely high. This imbalance appears capable of shutting down the body’s immune response, leaving it vulnerable to infections a healthy body wards off with ease. Eventually, scientists came to understand the precise mechanism of this process.

Even before 1983, when the viral cause of the disease was determined, the first diagnostic tests were developed to detect antibodies related to the disease. Initially, because people at risk for AIDS were statistically associated with hepatitis, scientists used the hepatitis core antibody test to identify people with hepatitis and, therefore, at risk for AIDS. By 1985, a diagnostic method was specifically designed to detect antibodies produced against HIV itself, which is present at low titer. The need to diagnose infected individuals and to protect the valuable world blood supply spurred the effort.


By the late 1980s, under the impetus and fear associated with AIDS, both immunology and virology received huge increases in research funding, especially from the U.S. government.

Pharmaceutical technology and AIDS

Knowing the cause of a disease and how to diagnose it is quite a different matter from knowing how to cure it. Ironically, although by no means the solution, the pharmaceutical technology initially used to combat HIV had been discovered some 20 years before AIDS appeared. Azidothymidine (AZT) was developed in 1964 as an anticancer drug by Jerome Horowitz of the Michigan Cancer Foundation (Detroit). But because AZT was ineffective against cancer, Horowitz never filed a patent. In 1987, the ultimate approval of AZT as an antiviral treatment for AIDS was the result of both the hard technology of the laboratory and the soft technologies of personal outrage and determination (and deft use of the 1983 Orphan Drug Act).

Initially, the discovery of the viral nature of AIDS resulted in little, if any, R&D in corporate circles. The number of infected people was considered too small to justify the cost of new drug development, and most scientists thought retroviruses were untreatable. However, Sam Broder, a physician and researcher at the NCI, thought differently. As clinical director of the NCI’s 1984 Special Task Force on AIDS, Broder was determined to do something. Needing a larger lab, Broder went to the pharmaceutical industry for support. As he canvassed the drug industry, he promised to test potentially useful compounds in NCI labs if the companies would commit to developing and marketing drugs that showed potential. One of the companies he approached was Burroughs Wellcome, the American subsidiary of the British firm Wellcome PLC. Burroughs Wellcome had worked on nucleoside analogues, a class of antivirals that Broder thought might work against HIV, and had successfully brought to market an antiherpes drug, acyclovir. Although many companies were reluctant to work on viral agents because of the health hazards to researchers, Broder persevered. Finally, Burroughs Wellcome and 50 other companies began to ship chemicals to the NCI labs for testing. Each sample was coded with a letter to protect its identity and each company’s rights to its compounds. In early 1985, Broder and his team found a compound that appeared to block the spread of HIV in vitro. Coded Sample S, it was AZT, sent by Burroughs Wellcome.

There is a long road between in vitro efficacy and shipping a drug to pharmacies—a road dominated by the laborious approval process of the FDA. The agency’s mission is to keep dangerous drugs away from the American public, and after the thalidomide scare of the late 1950s and early 1960s, the FDA clung tenaciously to its policies of caution and stringent testing. However, with the advent of AIDS, many people began to question that caution. AZT was risky. It could be toxic to bone marrow and cause other less drastic side effects such as sleeplessness, headaches, nausea, and muscular pain. Even though the FDA advocated the right of patients to knowingly take experimental drugs, it was extremely reluctant to approve AZT. Calls were heard to reform or liberalize the approval process, and a report issued by the General Accounting Office (GAO) claimed that of 209 drugs approved between 1976 and 1985, 102 had caused serious side effects, giving the lie to the apparent myth that FDA approval automatically safeguarded the public. The agendas of patient advocates, ideological conservatives who opposed government “intrusion,” and the pharmaceutical industry converged in opposition to the FDA’s caution.
But AIDS attracted more than its share of false cures, and the FDA rightly kept such things out of the medical mainstream. Nonetheless, intense public demand (including protests by AIDS activist groups such as ACT UP) and unusually speedy testing brought the drug to the public by the late 1980s.

SOCIETY: Orphans, generics & patents, oh my!
Two laws that were passed in the United States in the 1980s had an especially profound influence on the status of the world pharmaceutical industry and on subsequent drug development. The 1983 Orphan Drug Act designated orphan products as those applicable to fewer than 200,000 patients. Benefits of orphan drug designation include seven years of marketing exclusivity after approval (even if the patent has expired), tax credits of up to 50% of the human clinical research costs, and FDA assistance, if requested, in speeding product development. Some orphan drugs, such as AZT, can prove enormously profitable, and companies have been eager to make use of the designation. The 1984 Drug Price Competition and Patent Term Restoration Act provided defined patent extensions to account for the time a drug is tested and put through the FDA approval process. The act also enables generic drugs that are proven the same as or equivalent to a drug already listed by the FDA to be exempted from repeating clinical trials before marketing permission is granted. This opened the floodgates to a wave of generics that would transform the pharmaceutical industry.

AZT was hard to tolerate and, despite the misapprehensions of its critics, was never thought to be a magic bullet that would cure AIDS. It was a desperate measure in a desperate time that at best slowed the course of the disease. AZT was, however, the first of what came to be a major new class of antiviral drugs. Its approval process also had ramifications. The case of AZT became the tip of the iceberg in a new world where consumer pressures on the FDA, especially from disease advocacy groups and their congressional supporters, would lead to more, and more rapid, approval of experimental drugs for certain conditions. It was a controversial change that in the 1990s would create more interest in “alternative medicine,” nutritional supplements, fast-track drugs, and attempts to further weaken the FDA’s role in the name of both consumer rights and free enterprise.

Computers and pharmaceuticals
In the quest for new and better drugs, genetic and biological technologies came to the fore in the 1980s. These included a combination of “hard” (machine-based) and “wet” (wet chemistry-based) technologies. In the early 1980s, pharmaceutical companies hoped that developments in genetic engineering and other fields of molecular biology would lead to sophisticated drugs with complicated structures that could act in ways as complicated and precise as proteins. Through understanding the three-dimensional structure, and hence the function, of proteins, drug designers interested in genetic engineering hoped they could create protein-based drugs that replicated these structures and functions. Unfortunately for the industry, its shareholders, and the sick people who might have been helped by these elegantly tailored compounds, it did not turn out that way. By the end of the decade, it was clear that whatever economic usefulness there was in molecular biology developments (via increased efficiency of chemical drugs), such developments did not yet enable the manufacturing of complex biologically derived drugs. So although knowledge of protein structure supplied useful models for cell receptors—such as the CD4 receptors on T cells, to which drugs might bind—it did not produce genetic “wonder drugs.”

Concomitant with this interest in molecular biology and genetic engineering was the development of a new way of conceiving drug R&D: a new soft technology of innovation. Throughout the history of the pharmaceutical industry, discovering new pharmacologically active compounds had depended on a “try and see” empirical approach. Chemicals were tested for efficacy in what was called “random screening” or “chemical roulette,” names that testify to the haphazard and chancy nature of this drug discovery approach. The rise of molecular biology, with its promise of genetic engineering, fostered a new way of looking at drug design. Instead of an empirical “try and see” method, pharmaceutical designers began to compare knowledge of human physiology and the causes of medical disorders with knowledge of drugs and their methods of physiological action to conceptualize the right molecules. This ideal design is then handed over to research chemists in the laboratory, who search for a close match. In this quest, the biological model of cell receptor and biologically active molecule serves as a guide.

The role of hard technologies in biologically based drug research cannot be overstated. In fact, important drug discoveries owe a considerable amount to concomitant technological developments, particularly in imaging technology and computers. X-ray crystallography, scanning electron microscopy, NMR, and laser-, magnetic-, and optical-based imaging techniques allow the visualization of atoms within a molecule. This capability is crucial, as it is precisely this three-dimensional arrangement that gives matter its chemically and biologically useful qualities. Computers were of particular use in this brave new world of drug design, and their power and capabilities increased dramatically during the 1980s. Increased computational power enabled researchers to work through the complex mathematics that describe the molecular structure of idealized drugs. Drug designers, in turn, could use the increasingly powerful imaging capabilities of computers to convert mathematical protein models into three-dimensional images. Gone were the days when modelers used sticks, balls, and wire to create models of limited scale and complexity.
In the 1980s, they used computers to transform mathematical equations into interactive, virtual pictures of elegant new models made up of thousands of different atoms. Since the 1970s, the pharmaceutical industry had been using computers to design drugs to match receptors, and it was one of the first industries to harness the steadily increasing power of computers to design molecules. Software applications to simulate drugs first became popular in the 1980s, as did genetics-based algorithms and fuzzy logic. Research on genetic algorithms began in the 1970s and continues today. Although not popular until the early 1990s, genetic algorithms allow drug designers to “evolve” a best fit to a target sequence through successive generations until a fit or solution is found. Algorithms have been used to predict physiological properties and bioactivity. Fuzzy logic, which formalizes imprecise concepts by defining degrees of truth or falsehood, has proven useful in modeling pharmacological action, protein structure, and receptors.
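To make the idea concrete, here is a minimal, self-contained Python sketch of the generate/score/select loop that such genetic algorithms use, evolving a population of candidate strings toward a target until a fit is found. The target string, alphabet, and fitness rule are hypothetical placeholders standing in for the molecular descriptors and bioactivity scores that real drug-design software would use.

```python
# Minimal, illustrative genetic algorithm: evolve candidate strings toward a
# target through successive generations. All values are toy placeholders.

import random
import string

TARGET = "BINDS THE RECEPTOR"            # hypothetical target "sequence"
ALPHABET = string.ascii_uppercase + " "  # symbols candidates are built from
POP_SIZE, MUTATION_RATE = 200, 0.05

def fitness(candidate: str) -> int:
    """Score a candidate by how many positions already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate: str) -> str:
    """Randomly replace symbols with a small per-position probability."""
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
                   for c in candidate)

def crossover(a: str, b: str) -> str:
    """Combine two parent candidates at a random cut point."""
    cut = random.randrange(len(TARGET))
    return a[:cut] + b[cut:]

def evolve(max_generations: int = 2000) -> int:
    """Run the generate/score/select loop; return the generation that hit the target."""
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(POP_SIZE)]
    for generation in range(max_generations):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            return generation
        parents = population[: POP_SIZE // 4]   # selection: keep the fittest quarter
        # elitism: carry the current best forward unchanged, breed the rest
        population = [population[0]] + [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(POP_SIZE - 1)
        ]
    return -1

if __name__ == "__main__":
    print("Target matched at generation:", evolve())
```

In drug-design use, the string would be replaced by a set of structural parameters and the fitness function by a predicted binding or bioactivity score, but the evolutionary loop itself is the same.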

The business of biotechnology
The best drug in the world is worthless if it cannot be developed, marketed, or manufactured. By the mid-1980s, small biotechnology firms were struggling for survival, which led to the formation of mutually beneficial partnerships with large pharmaceutical companies and a host of corporate buyouts of the smaller firms by big pharma.

Eli Lilly was one of the big pharma companies that formed early partnerships with smaller biotech firms. Beginning in the 1970s, Lilly was one of the first drug companies to enter into biotechnology research. By the mid-1980s, Lilly had already put two biotechnology-based drugs into production: insulin and human growth hormone. Lilly produced human insulin through recombinant DNA techniques and marketed it, beginning in 1982, as Humulin. The human genes responsible for producing insulin were grafted into bacterial cells through a technique first developed in the production of interferon in the 1970s. Insulin produced by the bacteria was then purified using monoclonal antibodies. Diabetics no longer had to take insulin isolated from pigs. By 1987, Lilly ranked second among all institutions (including universities) and first among companies (including both large drug manufacturers and small biotechnology companies) in U.S. patents for genetically engineered drugs. By the late 1980s, Lilly recognized the link between genetics, modeling, and computational power and, already well invested in computer hardware, the company moved to install a supercomputer.

In 1988, typical of many of the big pharma companies, Lilly formed a partnership with a small biotechnology company: Agouron Pharmaceuticals, a company that specialized in three-dimensional computerized drug design. The partnership gave Lilly expertise in an area it was already interested in, as well as manufacturing and marketing rights to new products, while Agouron gained a stable source of funding. Such partnerships united small firms that had narrow but potentially lucrative specializations with larger companies that already had development and marketing structures in place. Other partnerships between small and large firms allowed large drug companies to “catch up” in a part of the industry in which they were not strongly represented. This commercialization of drug discovery allowed companies to apply the results of biotechnology and genetic engineering on an increasing scale in the late 1980s, a process that continues.

TECHNOLOGY: PCR/DNA automation
One of the most significant examples of a breakout technology was the polymerase chain reaction (PCR) in the 1980s. PCR was first reported in 1985 by Kary Mullis and colleagues at Cetus Corp. (Berkeley, CA). (Mullis won a controversial solo Nobel Prize in Chemistry for its development in 1993.) Almost immediately with the marketing of automated PCR machines by Perkin-Elmer in 1986, the technique became indispensable throughout the biotechnology industry. With PCR, scientists can replicate in vitro a selected segment or segments of DNA a millionfold. With PCR and the development of the automated fluorescence sequencer by Leroy E. Hood and colleagues at Caltech and Applied Biosystems, Inc., in the early 1980s, the necessary components for an explosion of sequencing, especially in human genomics, were all in place by the end of the decade.

The rise of drug resistance
New developments in the West in the late 1980s had particular implications for drugs and pharmaceutical technologies. Diseases that were thought to have been eliminated in developed countries reappeared in the late 1980s with a frightening twist: they had developed drug resistance.
Tuberculosis in particular experienced a resurgence. In the mid-1980s, the worldwide decline in tuberculosis cases leveled off, and case numbers then began to rise. New cases of tuberculosis were highest in less developed countries, and immigration was blamed for the increased number of cases in developed countries. (However, throughout the 20th century in the United States, tuberculosis was a constant and continuing health problem for senior citizens, Native Americans, and the urban and rural poor. In 1981, an estimated 1 billion people—one-third of the world’s population—were infected. So thinking about tuberculosis in terms of a returned epidemic obscures the unabated high incidence of the disease worldwide over the preceding decades.)


The most troubling aspect of the “reappearance” of tuberculosis was its resistance not just to one or two drugs, but to multiple drugs. Multidrug resistance stemmed from several sources. Every use of an antibiotic against a microorganism is an instance of natural selection in action—an evolutionary version of the Red Queen’s race, in which you have to run as fast as you can just to stay in place. Using an antibiotic kills susceptible organisms, yet mutant organisms are present in every population, and if even a single pathogenic organism survives, it can multiply freely. Agricultural and medical practices have contributed to resistant strains of various organisms. In agriculture, animal feed is regularly and widely supplemented with antibiotics in subtherapeutic doses. In medical practice, there has been widespread indiscriminate and inappropriate use of antibiotics, to the degree that hospitals have become reservoirs of resistant organisms. Some tuberculosis patients unwittingly fostered multidrug-resistant tuberculosis strains by failing to comply with the admittedly lengthy, but required, drug treatment regimen.

In this context, developing new and presumably more powerful drugs and technologies became even more important. One such technology was combinatorial chemistry, a nascent science at the end of the 1980s. Combinatorial chemistry sought to find new drugs by, in essence, mixing and matching appropriate starter compounds and reagents and then assessing the products for suitability. Computers became increasingly important as the approach matured. Specialized software was developed that could not only select appropriate chemicals but also sort through the enormous number of potential drugs.
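The combinatorial idea itself is simple enough to show in a few lines. The sketch below is a toy example with hypothetical building blocks and an arbitrary scoring rule rather than any real chemistry: it enumerates every scaffold/linker/R-group combination and keeps only the virtual products that pass a crude suitability filter, which is the sorting job the paragraph above assigns to specialized software.

```python
# Toy combinatorial-library sketch: enumerate building-block combinations and
# filter the virtual products. Block names, weights, and the filter window are
# hypothetical placeholders, not real chemical data.

from itertools import product

# Hypothetical building blocks: (name, molecular weight contribution)
scaffolds = [("benzyl", 91), ("phenethyl", 105), ("cyclohexyl", 83)]
linkers   = [("amide", 43), ("ester", 44), ("sulfonamide", 79)]
r_groups  = [("methyl", 15), ("chloro", 35), ("methoxy", 31), ("nitro", 46)]

def assemble(scaffold, linker, r_group):
    """Combine one block from each pool into a virtual product record."""
    name = f"{scaffold[0]}-{linker[0]}-{r_group[0]}"
    weight = scaffold[1] + linker[1] + r_group[1]
    return name, weight

def acceptable(weight: float, low: float = 120, high: float = 200) -> bool:
    """Toy suitability filter: keep products inside a target weight window."""
    return low <= weight <= high

if __name__ == "__main__":
    library = [assemble(s, l, r) for s, l, r in product(scaffolds, linkers, r_groups)]
    hits = [(name, w) for name, w in library if acceptable(w)]
    print(f"Enumerated {len(library)} virtual products; {len(hits)} pass the filter.")
    for name, w in hits[:5]:
        print(f"  {name}: {w}")
```

Real systems score candidates against predicted bioactivity rather than a weight window, but the combinatorial explosion (and the need for software to prune it) follows directly from the same nested enumeration.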

BIOGRAPHY: James Watson and genomics
Riding the tide of his Nobel Prize for codiscovering the structure of DNA in 1953, James Watson carved out a career as one of the chief promoters of molecular biology. In the 1980s, in particular, from his bully pulpit as head of Cold Spring Harbor (NY) Laboratory, Watson campaigned ceaselessly and successfully for relaxing the National Institutes of Health's guidelines on recombinant DNA research. In a 1986 editorial in Science, Renato Dulbecco proclaimed that the United States should enter genomics with the same spirit that "led to the conquest of space." This helped launch the Department of Energy's genome project that same year. Lobbied by famed DNA sequencer Walter Gilbert, Watson rapidly embraced the Human Genome Project, especially the idea of putting it under NIH rather than DOE control. When the NIH Human Genome Project was launched in 1989, Watson was the natural choice to head it. Although this was a year after the launch of the international Human Genome Organization (HUGO), much of the world's excitement, especially throughout the early years, was due to Watson's efforts to galvanize the general public, governments, and industry.

Prevention, the best cure
Even as the pharmaceutical industry produced ever more sophisticated drugs, there was a new emphasis in health care: preventive medicine. While new drugs were being designed, medicine focused on preventing disease rather than simply trying to restore some facsimile of health after it had developed. The link between exercise, diet, and health dates back 4000 years or more. More recently, medical texts from the 18th and 19th centuries noted that active patients were healthier patients. In the early 1900s, eminent physician Sir William Osler characterized heart disease as rare. By the 1980s, in many Western countries some 30% of all deaths were attributed to heart disease, and for every two people who died from a heart attack, another three suffered one but survived. During this decade, scientists finally made a definitive connection between heart disease and diet, cholesterol, and exercise levels.
So what prompted this change in perspective? In part, the answer lies with the spread of managed care. Managed care started in the United States before World War II, when Kaiser-Permanente got its start. Health maintenance organizations (HMOs), of which Kaiser was and is the archetypal representative and which were and are controversial, spread during the 1980s as part of an effort to contain rising medical costs. One way to keep costs down was to prevent people from becoming ill in the first place.
But another reason for the growth of managed care had to do with a profound shift in the way that diseases, particularly diseases of lifestyle, were approached. Coronary heart disease has long been considered the emblematic disease of lifestyle. During the 1980s, a causal model of heart disease that had been around since the 1950s suddenly became the dominant, if not the only, paradigm for heart disease. This was the "risk factor approach." This view of heart disease—its origins, its outcomes, its causes—rested on a set of unquestioned and unstated assumptions about how individuals contribute to disease. Drawing from population studies about the relationship between heart disease and individual diet, genetic heritage, and habits, as much as from biochemical and physiological causes of atherosclerosis, the risk factor approach tended to be more holistic. Whereas the older view of disease prevention focused on identifying those who either did not know they were affected or who had not sought medical attention (for instance, tuberculosis patients earlier in the century), this new conceptual technology aimed to identify individuals who were at risk for a disease. According to the logic of this approach, everyone was potentially at risk for something, which provided a rationale for population screening. It was a new way to understand individual responsibility for and contribution to disease.
Focusing on risk factors reflected a cultural preoccupation with the individual and the notion that individuals were responsible for their own health and illness. As part of a wave of new conservatism against the earlier paternalism of the Great Society, accepting the risk factor approach implied that social contributions to disease, such as the machinations of the tobacco and food industries, poverty, and work-related sedentary lifestyles, were not to blame for illness. Individuals were considered responsible for choices and behavior that ran counter to known risk factors. While heart disease became a key focus of the 1980s, other disorders, including breast, prostate, and colon cancer, were ultimately subsumed by this risk factor approach. AIDS, the ultimate risk factor disease, became the epitome of this approach. Debates raged about the disease and related issues of homosexuality, condoms, abstinence, and needle exchange and tore at the fabric of society as the 1990s dawned.

Conclusion

The 1980s saw the resurgence of old perils, the emergence of new ones, and the rapid mobilization of new biotechnological tools to combat both. The decade also saw the surge and temporary fallback of entrepreneurial biotechnology. Throughout the decade, the hard technologies of genetic engineering and developments in immunology, automation, genomics, and computers—combined with the soft technologies of personal action, acknowledgment of risk, and changes in government regulation and ways of thinking—affected the direction of biomedicine and pharmacy. The coming decade would see the explosive growth of these various trends—into both flowers and weeds.



Suggested reading
• Against the Odds: The Story of AIDS Drug Development, Politics, and Profits, Arno, P. S.; Feiden, K. L. (HarperCollins: New York, 1992)
• Science and Innovation: The U.S. Pharmaceutical Industry During the 1980s, Gambardella, A. (Cambridge University Press: New York, 1995)
• The Development of Medical Techniques and Treatments: From Leeches to Heart Surgery, Duke, M. (International Universities Press: Madison, CT, 1991)
• A History of Medicine from Prehistory to the Year 2020, Duin, N.; Sutcliffe, J. (Simon and Schuster: London, 1992)
• Making Sense of Illness: Science, Society, and Disease, Aronowitz, R. A. (Cambridge University Press: New York, 1998)


Harnessing genes, recasting flesh
Introduction In the 1990s, the Human Genome Project took off like a rocket, along with many global
genomic initiatives aimed at plants, animals, and microorganisms. Gene therapy developed, as did the potential of human cloning and the use of human tissues as medicine. The new technologies provided hope for solving the failures of the old through a new paradigm—one that was more complex, holistic, and individualized. The changing view became one of medicines and therapies based on knowledge, not trial and error; on human flesh, not nature’s pharmacy. The new therapeutic paradigm evolved from an earlier promise of hormone drugs and the flowering of intelligent drug design. It grew from an understanding of receptors and from breakthroughs in genomics and genetic engineering. It found favor in the new power of combinatorial chemistry and the modeling and data management abilities of the fastest computers. Hope was renewed for a next generation of magic bullets born of human genes. As the 1990s ended, many genetically based drugs were in clinical trials and a wealth of genome sequences, their promise as yet unknown, led scientists to reason that the 21st century would be the biotech century.

Computers and combinatorial technology
Robotics and automation allowed researchers to finally break through a host of constraints on rational drug design. Achievements in miniaturization in robotics and computer systems allowed the manipulation of many thousands of samples and reactions in the time and space where previously only a few could be performed. They permitted the final transformation of pharmacology from a tedious, hit-and-miss science based primarily on organic synthesis to one based firmly on physiology and complex biochemistry, allowing explosive movement into rational drug discovery in both laboratory design and natural-product surveys. (And even when the technology remained hit-and-miss because of the lack of a compatible knowledge base, the sheer quantity of samples and reactions now testable created efficiencies of scale that made the random nature of the process extraordinarily worthwhile.) Combinatorial chemists produce libraries of chemicals based on the controlled and sequential modification of generally immobilized or otherwise tagged chemical starting blocks. These original moieties are chosen, under optimal knowledge conditions, for their predicted or possible behavior in a functional drug, protein, polymer, or pesticide. Developing the knowledge base for starting materials proved to be one of the greatest benefits of computer modeling of receptors and the development of computational libraries (multiple structures derived from computer algorithms that analyze and predict potentially useful sequences from databases of gene or protein sequences, or structural information from previous drugs). Here the search for natural products remained critical—for the discovery of new starting places. Finding new drug starting places, as much as finding drugs that were useful "as is," became important in the 1990s as companies tried to tap into the knowledge base of traditional medical practitioners in the developing world through collaboration rather than appropriation. In this manner, several companies promoted the analysis of the biodiversity available in tropical rainforests and oceans. This "added value" of biodiversity became both a rallying cry for environmentalists and a point of solidarity for political muscle-building in the developing world. As the industrialized world's demand for new drugs (and associated profits) increased, developing nations sought to prevent the perceived exploitation of their heritage. And just as previous sources of drugs from the natural world were not wholly ignored (even as the human genome became the "grail" of modern medical hopes), combinatorial approaches created a new demand for the services of traditional organic and analytical chemistry. Although computer models were beneficial, new wet chemistry techniques still had to be defined on the basis of the new discoveries in genomics and proteomics. They had to be modified for microsystems and mass production—for the triumph of the microtiter plate over the flask, the test tube, and the beaker. Drugs were still chemical products after all.
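In computational terms, a combinatorial library is essentially the Cartesian product of a scaffold's substitution points and the building blocks available at each one. A minimal sketch (the scaffold and building-block names here are invented for illustration, not actual reagents) shows how quickly such libraries grow and why software is needed to enumerate and triage them.

```python
from itertools import product

# Hypothetical building blocks for a two-point scaffold (names are invented).
r1_blocks = ["methyl", "ethyl", "phenyl", "benzyl", "cyclohexyl"]
r2_blocks = ["amine", "amide", "sulfonamide", "carboxylate"]

# Enumerate every R1 x R2 combination attached to the scaffold.
library = [f"scaffold({r1}, {r2})" for r1, r2 in product(r1_blocks, r2_blocks)]
print(len(library), "virtual compounds, e.g.:", library[:3])

# With 1000 choices at each of three substitution points, the library explodes:
print("three-position library size:", 1000 ** 3)
```

The combinatorial explosion is the whole point: the same loop that builds 20 compounds builds a billion, which is why triage software and high-throughput assays became essential partners of the chemistry.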

High-throughput screening
The vast increase in the number of potential drugs produced through combinatorial methods created a new bottleneck in the system—screening and evaluating these libraries, which held hundreds of thousands of candidates. To conduct high-throughput screening (HTS) on this excess of riches, new and more automated assays were pressed into service. Part of this move into HTS was due to the burgeoning of useful bioassays, which permitted screening with individual cells and cell components, tissues, engineered receptor proteins, nucleic acid sequences, and immunologicals. New forms of assays proliferated, and so did opportunities for evaluating these assays quickly through new technologies.

Researchers continued to move away from radioactivity in bioassays, automated sequencing, and synthesis by using various new tagging molecules to directly monitor cells, substrates, and reaction products by fluorescence or phosphorescence. Fluorescence made it possible to examine, for the first time, the behavior of single molecules or molecular species in in vivo systems. Roger Tsien and colleagues, for example, constructed variants of the standard green fluorescent protein for use in a calmodulin-based chimera to create a genetically based fluorescent indicator of Ca2+. They called this marker "yellow cameleon" and used it in transgenic Caenorhabditis elegans to follow calcium metabolism during muscle contraction in living organisms. Of particular promise was the use of such "light" technologies with DNA microarrays, which allowed quantitative analysis and comparison of gene expression by multicolor spectral imaging. Many genes differ in their levels of expression, especially in cancerous versus normal cells, and microarray techniques showed promise in the discovery of those differentially expressed genes. Microarrays thus became a basic research tool and a highly promising candidate for HTS in drug development.
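The logic of two-color expression comparison can be sketched in a few lines of code. The gene names and intensity values below are invented; the point is simply that the log ratio of the two fluorescence channels flags genes whose expression differs between, say, tumor and normal tissue.

```python
import math

# Invented two-channel fluorescence intensities for three microarray spots
# (tumor sample in one dye channel, normal sample in the other).
spots = {
    "geneA": (5200.0, 480.0),   # much brighter in the tumor channel
    "geneB": (300.0, 310.0),    # essentially unchanged
    "geneC": (150.0, 2400.0),   # much brighter in the normal channel
}

for gene, (tumor, normal) in spots.items():
    log_ratio = math.log2(tumor / normal)   # >0: up in tumor, <0: down in tumor
    call = "up" if log_ratio > 1 else "down" if log_ratio < -1 else "unchanged"
    print(f"{gene}: log2 ratio = {log_ratio:+.2f} ({call} in tumor)")
```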

Genomics meets proteomics
Knowledge of details of the genetic code, first learned during the 1960s, achieved practical application during the 1990s on a previously unimaginable scale. Moving from gene to protein and back again provided an explosion of information as the human (and other) genome projects racked up spectacular successes. Planned at the end of the 1980s, the U.S. Human Genome Project and the international Human Genome Organization led the way. Started first as a collection of government and university collaborations, the search for the human genome was rapidly adopted by segments of industry. The issue of patenting human, plant, and animal genes would be a persistent controversy. Inspired by this new obsession with genomics, the 1990s may ultimately be best known for the production of the first complete genetic maps. The first full microorganism genome was sequenced in 1995 (Haemophilus influenzae, by Craig Venter and colleagues at The Institute for Genomic Research). This achievement was followed rapidly by the genome sequencing of Saccharomyces cerevisiae (baker's yeast) in 1996; Escherichia coli, Borrelia burgdorferi, and Helicobacter pylori in 1997; the nematode C. elegans in 1998; and the first sequenced human chromosome (22) in 1999. The entrance of industry into the race to sequence the human genome at the tail end of the decade sped up the worldwide effort, albeit amid controversy.


Hot on the heels of the genomics "blastoff" was the development of proteomics—the science of analyzing, predicting, and using the proteins produced from the genes and from the cellular processing performed on these macromolecules before they achieve full functionality in cells. Both proteomics and genomics rely on bioinformatics to be useful. Bioinformatics is essentially the computerized storage and analysis of biological data, from standard gene sequence databases (such as the online repository GenBank maintained by the NIH) to complex pattern-recognition systems such as GRAIL (developed in 1991 by Edward Uberbacher of Oak Ridge National Laboratory). GRAIL and more than a dozen other programs were used to find prospective genes in genomic databases such as GenBank by employing various pattern recognition techniques. Pattern recognition was also at the heart of the new DNA microarrays discussed above, and such techniques were increasingly used to detect comparative patterns of gene transcription in cells under various conditions and states (including diseased vs healthy).
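Gene-finding programs such as GRAIL rely on far subtler statistical signals, but the most basic pattern any gene finder looks for can be sketched directly: an open reading frame running from a start codon to a stop codon. The sequence below is invented for illustration, and the sketch ignores splicing, reverse strands, and everything else that makes real gene prediction hard.

```python
# Minimal open-reading-frame scan: find ATG...stop stretches on one strand.
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=3):
    orfs = []
    for frame in range(3):                      # three forward reading frames
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for idx, codon in enumerate(codons):
            if codon == "ATG" and start is None:
                start = idx                     # remember the first start codon
            elif codon in STOPS and start is not None:
                if idx - start >= min_codons:   # keep only reasonably long ORFs
                    orfs.append(seq[frame + 3 * start: frame + 3 * (idx + 1)])
                start = None
        # ORFs lacking an in-frame stop are ignored in this toy version.
    return orfs

toy_sequence = "CCATGGCTGCTAAAGGTCCTGAATTCTGAACCATGTTT"   # invented DNA
print(find_orfs(toy_sequence))
```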

Human biotechnology

In September 1990, the first human gene therapy was started by W. French Anderson at NIH in an attempt to cure adenosine deaminase (ADA) deficiency—referred to as bubble-boy syndrome—by inserting the correct gene for ADA into an afflicted four-year-old girl. Although the treatment did not provide a complete cure, it did allow the young patient to live a more normal life with supplemental ADA injections. Other attempts at gene therapy also remained more promising than successful. Clinical trials on humans were disappointing compared with the phenomenal successes in mice, although limited tumor suppression did occur in some cancers, and there were promising reports on the treatment of hemophilia. Jesse Gelsinger, a teenager who suffered from the life-threatening liver disorder ornithine transcarbamylase deficiency, volunteered for adenovirus-delivered gene therapy at a University of Pennsylvania clinical trial in 1999. His subsequent death sent a shock wave through the entire research community, exposed apparent flaws in regulatory protocols and compliance, and increased public distrust of one more aspect of harnessing genes.
Using gene products as drugs, however, was a different story. From recombinant human insulin sold in the 1980s to the humanized antibodies of the 1990s, the successes of harnessing the human genome—whether "sensibly" (in the case of the gene product) or by using antisense techniques as inhibitors of human genes (in 1998, fomivirsen, used to treat cytomegalovirus, became the first approved antisense therapeutic)—proved tempting to the research laboratories of most major pharmaceutical companies. Many biotechnology medicines—from erythropoietin, tumor necrosis factor, dismutases, growth hormones, and interferons to interleukins and humanized monoclonal antibodies—entered clinical trials throughout the decade.
Beginning in the 1990s, stem cell therapy held great promise. This treatment uses human cells to repair and ameliorate inborn or acquired medical conditions, from Parkinson's disease and diabetes to traumatic spinal paralysis. By 1998, embryonic stem cells could be grown in vitro, which promised a wealth of new opportunities for this precious (and controversial) resource. Promising too were the new forms of tissue engineering for therapeutic purposes. Great strides were made in tissue, organ, and bone replacements. The demand for transplants, growing at 15% per year by the end of the decade, led to the search for appropriate artificial or animal substitutes. Cartilage repair systems, such as Carticel by Genzyme Tissue Repair, became commonplace. Patients' cells were shipped to the company, cultured, and subsequently reimplanted. Second-generation products permitted autologous cells to be cultured on membranes, allowing tissue formation in vitro. Several companies focused on developing orthobiologics—proteins, such as growth factors, that stimulate the patient's ability to regenerate tissues.

BIOGRAPHY: The new human
Perhaps the most significant and controversial "individual" in the 1990s was the new human—a creation of gene sequences, manipulable, copyable, and (to many) patentable in all its parts. She was the "human" of the Human Genome Project. The genetic determinism enthroned in this new persona became a powerful paradigm, from the discovery of the latest disease gene of the month, to the explosion of DNA fingerprinting and diagnostics, to a renewed belief in genes for homosexuality, violence, intelligence, and all manner of psychological states. The new human was a research tool transformed into social, political, economic, and philosophical dogma, the controversial ramifications of which will inevitably play out in the 21st century. In 1995, the Visible Human Project provided an apt symbol for this new paradigm—the human body, male and female, sliced and electromagnetically diced and viewable on a host of Web sites from every angle. To some, it was the ultimate sacrilege; to others, the flowering of the new technology.

Epicel, a graft made from autologous cells, was also developed by Genzyme Tissue Repair to replace the skin of burn victims with greater than 50% skin damage. In 1998, Organogenesis introduced the first FDA-approved, ready-to-order Apligraf human skin replacement, which was made of living human epidermal keratinocytes and dermal fibroblasts. Also undergoing research in 1999 was Vitrix, a soft tissue replacement made of fibroblasts and collagen. By the end of the decade, artificial liver systems (which work outside the body) were developed as temporary blood cleansers providing detoxification and various digestion-related processes. In many cases, such treatments allowed the patient's own liver to regenerate during the metabolic "rest." Such uses of cells and tissues raised numerous ethical questions, which were galvanized in the media by the 1996 arrival of the clonal sheep Dolly. Reaction against the power of the new biotechnology was not restricted to fears of dehumanizing humanity through "xeroxing." The possibility of routine xenotransplantation (using animal organs as replacements in humans) came to the fore with advances in immunology and genetic engineering that promised the ability to humanize animal tissues (specifically, those of pigs) in ways similar to the development of humanized antibodies in mice. The issue of xenotransplantation not only raised fears of new diseases creeping into the human population from animal donors, but was seen by some as a further degradation of human dignity either intrinsically or through the misuse of animals. The animal rights lobby throughout the decade argued passionately against the use of animals for human health purposes.

The Red Queen’s race
Continuing the problems seen in the 1980s, old and new diseases were increasingly resistant to the array of weapons devised against them. Like Alice in Through the Looking-Glass, drug researchers had to run as fast as they could just to stay in place in the race against bacterial resistance to traditional antibiotics (see Chapter 7). As the decade progressed, more diseases became untreatable with the standard suite of drugs. Various streptococcal infections, strains of tuberculosis bacteria, pathogenic E. coli, gonorrhea, and the so-called flesh-eating bacteria that cause necrotizing fasciitis (most commonly group A streptococcus) all became resistant to previously successful magic bullets. Patients died who earlier would have lived. As the problem manifested, pharmaceutical, software, and instrument companies alike turned to combinatorial chemistry and HTS technologies in an effort to regain the racing edge. AIDS remained a profoundly disturbing example of the failure of technology to master a disease, despite the incredible advances in understanding its biology that took place in the 1990s. Vaccine efforts occupied much of the popular press, and genetically engineered viral fragments seemed to be the best hope. But the proliferation of viral strains erased hope for a single, easy form of vaccination. Resistance to AZT therapy increased, and even the new protease inhibitors and so-called drug cocktails developed in the 1990s proved to be only stopgap measures as viral strains appeared that were resistant to everything thrown at them.

SOCIETY: Fighting the harness
Dolly, the clonal sheep created from the cells of an adult ewe in 1996, was a bit of agricultural biotechnology that staggered the world—and in so doing provided some of the best evidence of the powerful fears raised by the new genetics and the concept of the new human. Governments and the public reacted with horror, seeing what might be a boon to pharmacology as the ultimate bane to religion and philosophy. Laws were passed protecting "human individuality" and human tissues, fueled in part by antiabortion sentiments as much as fears of the new technology. By the end of the decade, some of the most vociferous protests against harnessing genetics were because of perceived risks of the use and consumption of genetically modified (GM) foods. GM products were claimed to threaten the food chain by denaturalizing and disturbing the environment. Companies such as Monsanto faced worldwide opposition to the development and deployment of transgenic crops. At times, such protests even peripherally damaged the commercial development of transgenics for medicinal purposes. Transgenic plants and animals had literally become a growing source of vaccines, human gene products, and nutritional supplements. This development was part of the movement toward "nutraceuticals"—food crops that contain added value in the form of medicine. But despite the bad press, by the end of the century—in the United States at least—most of the soybeans and at least a third of the corn planted were genetically engineered with either herbicide or insect resistance. The medical uses of transgenic plants and animal products continued to expand, a phenomenon especially driven by the benefits of the new cloning technology.


Individual lives were prolonged, and the death rate in Western countries, where the expensive new drugs were available, dropped precipitously. But the ultimate solution to AIDS had not been found, nor even a countermeasure to its spread in the developing world and among poor populations of industrialized nations. The middle of the decade saw a resurgence of the "old" plague (bubonic) in India, and even polio remained a problem in the developing world. In Africa, Ebola resurfaced in scattered outbreaks—although it was nothing compared with the continental devastation caused by AIDS. In the rest of the world, fears of biological warfare raised by the Gulf War continued. Vaccination efforts were stepped up for many diseases. The World Health Assembly set a global goal in 1990 of a 95% reduction in measles deaths by 1995 compared with preimmunization levels. By the deadline, estimated global coverage for measles vaccine had reached 78%, even as the industrialized world experienced a backlash against vaccination because of concerns about adverse side effects. Vaccine technology continued to improve with the development of recombinant vaccines for several diseases, new efforts to produce vaccines for old scourges such as malaria, and new nasal delivery systems that stimulated the mucosal-associated antibody system. DNA vaccines—the injection of engineered plasmids into human cells to stimulate antigen production and immunization—were first described in a 1989 patent and published in 1990 by Wolff, Malone, Felgner, and colleagues. They entered clinical trials in 1995. Although one editor of Bio/Technology called this the Third Vaccine Revolution, by the end of the decade the reality of this expansive claim remained in doubt, especially because of the continuing debate over genetic engineering. Efforts to develop food-based vaccines through the production of transgenics engineered with the appropriate antigens continued. This research stimulated studies on mucosal immunity and efforts to enable complex proteins to cross the gut–blood barrier. Even with these technological developments, the decade ended with the negatives of ever-expanding disease problems, exacerbated by predictions that global warming would lead to new epidemics of insect-borne and tropical diseases. However, a note of optimism remained that rational design and automated production technologies would ultimately be able to defeat these diseases.

High tech and new mech
To improve the front end of rational drug design and to foster the use and growth of knowledge bases in genomics and proteomics, many old and new technologies were adapted to pharmaceutical use in the 1990s. The use of mass spectrometry (MS) for bioanalysis underwent a renaissance, with improvements such as ultrahigh-performance MS using Fourier-transform ion cyclotron resonance (FT-ICR MS) and tandem-in-time (multidimensional) MS for biological macromolecules. Improved techniques such as peak parking (reducing the column flow rate into the mass spectrometer the instant a peak is detected, to allow samples to be split and analyzed by multiple mass spectrometers nearly simultaneously) added several dimensions that were previously impossible. These changes improved the ability to analyze the complex mixtures required in studies of cellular metabolism and gene regulation. Results from multidimensional runs were analyzed by increasingly sophisticated bioinformatics programs and used to improve the underlying knowledge bases. In combination with HPLC and various capillary electrophoretic systems, MS became part of a paradigm for pharmaceutical R&D as a valuable new approach for identifying drug targets and protein function.
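One way such MS data feed the bioinformatics programs is peptide-mass matching: masses observed from a protein digest are compared with masses predicted from candidate sequences. The sketch below is a caricature of that bookkeeping, not a description of any particular instrument or software package; the residue masses are rounded average values, and the candidate sequences and "observed" readings are invented.

```python
# Toy peptide-mass matching with rounded average residue masses (daltons).
RESIDUE_MASS = {"G": 57.0, "A": 71.1, "S": 87.1, "P": 97.1, "V": 99.1,
                "T": 101.1, "L": 113.2, "N": 114.1, "D": 115.1, "K": 128.2,
                "E": 129.1, "R": 156.2, "F": 147.2, "W": 186.2}
WATER = 18.0

def peptide_mass(peptide):
    return sum(RESIDUE_MASS[aa] for aa in peptide) + WATER

def digest(sequence):
    """Cut after K or R, a crude stand-in for a tryptic digest."""
    peptides, current = [], ""
    for aa in sequence:
        current += aa
        if aa in "KR":
            peptides.append(current)
            current = ""
    if current:
        peptides.append(current)
    return peptides

candidates = {"proteinX": "GASPVKTLNDEK", "proteinY": "FWVLAKGGSPR"}  # invented
observed = [557.6, 718.8]          # invented "spectrometer" readings
tolerance = 0.5

for name, seq in candidates.items():
    masses = [round(peptide_mass(p), 1) for p in digest(seq)]
    hits = sum(any(abs(m - o) <= tolerance for m in masses) for o in observed)
    print(name, "predicted", masses, "->", hits, "of", len(observed), "matched")
```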


Similarly, the development of multidimensional NMR techniques, especially those using more powerful instruments (e.g., 500-MHz NMR), opened the door to solving the structure of proteins and peptides in aqueous environments, as they exist in biological systems. The new NMR techniques allowed observations of the physical flexibility of proteins and the dynamics of their interactions with other molecules—a huge advantage in studies of a protein's biochemical function, especially in receptors and their target molecules (including potential drugs). By viewing the computer-generated three-dimensional structure of the protein, which was made possible by the data gathered from these instruments, the way in which a ligand fits into a protein's active site could be directly observed and studied for the first time. The three-dimensional structure provided information about biological function, including the catalysis of reactions and the binding of molecules such as DNA, RNA, and other proteins. In drug design, ligand binding by a target protein was used to induce the ultimate effects of interest, such as cell growth or cell death. By using new technologies to study the structure of the target protein in a disease and learn about its active or ligand-binding sites, rational drug design sought to create inhibitors or activators that elicited a response. This correlation between structure and biological function (known as the structure–activity relationship, or SAR) became a fundamental underpinning of the revolution in bioinformatics. In the 1990s, the SAR was the basis by which genomics and proteomics were translated into pharmaceutical products.
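The SAR idea can be reduced to a caricature: describe each compound by one or more numeric descriptors and fit a simple model relating those descriptors to measured activity. The compounds, descriptor values, and activities below are invented, and real SAR and QSAR work uses far richer structural representations, but the sketch shows the shape of the calculation.

```python
# Caricature of a structure-activity relationship: one invented descriptor
# (say, a crude hydrophobicity score) versus invented measured activities.
compounds = {          # name: (descriptor value, measured activity)
    "cmpd-1": (1.2, 4.8),
    "cmpd-2": (2.0, 6.1),
    "cmpd-3": (2.9, 7.4),
    "cmpd-4": (3.5, 8.2),
}

xs = [x for x, _ in compounds.values()]
ys = [y for _, y in compounds.values()]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Ordinary least squares for: activity = slope * descriptor + intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
sxx = sum((x - mean_x) ** 2 for x in xs)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

print(f"activity ~ {slope:.2f} * descriptor + {intercept:.2f}")
# Predict activity for a not-yet-synthesized analogue with descriptor 4.2:
print("predicted activity for new analogue:", round(slope * 4.2 + intercept, 2))
```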

The “new” Big Pharma

Ultimately, the biggest change in the pharmaceutical industry, enabled by the progression of technologies throughout the century and culminating in the 1990s, was the aforementioned transformation from a hit-and-miss approach to rational drug discovery in both laboratory design and natural-product surveys. A new business atmosphere, first seen in the 1980s and institutionalized in the 1990s, revealed itself. It was characterized by mergers and takeovers, and by a dramatic increase in the use of contract research organizations— not only for clinical development, but even for basic R&D. Big Pharma confronted a new business climate and new regulations, born in part from dealing with world market forces and protests by activists in developing countries. Marketing changed dramatically in the 1990s, partly because of a new consumerism. The Internet made possible the direct purchase of medicines by drug consumers and of raw materials by drug producers, transforming the nature of business. Direct-to-consumer advertising proliferated on radio and TV because of new FDA regulations in 1997 that liberalized requirements for the presentation of risks of medications on electronic media compared with print. The phenomenal demand for nutritional supplements and so-called alternative medicines created both new opportunities and increased competition in the industry—which led to major scandals in vitamin price-fixing among some of the most respected, or at least some of the biggest, drug corporations. So powerful was the new consumer demand that it represented one of the few times in recent history that the burgeoning power of the FDA was thwarted when the agency attempted to control nutritional supplements as drugs. (The FDA retained the right to regulate them as foods.)

TECHNOLOGY: Let there be light
One of the most enabling technologies for biological analysis developed in the 1990s uses one of the most basic sensory inputs—visible light. New fluorescence and phosphorescence techniques were developed for gene and protein sequencing, in vivo and in vitro activity analysis, DNA microarrays, and even the evaluation of transgenic animals and plants. No longer must researchers rely on the radioactive traces, chemical precipitation, or electrical properties that recent technologies have used. Rather, they hark back to an almost primitive reliance on color, like the biological stains and dyes used in early microscopy. Although computers are used to provide sensitive quantification of color differentials, the eye of the scientist once more keys on colored dots and spots, glowing cells and organelles, and colored traces on recording strips. For the synthetic drug industry, founded in coal-tar dyes before the Pharmaceutical Century began, it might be called a return to colored roots.

Promise and peril
At the start of the Pharmaceutical Century, the average life expectancy of Americans was 47. At century's end, the average child born in the United States was projected to live to 76. As the 1900s gave way to the 2000s, biotechnology provided the promise of even more astounding advances in health and longevity. But concomitant with these technological changes was a sea change in the vision of what it was to be a human being. In the 19th century, the natural world was the source of most medicines. With the dawn of magic bullets in the 20th century, complex organic chemistry opened up a world of drugs created in the laboratory, either modified from nature or synthesized de novo. As the Pharmaceutical Century progressed, from the first knowledge of human hormones to the discovery of the nature of genes and the tools for genetic engineering, the modern paradigm saw a recasting of what human flesh was for—suddenly it was a source of medicines, tissues, and patentable knowledge. Humankind, not the natural world, became the hoped-for premier source of drug discovery.
With genes and chemicals suddenly capable of manipulating the warp and woof of the human loom—both mind and body alike—the human pattern seemed to become fluid to design. According to pessimists, even if biotechnology were not abused to create "superhumans," pharmaceuticals and health care could become the greatest differentiators of human groups in history—not by genetic races, but by economic factors. The new knowledge was found in the highest and most expensive technologies, and often in the hands of those more interested in patents than panaceas.
Yet with this peril of inequality comes the promise of human transformation for good. According to optimists, biotechnology—especially the use of transgenic plants and animals for the production of new drugs and vaccines, xenotransplantation, and the like—promises cheaper, more universal health care. Meanwhile, lifestyle drugs—pharmaceuticals for nonacute conditions such as sterility, impotence, and baldness—also have emerged as a fast-growing category.
From aspirin to Herceptin—a monoclonal antibody that blocks the overexpressed Her2 receptor in breast cancer patients—from herbal medicines to transgenic plants, from horse serum to xenotransplantation, from animal insulin to recombinant human growth hormone, the Pharmaceutical Century was one of transformation. It is too soon to predict what pattern the process will weave, but the human loom has gone high tech. The 21st century will be a brave new tapestry.

Suggested reading
• Biotechnology: Science, Engineering and Ethical Challenges for the Twenty-First Century, Rudolph, F. B.; McIntire, L. V., Eds. (National Academy Press: Washington, DC, 1996)
• A Practical Guide to Combinatorial Chemistry, Czarnik, A. W.; Dewitt, S. H., Eds. (American Chemical Society: Washington, DC, 1997)
• Molecular Biology and Biotechnology: A Comprehensive Desk Reference, Meyers, R. A., Ed. (John Wiley & Sons: New York, 1995)
• The Organic Chemistry of Drug Design and Drug Action, Silverman, R. (Academic Press: San Diego, 1999)
• Using Antibodies: A Laboratory Manual, Harlow, E.; Lane, D. (Cold Spring Harbor Laboratory Press: Cold Spring Harbor, NY, 1999)


The Next Pharmaceutical Century
Introduction
Just what does the future hold for pharmaceutical science and technology? It’s really anybody’s guess, but that’s precisely what makes prognostication fun. Although people are reluctant to commit to specific time frames, that hesitancy doesn’t stop them from making predictions. When asked for their forecasts, a wide variety of experts made similar predictions, which indicates that maybe they’re onto something. Richard Klausner, director of the National Cancer Institute (NCI), notes the difficulty of making predictions. “It’s always amazing how, for some things, we dramatically underestimate how far they are in the future, and for other things, we dramatically overestimate them. We’re not that bad at predicting things that might be part of the future, but we’re really bad at predicting the timing and kinetics and path to them.”

Off to a good start
The 21st century is opening with the sequencing of the human genome. On June 26, Celera Genomics (Rockville, MD) and the international Human Genome Project jointly announced that they had completed 97–99% of the human genome. Using computer algorithms, the next step would be to reassemble the jumble of DNA fragments into a whole sequence. The excitement—and hype—surrounding human genome sequencing stems from hope that the genome can be used to help determine the underlying causes of diseases. Once the genome yields its secrets, the goal is to treat diseases and not merely symptoms, and to provide cures and not just palliative therapies. The human genome sequence is an important step, but is it the answer to everything? Some people—William Haseltine, chairman and chief executive officer of Human Genome Sciences (Rockville, MD), for one—say that the elation over the raw genome sequence is misplaced. “I distinguish between knowledge of the genome and knowledge of genes,” he says. “We essentially had in our possession all the human genes by 1995. The genome project itself is a relatively meaningless footnote. If you’re to extract information about genes from the genome, you have to know the genes first. However, the fact that our technologies have allowed us to isolate and characterize a useful form of virtually all human genes is very significant.” Douglas A. Livingston, the director of chemistry at the Genomics Institute of the Novartis Research Foundation (La Jolla, CA), expresses a similar view, although he doesn’t speak quite so harshly as Haseltine. He compares the situation to that in the science fiction cult classic The Hitchhiker’s Guide to the Galaxy, in which the hero hitchhikes through the galaxy trying to find “the answer to life, the universe, and everything.” When he finally gets the answer, it turns out to be, simply and literally, 42. “I think of the sequencing of the human genome in those terms. AGCTTTT is not the answer to life, the universe, and everything,” Livingston says. “In order to make some sense of that, you start to get into the issues of what it means. What are the products? How do they function? How do they interact?” Kazumi Shiosaki, vice president for drug discovery at Millennium Pharmaceuticals (Cambridge, MA), agrees that sequencing the genome is just the beginning. “That’s just the nucleotide sequence. The next step of what the sequence means—does it represent a membrane protein? does it represent an enzyme? does it represent a structural protein?—is going to consume a lot of people’s time in terms of trying to understand the function of each of these. Layered on top of that, the challenge for the pharmaceutical industry is to say that we know which ones are misbehaving themselves to cause disease.” One caveat, however, is that knowledge of the human genome cannot lead to a cure for all diseases. “We are in a war with bacteria and viruses, and that war is waged with medicines that are related to the biology of those infective agents,” says Ronald C. Breslow, professor of chemistry at Columbia University (New York). “It is not clear how much an understanding of the human genome can contribute to this part of pharmaceutical science.” Genomic information has already found its way into the drug discovery process, primarily as a way to identify new targets. 
However, seeing genomics simply as a way to identify drug targets is a "very limited view of genomics," says Wolfgang Sadee, professor of biopharmaceutical sciences and pharmaceutical chemistry at the University of California–San Francisco. Sadee, who is also editor-in-chief of PharmSci, the electronic journal of the American Association of Pharmaceutical Scientists, says, "We're entering a new era in therapy, in medicine, understanding of health, understanding of disease. Just finding the target in fact is a difficult thing. Just because you have a new gene doesn't mean that you have a new target. Target identification is a science in its infancy because we cannot model these complex systems very well." Breslow points out that "even when genes play a major role, we must not ignore the function of proteins in regulating the expression of genes." For example, Breslow's group is working on developing anticancer agents that
arrest the development of tumors by causing the cells to differentiate. (A hallmark of cancer is cells that proliferate indefinitely without differentiating into a particular cell type.) “Our novel anticancer compounds are targeted to a protein that helps turn genes on by modulating the binding of DNA to histones,” Breslow says. “We understand how our compounds work, and even have an X-ray structure showing one of our compounds bound to the regulatory protein, which is called histone deacetylase. As the result of binding to this protein, our compounds turn on genes that cause cellular differentiation.” Modern medicinal chemistry is more focused on the interactions of small molecules with proteins than with genes, which code for the synthesis of those proteins. “Many of the medically relevant proteins have already been identified and will continue to be important targets for modern therapies even after the human genome is fully sequenced,” Breslow says. The human genome sequence may help scientists finish the task of identifying these proteins. Unraveling the human genome will allow “the early phase of drug discovery involving the identification of safe and effective agents that modulate the function of human proteins” to be completed within the next 50 years, predicts Stuart L. Schreiber, scientific director of the Center for Genomics Research at Harvard University. He doesn’t rule out the possibility of a much shorter timeframe. “This logically flows from the appreciation that the human genome is finite, with only 150,000 or so genes. This means we will want roughly 300,000 small molecules, an activator and inactivator of each gene product, with effectively complete specificity within the context of a human patient,” Schreiber says. Some multifunction proteins will require effectors for each of the individual functions, but “housekeeping” proteins will not need any effectors. Schreiber estimates that about 5000 of these small-molecule effectors already exist and some are comparable to a gene knockout. However, he adds, “If even 3000 existing small molecules is an accurate estimate, we are already 1% toward that goal.”
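Schreiber's back-of-the-envelope figures are easy to restate; the short calculation below simply reproduces the arithmetic behind the estimates quoted above.

```python
# Restating the estimate quoted above (the figures are Schreiber's, circa 2000).
genes = 150_000                   # then-current guess at the human gene count
effectors_needed = genes * 2      # one activator plus one inactivator per gene product
existing = 3_000                  # his conservative count of suitable small molecules

print("small-molecule effectors needed:", effectors_needed)      # 300000
print(f"progress so far: {existing / effectors_needed:.1%}")      # 1.0%
```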


More than binders

One change already taking place is the integration of drug discovery and activities that used to be associated with drug development. Recently, pharmacology has been decoupled from the early stages of drug discovery. "There's been a swing toward discovering binders to targets," Livingston observes. "That doesn't necessarily correlate to a drug." "When I started in this business," says Livingston, "you went and made a compound. You gave it to a biologist, and he fed it to an animal. The animal did one of three things: It got better, it stayed the same, or it died. You got the results and went back and made another compound. It was slow, but at the end of the day you had a drug." Shiosaki says that early-stage drug discovery has moved away from animal models to in vitro cell assays to identify candidates. "One can go very well using just those tools and come up with compounds that are very potent and very selective. You then have invested many chemists' time—up to a year in some instances—and come up with one of these exquisitely potent compounds in a biochemical assay," she says. "Then the first time you take it into an animal model, you find out that the compound has absolutely no oral bioavailability, for instance. Then you're back to square one." However, Shiosaki says that scientists have become clever about engineering cell lines that can be used to assay various metabolic functions. For instance, cell lines can be used to mimic absorption by the gut or metabolism by liver enzymes such as cytochrome P-450. "A chemist might not start on a structure that was the most potent—although traditionally that was what was done, because that was the only data you had. Rather, a chemist would maybe work on a compound that had a fair-to-middling degree of potency but the right physical chemical properties, or at least the desirable ones, that would ensure a higher success in becoming a drug." Shiosaki says that finding a potent compound is only a tiny fraction of the making of a drug. "Finding a small-molecule drug that is potent and selective for a given target—say, an enzyme or a receptor—any good medicinal chemist worth their salt can pretty much do that nowadays. That's only 5 or 10% of the game. The other 90 or 95% is, how do you make a drug out of that?" Shiosaki also says that companies are more concerned about the characteristics that make compounds good drugs rather than simply good pharmacological agents. Compounds that are "hits" in a primary biochemical assay—say, 1000 compounds out of a 10,000-compound library that can inhibit a particular enzyme—are profiled with other assays. For example, a human colon cell line known as Caco-2 provides an indication of membrane permeability. In addition, the Caco-2 assay has some transporter and efflux mechanisms that mimic some of the pathways in the gut, Shiosaki says. In other tests, compounds are assayed against panels of metabolizing enzymes. "It becomes a triangulation game," Shiosaki says. "It's like a Venn diagram, where you have a subgroup of compounds, the 1000 compounds, that hit with a certain potency. Within this is another Venn diagram of what compounds are permeable across a membrane. And another one in which there's not significant metabolism by various liver enzymes. And another Venn diagram of compounds that cross the blood–brain barrier effectively. 
There should be an intersection, hopefully, of a reduced number of compounds that actually have the desired pharmacology, that is, they inhibit that enzyme but also have the desirable physical chemical properties that might make them more likely to succeed as a drug. It's really helping to reduce the attrition rate in terms of the going-up-the-wrong-tree phenomenon."
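Computationally, the Venn-diagram "triangulation" Shiosaki describes is a set intersection over assay results. A minimal sketch, with invented compound IDs and pass lists:

```python
# Invented compound IDs; each set holds compounds that pass one assay filter.
potent = {"C001", "C002", "C003", "C007", "C011"}        # hits in the enzyme assay
permeable = {"C002", "C003", "C005", "C011"}             # pass a permeability screen
metabolically_stable = {"C002", "C007", "C011"}          # low turnover by liver-enzyme panel
cns_penetrant = {"C002", "C011", "C019"}                 # cross a blood-brain barrier model

# Compounds worth advancing sit in the intersection of all the filters.
candidates = potent & permeable & metabolically_stable & cns_penetrant
print(sorted(candidates))        # ['C002', 'C011']
```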

The genome cometh
The fast-approaching completion of the human genome sequence brings with it incredible promise for the future. Our expanding knowledge of the genetic underpinnings of disease will help us find ways to cure and prevent disease. We may even find ways to improve on our basic architecture. Nanorobots injected into the body may rebuild broken-down or worn-out body parts with materials more durable than our own native cells and proteins. Other nanorobots may be able to diagnose and eradicate now-fatal diseases such as heart disease and cancer. These visions may sound like the stuff of science fiction, but they aren't as far off as they seem. In April 2000, the National Cancer Institute and the National Aeronautics and Space Administration formally agreed to work together to develop biomedical technologies that can detect, diagnose, and treat disease. Such devices are on the way, but we will see other advances first. Using the human genetic blueprint as a jumping-off point, we will find new and better targets for pharmaceuticals. Gene therapy will become a widespread reality. Protein therapeutics—delivered directly or by gene therapy—will represent a larger percentage of our pharmacopoeia.

With all the in vitro assays that are now available, Shiosaki says that the "big trophy in the sky" is an assay to help determine toxicity. RNA expression profiling shows promise in this area, although it won't be entirely predictive. In this technique, DNA arrays are used to detect the mRNA that is expressed in a given tissue before and after being subjected to a particular compound. Some genes are known to be upregulated in response to toxic compounds. "We're asking for various techniques that would help us prioritize or give us an indication that there may or may not be issues of toxicity associated with a compound," Shiosaki says, adding that much work is still needed to assess the validity of such technologies.

Protein therapeutics
“Will protein drugs be a prominent part of the pharmaceutical library?” asks Joffre Baker, vice president of discovery research at Genentech (South San Francisco, CA). “The answer over the next 10–20 years is ‘Absolutely!’ I’m more confident about that today than I was 15 years ago by a long shot.” Haseltine predicts that a “flood” of protein, antibody, and peptide Germ-line therapy drugs will hit the market in 10 years. A decade after that, he says, One of the stickier ethical issues for such drugs will constitute half of all new drugs introduced to the the future is the question of germ-line market. Baker forecasts that more than 80% of those protein drugs therapy—tinkering with the genes in will be antibodies. One use of antibodies will be as “molecular sperm and eggs. Although the sponges” to prevent protein–protein interactions. technical ability probably exists to Compared with small-molecule drugs, antibodies are very specific attempt germ-line therapy, former and are less likely to cause toxicity based on factors other than the Merck researcher C. Russell mechanism of action. “All sorts of orally available small molecules Middaugh doesn’t believe that the target all sorts of things, but they also toxify the liver and have drug– efficiency is high enough. “You need drug interaction problems. They interfere with cytochrome P-450, and an efficiency approaching 100% if that’s not mechanism-based toxicity. It’s like an innocent bystander you’re going to do human gene getting blown away,” comments Baker. “From the point of view of a modification. The ethical questions clean safety profile, antibodies are extremely attractive. You can are obvious, and they’re huge. I don’t design them to be incredibly specific with incredibly high affinity.” think the ethical questions are that great for a child that’s going to be Small molecules as tools born with muscular dystrophy,” an “You’re going to find that there’s an evolution toward more of a inherited disease marked by proteomics approach to genomics,” Livingston says. Proteomics is degeneration of muscle fiber. “If you the study of the entire complement of proteins produced in a cell or could do something about that, very tissue. “I think everybody agrees that proteomics is a more direct few would dispute that that approach way [than mRNA expression profiling] of looking at systems, if the has merit. The problem arises when tools existed to do that.” Livingston believes that a major challenge you do things to the germ line for in the near future is to provide “powerful tools” for proteomics. reasons other than very serious Livingston predicts that one of those powerful tools will be smallgenetic disease. One characteristic of molecule chemistry, using vast libraries of compounds. “If you find drugs that none of us likes but we all something that interacts specifically with a target or even with a class know exists is the potential for abuse. of targets, you wind up with something that, based on its structure, If you can do something, there will be not only defines the class of protein that’s critical in a given pathway people who will do it for other than but also is a tool for target validation. I think there’s huge power in purely therapeutic reasons.” small molecules to apply to this whole critical functional genomics problem,” Livingston says. In addition, a small-molecule approach to functional genomics brings scientists that much closer to identifying potential small-molecule therapies, he says. 
Greater integration of chemistry into biological research will allow biology to be studied in a “less reductionist way,” Klausner says. “The next dramatic set of tools that will change biological research will be the development, annotation, and incorporation of small-molecule probes—both as universal perturbational agents to query biology and as imaging agents—so we can, in real time and with some ability to ask meaningful questions, look at biological processes in their natural settings.” Klausner says that it will require a “sociological change” to incorporate chemistry in biological research. It will also require the development of informatics tools. Molecular analysis will foster better descriptions of disease states. “We still have, in many cases, relatively primitive descriptions of disease states as clinical entities and not molecular processes. Cancer is a great example. The ability to accurately and appropriately molecularly classify and describe diseases has been going on for quite a while, but it’s going to ramp up quite dramatically,” Klausner says.

Gene therapy
Gene therapy has received much negative publicity in the past year, but researchers still believe that it will play a major role in the future. "Gene therapy is simply a different way of delivering a protein," Baker points out. "There's every reason to believe that gene therapy will work. It's just a question of when it will work. I'd be surprised if over
the next 10 years we don’t see some protein therapeutics that are delivered via gene therapy.” He thinks that gene therapy will just be another part of the “toolbox for protein therapeutics.” C. Russell Middaugh, a professor at the University of Kansas (Lawrence, KS), spent 10 years at Merck Pharmaceuticals (three as a member of the company’s gene therapy group). Middaugh expects that the first major success will come in 5 to 10 years. Before that success will be realized, the vectors used to deliver genes must be improved. “Everyone feels that the ideal vector to deliver genes to cells has yet to be identified. There are sort of two views. One is that different targets will require different vectors. Therefore, work on many different systems is appropriate,” Middaugh says. “Probably a less frequently espoused view, but one that I think people secretly hold, is that there will probably be one or two efficacious vectors that will become widely used.” The vector currently considered the “leading candidate” is the adeno-associated virus, Middaugh says. “There have been some spectacular results in animal models that feature long-term expression of gene products at therapeutic levels. One of the big disappointments has been that the very spectacular success with DNA-based vaccines, in which a wide variety of animals were shown to generate significant immune responses using this technology, has not so far propagated into humans.” Gene therapy research is being conducted in academic research Doing the math laboratories, small- to medium-sized biotechnology companies, and Most chronic diseases have a variety large pharmaceutical companies. However, he says, “most of big of genetic factors underlying them. pharma is taking a wait-and-see attitude toward this. They don’t Different drugs should be used to really want to assume the major risks. They’re willing to fund some treat those different causes of of these smaller companies and help them, but they don’t want to disease, UC-San Francisco professor make any major commitments.” Wolfgang Sadee suggests. “Instead What sort of impact will gene therapy have on pharmaceuticals? “In of treating a million patients with the terms of longer time periods in human health, major impacts come best drug available at the present from things like vaccines and new preventive strategies, and perhaps time against this type of symptom, from more novel therapies such as gene therapy,” Middaugh says. you would break down the million “We know that you can take small molecules and inhibit enzymes in patients into 10 groups of 100,000 certain metabolic pathways and receptors on cells, and impact the patients and select the drugs therapeutic courses of disease. But I don’t think that small molecules accordingly. Or maybe break it down have the potential to impact disease the way that genetic therapy into groups of 10,000 patients. Then it does. Genetic therapy is going right inside the cell and producing the really does become individualized.” natural substances of the cell—the proteins—and intervening in a The groups would be defined by much more specific and much more potent fashion.” disease-susceptibility genes that Middaugh believes that before we will see success with gene therapy, determine the cause and likely course there must be a “mental transition in terms of the treatment of the of the disease. 
current vectors from ill-defined biological agents to well-defined “Can we develop the truly best drug, pharmaceutical agents—things that we understand at the very level of the blockbuster drug, and treat more the molecules that make them up. When that transition occurs, it’s patients?” Sadee asks. “Or, is it worth going to dramatically enhance gene therapy.” our while to generate 20 drugs for the “We’re getting there,” he says, “and with new developments in rapid same symptoms and then apply them DNA characterization, new developments in mass spectrometry to optimally? If we treat a larger patient characterize the vehicles, and the use of combinations of population, we are incurring more spectroscopic techniques to characterize the size and the shape of the easily the risk of serious side effects. vectors we make, we’re ultimately going to be able to define these It costs $500 million to make the drug. vectors in molecular detail and treat them just as we would small If we’re treating smaller patient molecules. To get the kind of therapeutic effect that you’d need for a populations, we still have to go commercial drug, [we must] transition from treating these things as through a drug approval process. It viruses or heterogeneous globs to treating them as well-defined still may cost almost $500 million to biological, biochemical, or biophysical entities.” treat a smaller population. That Harvard’s Schreiber predicts that gene therapy will be combined with means that the laws have to be molecule therapy in which the small molecule is used to regulate the changed or expectations have to therapeutic gene. “Somatic gene therapy will become common and change.” effective in the coming years, but germ-line therapy is an eventual reality as well,” he says. Schreiber expects that germ cells will be equipped with extra copies of cancer-fighting and anti-aging genes that can be turned on with a small-molecule drug when needed. 27
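Sadee's arithmetic is easy to make concrete. The sketch below is purely illustrative: the $500 million development figure and the group sizes come from his remarks above, while the calculation itself is a hypothetical back-of-the-envelope estimate, not a model from any source.

```python
# Back-of-the-envelope comparison of development cost per treated patient,
# using the figures Sadee cites above; everything else is a hypothetical
# illustration, not real economic data.

DEV_COST = 500_000_000   # rough cost to develop and approve one drug (USD)
POPULATION = 1_000_000   # patients presenting with the same symptom

def cost_per_patient(patients_treated: int, dev_cost: float = DEV_COST) -> float:
    """Development cost amortized over the patients one drug actually treats."""
    return dev_cost / patients_treated

print(f"One blockbuster for all {POPULATION:,} patients: "
      f"${cost_per_patient(POPULATION):,.0f} per patient")
print(f"Ten drugs for groups of 100,000: "
      f"${cost_per_patient(100_000):,.0f} per patient")
print(f"One hundred drugs for groups of 10,000: "
      f"${cost_per_patient(10_000):,.0f} per patient")
```

If each stratified drug still costs roughly as much to develop and approve as a blockbuster, the development cost per treated patient rises in direct proportion as the target group shrinks, which is the economic pressure behind Sadee's point that "the laws have to be changed or expectations have to change."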

Toward personalized medicine
Pharmaceuticals will be more personalized in the future, thanks to a growing field known as pharmacogenomics, which focuses on polymorphisms in drug-metabolizing enzymes and the resulting differences in drug effects. Slight genetic differences—sometimes as small as a change in a single base pair—can affect the way an individual metabolizes drugs. Pharmacogenomics will identify the patient population most likely to benefit from a given medication. “The whole industry is moving to a point where many people think fewer and fewer patients are going to get a given drug,” says Baker. “In fact, the patients who do receive that drug will respond better because of the ability to profile patients in terms of their gene expression profiles.” Shiosaki believes that all patients should have their own “gene on a chip.” Profiling a drug against a patient’s gene on a chip would indicate whether the medication would have an adverse or positive effect. Although pharmacogenomics will have its biggest effect in the decision to prescribe certain medications, it may also help determine which pharmaceuticals are developed. “If the target you’re looking for has a significant variability in the general population, that may not be a great target to go for,” Shiosaki says. “Already, that information is going to help you preselect what targets to work on.”
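Shiosaki's "gene on a chip" amounts to checking a patient's genotype against the polymorphisms known to change how a drug is metabolized before that drug is prescribed. The sketch below illustrates that kind of lookup; the variant names, the drug, and the effect labels are hypothetical placeholders rather than real pharmacogenomic data.

```python
# Toy pharmacogenomic profile: flag a patient's variants that are known
# (in this hypothetical table) to alter the effect of "drug X".

from typing import Dict, List

# Hypothetical knowledge base mapping variants to predicted effects on drug X.
DRUG_X_VARIANTS: Dict[str, str] = {
    "METAB1*2":     "slow metabolizer - reduced efficacy at the standard dose",
    "METAB1*3":     "ultrarapid metabolizer - elevated risk of adverse effects",
    "TARGET7-G42A": "altered target binding - likely non-responder",
}

def profile_patient(genotype: List[str]) -> List[str]:
    """Return predicted drug X effects for the variants this patient carries."""
    return [f"{v}: {DRUG_X_VARIANTS[v]}" for v in genotype if v in DRUG_X_VARIANTS]

# Example: one relevant variant and one the table knows nothing about.
patient_genotype = ["METAB1*2", "SOMEGENE-T15C"]
flags = profile_patient(patient_genotype)
print("\n".join(flags) if flags else "No known drug X-relevant variants found.")
```

In practice such a profile would be read from a genotyping chip and matched against curated tables for each candidate drug, which is the step Shiosaki expects to guide both prescribing decisions and the choice of targets worth pursuing.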

Finding the fountain of youth
Haseltine predicts that both regenerative and rejuvenative medicine will be important in the 21st century. "The first half of the century will be dominated by the use of human genes, proteins, antibodies, and cells to replace, repair, and restore what has been damaged by disease, injured by trauma, or worn by time." Haseltine estimates that 10–15% of all new drugs will be human genes, proteins, or antibodies within 10 years; within 20 years, he suggests, that figure will be 30–50%.

However, he thinks the second half of the century will be even better. "The most important advance in the 21st century will be the introduction of atomic-scale prostheses to repair and restore human body function," Haseltine says. "That will come in the second half of the century, and it will be tied to the most important revolution of the next century, which will be atomic-scale engineering."

Haseltine differentiates between regenerative and rejuvenative therapy in terms of the materials that are used. Regenerative therapy will use naturally occurring biologic substances, whereas rejuvenative therapy will use human stem cells and synthetic engineered substances. "We will replace our body parts with a more durable substance and extend human performance in almost any area," he says. "The fusion of atomic-scale engineering technology with our bodies will enormously enhance human performance.

"We have already made some of the fundamental advances that are needed, which are the isolation and characterization of a complete set of human genes and discovery of materials that are compatible with our bodies for regrowing organs. It's more a question of engineering and execution at this point," Haseltine says.

Tissue engineering—taking cells to restructure or rebuild damaged or congenitally defective tissues—is an important part of Haseltine's vision of regenerative therapy. "The first tissues are just now being reimplanted," he notes. "That will grow over the next 5 to 15 years to be a major business. We will begin with blood vessels, cartilage, bone, bladder, trachea, and skin, moving on to more complicated organs like the liver and kidney. Twenty years from now, a number of major organs should be reimplanted, and I think by 30 years from now more complicated organs including the heart and lung can be transplanted."

From transplants, Haseltine sees medicine moving toward the use of stem cells as medicine to rejuvenate the body. "That requires very fundamental breakthroughs in our understanding of how stem cells arise during embryogenesis, what controls them, and how they can be used as medicine. I don't see stem cells being used as a major form of medicine for at least 10 years, and I don't see the main implication for another 30 or 40 years because of a wide variety of difficulties." However, he admits that stem cell research may progress more rapidly than he suspects.

A particular stumbling block is the ethics of embryonic stem cell research. Human embryos must be destroyed to harvest the cells, and there are concerns in some circles that such research could encourage abortions. However, stem cells can be obtained from embryos left over from fertility treatments.

Other people also see regenerative medicine playing a role in the future. "If you consider heart attacks, there is dying muscle tissue. The extraordinary excitement was about growing new blood vessels that would supply the remaining muscle. What was overlooked was that the muscle has degenerated and these new blood vessels don't regenerate the muscle tissue," Sadee says. "In regenerative medicine, we would be able to take stem cells and train them to be cardiac muscle cells and replace the muscle tissue. I think stem cells and anything related to them are going to be of extraordinary importance in the future."

Schreiber sees stem cells being used in combination with small molecules. "Advances in stem cell research, in combination with advances in our ability to discover small molecules that activate specific differentiation pathways, will allow replacement organs to be grown in culture, beginning with the patient's own stem cells or genetic material."

Like Haseltine, Klausner also sees nanotechnology playing a role in medicine. "Ultimately, what I think is a fantastic challenge is to link molecular sensing technologies with nanotechnology for the idea of using such molecular machines to remotely sense molecular changes, know what they are, and know where they are. One can ultimately imagine the incorporation in a single molecular platform of sensing, signal generation, external decision-making, and local therapy. I don't think what we're talking about violates laws of physics as we understand them, but it is very far off in terms of what we're capable of doing."

On to the future
One question is whether the future will bring true cures or simply more palliative therapies. “The difference between palliation and cure relates to understanding the effect of intervening at a molecular target with a phenotype. It’s going to require putting the system together, not just having the components,” Klausner says. “We need to understand the relationship between the components and the disease processes—the processes that lead to developing the disease, the symptoms of the disease, and the fundamental nature of the disease itself.” Beyond that, Klausner asserts that palliation and cure lie along a continuum. “Our goal should be to have the most effective approaches to preventing or curing disease that we can. Whether that comes through what we might call palliation or fundamental cure, my feeling is, ‘Whatever works.’ I think you should move toward what’s most effective and affordable and accessible to everyone.”
