Uncle Sam: Scientist
During the last century, America soared to the top of the world in science and technology. The United States outstripped all other countries in the number of science-related Nobel prizes awarded, in bringing new biotechnical products to the market, and in the amount of money spent on basic research.
Here, at the beginning of the 21st century, however, Uncle Sam’s position of strength is in peril, hamstrung by the triple whammy of reduced federal funding for basic research, a flagging biotech industry, and a public education system so rife with inadequacies that it is failing to spark young people’s interest in careers in the sciences.
Ironically, at the same time that the United States is trying to hold on to its frontrunner status, countries like China and Singapore are muscling up and making massive national commitments to support research and technology. These nations are banking on a concept that some now see as the backbone of economic wealth in the new century: innovation. Investment in innovation creates new industry, strengthens the national education system and improves the well-being of the citizenry at large.
“The genius of the American system is the ability to create new things based on new knowledge,” says Ellen Wright Clayton, M.D., J.D., professor of Pediatrics and Law at Vanderbilt University.
“And if we don’t have that, then I think it really affects the vitality of the country. I think that it really is based on creativity, and without that, the heart’s gone.”
J. Michael Bishop, M.D., chancellor of the University of California, San Francisco (UCSF), is optimistic that, given its past, the nation will rise to the challenge of its future – in biomedicine as well as the equally competitive areas of engineering and computer science.
“For the moment, we still have the front rank in the world in biomedical science,” says Bishop, who shared the 1989 Nobel Prize in medicine with Harold Varmus, M.D., for their discovery of oncogenes.
In addition, Americans have shown unique insight by forging alliances between academicians and businesspeople, translating fundamental research findings into products that are useful at the bedside and in the marketplace. Plus, the United States has fostered an enclave of venture capitalists who helped create a thriving trade in biotechnology.
“No other country has had a community of investors that even vaguely resembles the biotech venture capitalists in the U.S.,” Bishop says. “We’ve dominated the scene.”
Americans tend to be a practical people, and their scientific history reflects that. Prior to the 19th century, American science was famously empirical and applied, unlike the science of most of Europe, which was grounded largely in theory and formal experimentation.
People in the United States applied their ingenuity to the high-tech endeavors of the day: the steel industry, the building of bridges, roads and canals, and the fight against health problems caused by contaminated water systems. Yet while the physical and engineering sciences were booming, Americans lagged behind their European colleagues in the medical sciences, such as biology and physiology. Medical schools served as certificate factories, requiring only four months of study or less, and most were unaffiliated with universities or hospitals.
In the early 20th century, Abraham Flexner changed that scenario. Charged by the Carnegie Foundation for the Advancement of Teaching to evaluate all the medical school programs in the country, in 1910 he issued the “Flexner Report,” demanding that medical education require a rigorous course of study in the basic sciences.
In response, the Rockefeller and other philanthropic foundations donated $154 million to improve the quality of American medical education. These unprecedented dollars from the private sector spurred states to begin channeling money into universities, launching an era of governmental, institutional and philanthropic support for biomedical science that remains in effect today.
Also in the early 1900s, the agency that was the precursor to the National Institutes of Health (NIH) moved from strictly focusing on infectious and contagious scourges like cholera and yellow fever, to developing vaccines and antitoxins, and then to investigating noncontagious problems, like pellagra, that were caused by dietary deficiencies.
Some of these needs arose as immigrants, escaping wars, oppression and poverty overseas, poured onto American shores, causing public health issues to bubble up to the surface of the American consciousness. Many of these immigrants also brought with them something else – a background based in scientific theory.
Explains Cyrus Mody, Ph.D., an assistant professor at Rice University who teaches about the history of innovation and technology, “During the 1920s and ’30s, some of those immigrants, and especially the children of immigrants, really started to penetrate into the American scientific elite.”
For example, the three American physician-researchers who were primarily responsible for the polio vaccine were of immigrant Jewish stock: Albert Sabin was born in Russia, Hilary Koprowski was from Poland, and Jonas Salk was born in America to Russian immigrant parents. Spanish-American biochemist Severo Ochoa won the 1959 Nobel Prize for his work on the synthesis of RNA. French-born Nobel laureate Andre Cournand came to the United States in 1930 and later helped develop cardiac catheterization, a technique that revolutionized treatment of heart disease.
Essentially, as a new pool of intellectuals began establishing laboratories in the U.S., the quality of American biomedical science quickly rose to world-class levels.
The Depression almost destroyed this progress when the endowments for many foundations supporting research crashed with the stock market. Thanks to the efforts of Louisiana Senator Joseph E. Ransdell, Alabama Senator (Joseph) Lister Hill and others, the U.S. government stepped in and began funneling money into the NIH to support basic scientists working on fundamental medical research, and to prevent America from losing its edge in this arena.
The aftermath of World War II solidly established the NIH as the major player in American biomedicine. For the first time, leaders realized that federal investments in basic research could actually shore up a struggling economy. Largely under the directorship of James A. Shannon, M.D., Ph.D., the NIH budget expanded from $8 million in 1947 to more than $1 billion in 1966. What’s more, those dollars had a direct impact on state revenues, because they were spread to 128 academic medical centers scattered across the country.
“Research dollars flow back into the local community with a five-fold multiplier effect,” notes Heidi Hamm, Ph.D., chair of Vanderbilt’s Department of Pharmacology.
For the next 35 years, various presidents and legislators continued to grow the NIH budget, convinced of the parallels between personal health and economic health, between innovative research and international leadership.
The hallmarks of American research have been its lack of hierarchy and its encouragement of independence. Grants submitted by young investigators are evaluated on a level playing field with grants submitted by senior scientists, even by Nobel laureates. Researchers are allowed to pursue their personal areas of interest, and even to change course if they so choose.
Also, they are not limited to government funding, as a number of private foundations, notably the Howard Hughes Medical Institute (HHMI), provide substantial support for basic biomedical research. The career trajectory of Harvard’s Douglas Melton, Ph.D., illustrates this distinction.
An HHMI investigator, Melton was studying the developmental biology of frogs when his infant son was diagnosed with type 1 diabetes. Anxious to address the needs of his child, Melton turned his attention to stem cell research. He later approached the scientific review team at HHMI, explaining why and how he wanted to switch directions and study pancreatic beta cells. The reviewers examined his proposal and agreed to continue his funding. Melton is now one of the leading stem cell researchers in the world.
The advantage to this system, which is rare worldwide, is that it’s a tremendous motivator to young scientists, says Steven McKnight, Ph.D., professor and chair of biochemistry at the University of Texas Southwestern Medical School. “If they succeed, they get all the credit for their discoveries,” McKnight says. “The flip side is that if they fail during four, six or eight years in the lab, they don’t get tenure and they lose their jobs. Job security depends upon success.”
By almost any measure, this system of rewards has bred excellence. Americans have surpassed their European and Asian counterparts in the number of breakthrough discoveries, an advantage that continued throughout the George H.W. Bush and Bill Clinton presidencies.
The trend of increased federal support came to an abrupt halt, however, during the George W. Bush administration, which funded the NIH at a flat level, amounting to a net loss in research grants once inflation is factored in.
Scientists complain that the flat NIH budget is squelching innovation and slowing progress. In the neurosciences, for example, the funding crunch has essentially squandered the momentum built during the 1990s, the “decade of the brain,” asserts Randy Blakely, Ph.D., director of the Vanderbilt Center for Molecular Neuroscience.
During this period, “we saw the convergence of the disciplines of molecular biology, genetics, biochemistry, brain imaging, developmental biology and translational work,” Blakely says. “We got up to speed to really tackle most, if not all of the major brain diseases. Then we geared down.”
This has happened at a most inopportune time, he continues. The American population is aging, which automatically puts more people at risk for neurological diseases of the elderly, like Alzheimer’s and Parkinson’s. Also, soldiers returning from the wars in Iraq and Afghanistan with neurological problems and traumatic brain injuries are taxing the boundaries of current medical knowledge.
Inadequate funding of research ultimately is a drain on the economy, scientists insist.
“Biomedical science is a huge part of the economic engine of this country,” says Jeff Balser, M.D., Ph.D., dean of the Vanderbilt University School of Medicine and associate vice chancellor for Health Affairs. “Science is fueling start-up companies and industry because it’s discovered, published. It’s information available that those companies use to do their next thing.”
NIH is the powerhouse for the discoveries necessary for the development of new drugs. “If we lose our superiority in that area, it’s going to be one more area that we’re not exporting in,” Oates argues. “We’ll have to go back to making shoes and sending them to China.”
Jack Dixon, Ph.D., HHMI’s chief scientific officer and former dean for Scientific Affairs at the University of California, San Diego, agrees.
“I think it’s essential to be on the front-end of discovery,” Dixon says. “The molecular biology revolution, if you want to call it that, is based upon discoveries that happened in academic labs across the country. We have dozens and dozens of drugs and compounds on the market today that wouldn’t have happened if we hadn’t had those early discoveries taking place in the United States.”
This view is not universally shared. Slipping in science should not affect the nation’s bottom line, some observers argue, because we can simply exploit discoveries made by others.
According to Christopher Hill, Ph.D., professor of public policy and technology at George Mason University, companies as diverse as Google and Wal-Mart have become wealthy not by applying research conducted in the United States, but “by structuring human work and organizational practices in radical new ways.”
In an article published in the policy journal Issues in Science and Technology in March 2007, Hill maintained that the United States has already begun to move into what he calls a “post-scientific society,” one in which “the leading edge of innovation … whether for business, industrial, consumer, or public purposes, will move from the workshop, the laboratory, and the office to the studio, the think tank, the atelier, and cyberspace.”
Harvard economist Richard Freeman, Ph.D., agrees. He told The New York Times in April 2008 that Americans should worry less about their nation’s research status and more about “developing new ways of benefiting from scientific advances made in other countries.”
While it is true that, especially in this age of ubiquitous electronic communications, scientific advances made in one country are almost instantaneously shared with the rest of the world, there are still strong arguments in favor of retaining a leadership position in research.
For one thing, dating back to ancient Greece, the societies that have had a transformative impact on the world have been characterized by a lively and well-supported scientific enterprise. “It’s not a prediction,” says Melton, “it’s sort of a fact, based on history, that if we don’t invest in science and education, we will become a second-rate country.”
For another, “homegrown” biomedical research can be more readily applied to solving the unique health challenges that emerge from the nation’s ethnic and genetic melting pot.
Strength in diversity
The revolution now driving our understanding of the genetic underpinnings of disease means that for the first time we may be able to solve some of the age-old riddles confronting portions of our population: why cystic fibrosis, for example, occurs more commonly among people of Northern European descent, why Pima Indians have such a high rate of diabetes and obesity, and why African-American women are more likely to die of breast cancer than their white counterparts.
Studies of diverse minority groups also can advance an entire field of inquiry. For example, the BRCA1 and BRCA2 breast cancer susceptibility genes were identified by researchers trying to find out why Ashkenazi Jews had a higher risk of developing certain breast and ovarian cancers. By isolating the genetic, environmental and dietary variations among America’s ethnic subsets, scientists can better understand how certain diseases progress in the greater population.
“The idea that fundamental research can go on elsewhere, and then we can optimally translate it into our healthcare system I think is an illusion,” argues Clayton, who directs the Vanderbilt Center for Biomedical Ethics and Society.
“It’s like the way most people use a computer … They more or less can get stuff done, but because they don’t have a clue how it works, they don’t maximize the extent to which they use it.”
“When you have individuals who have been affected by these diseases in our educational pipeline, when you have people of color, Asian Americans, African Americans, Hispanics, etc., some of them are going to be interested in conducting research in these areas,” Hill argues. “When bright, inquisitive men and women from a wide variety of backgrounds attack these problems, we have a much better chance of getting to the answers that much faster.”
As powerful as these arguments may be, they may not persuade, given all of the other economic and security challenges facing the country. But wouldn’t it be interesting if the answers to our flailing economy, and the paths to world peace, were found not in the halls of Congress or on the floors of Wall Street, but rather in the basic science laboratories of the United States?
“I’m hopeful that the collapse of the U.S. financial industry will make it clear to the leadership and to the American public that science and technology have been and must be the source of our future economic health,” says Bruce Alberts, Ph.D., professor of biochemistry and biophysics at UCSF and past president of the National Academy of Sciences (NAS).
When the human genome was sequenced, Alberts and others realized that they could best serve the greater good by sharing the newfound information rather than hoarding it. And so it has proved: releasing the human genome map to the world has launched new exploration into human genetics and into targeted therapies for treating individual patients.
“The scientific community may still be the only truly global community,” says Bishop, UCSF’s chancellor. “And one of the greatest satisfactions for me is being a part of that community – a community that is by and large unselfish, by and large sharing. One where we’re working towards a common purpose across nations, one in which we’re speaking the same language, not only linguistically, but in terms of ethos and ambition.”
Bishop concedes that his view is optimistic. “But (we) believe that since science is the strongest and most cohesive global community in the world, it can be the catalyst for international cooperation and peace among nations.”