Why do drugs cost so much to develop … and to buy?
The discovery of a potential drug target—a receptor involved in depression, for example—is just the first step. A pharmaceutical company may have to wade through several hundred thousand compounds just to find a few that act on the receptor.
The yield is tiny: only a few hundred will show sufficient activity to proceed with pre-clinical testing in cell cultures and animals. Of these, only a handful will meet the criteria for human testing:
They must be absorbed by the body and reach their target tissue at a concentration high enough to do the job. They must then be eliminated from the body efficiently enough that they don't build up to toxic levels.
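To make the attrition concrete, here is a rough sketch of the funnel in Python. The specific counts (300,000 screened, 300 active, 5 clinical candidates) are illustrative assumptions chosen to match the article's rough ranges ("several hundred thousand," "a few hundred," "a handful"), not exact figures:

```python
# Illustrative attrition funnel; counts are assumptions, not data.
screened = 300_000    # compounds screened against the target
active = 300          # show sufficient activity for pre-clinical testing
candidates = 5        # meet the criteria for human testing

print(f"hit rate:       {active / screened:.2%}")            # share surviving screening
print(f"candidate rate: {candidates / screened:.4%}")        # share reaching human testing
```

Even with generous assumptions, roughly one compound in a thousand survives screening, and only a few in a hundred thousand reach human testing.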
Five years of work may be required to get through this pre-clinical stage. Then it’s on to human testing, which is conducted in three “phases.”
In phase I, the compounds are tested for safety in up to 100 healthy volunteers. Phase II involves further safety and efficacy testing in 100 to 500 patient volunteers who have the condition the compounds are meant to treat.
In phase III, the potential drugs are given to thousands of patient volunteers to confirm effectiveness and appropriate dosage, and to detect adverse reactions.
Clinical development, from phase I through phase III, can take eight to 10 years to complete, and may cost $200 million to $350 million—for each of the candidates that enter clinical testing.
Yet, on average, only one of every five compounds tested in humans will satisfy the increasingly stringent requirements and become a new drug.
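Reading these last two figures together shows why the sticker price is so high: if each clinical candidate costs $200 million to $350 million and only one in five succeeds, the clinical-stage cost attributable to each approved drug is roughly five times the per-candidate figure. A back-of-envelope sketch (allocating all failures' costs to the single success is a simplifying assumption):

```python
# Implied clinical-stage cost per approved drug, using the article's
# $200M-$350M per candidate and 1-in-5 success rate. Charging the four
# failures to the one success is a simplification for illustration.
cost_low, cost_high = 200e6, 350e6    # per-candidate clinical cost, USD
candidates_per_approval = 5           # only 1 in 5 becomes a drug

low = cost_low * candidates_per_approval
high = cost_high * candidates_per_approval
print(f"implied cost per approved drug: ${low/1e9:.2f}B to ${high/1e9:.2f}B")
```

By this simple accounting, every approved drug carries $1 billion to $1.75 billion in clinical costs, before counting the years of pre-clinical work behind it.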