It is fashionable to start articles on drug discovery with a statement such as “Presently it takes 10 years and $3 billion to $10 billion to find and validate a new drug.” At that point I say “Stop! What is wrong with this picture?” The true picture is even bleaker: in a recent Internet piece, an alternate metric told us that “For every $1 billion invested in drug development, the number of drugs approved has halved every year since the 1950s; and approximately 70 percent of all the prescription drugs sold in the U.S. today are generics.” In the last several decades, huge increases in investment have yielded ever-decreasing output. Presently, only four percent of drugs in any company’s pipeline make it to market!
I have long held the view that we produce few drugs at a tremendous investment of time and money because we have no real idea of what goes into a working, non-toxic drug. The present view seems to be that “all of the low-hanging fruit has been picked,” and we must now contend with complex diseases caused by a variety of genes with differing triggers, as well as very heterogeneous cellular populations.
The emphasis these days seems to be on genomics and immunology, but these are only a small part of what goes on in diseases such as cancer, and can address only two or three parts of what constitutes a working drug. Thus, we grimly accept the aforementioned dictum (I call it the 10/10 fallacy) and slog away for long periods doing the safest science possible. Large companies have no scientific solution to this, so they resort to short- and medium-term business methods, i.e., buying a well-selling drug (or the whole company) from a competitor, or simply repurposing an old drug. We know that we have mountains of data in a variety of databases describing many interrelated biological processes. We must now stitch these together and make sense of the results. We must also acquire expertise in the design of complex experiments, as well as master the techniques necessary to perform them. Clearly, our model of the drug development process must be reevaluated and changed.
It should be recognized at this point that we are dealing with many, AND NOT JUST A FEW, biological systems and pathways, and must address this complexity with new and appropriate management systems. This will also require a change in the mindset of research scientists, as they will have to interface with a wide variety of colleagues while focusing on their own areas of expertise. The following proposal specifically addresses my own research in cancer molecular biology, but may be generalized to all drug development.
It is commonplace for two (occasionally three) laboratories to collaborate on specific research projects. I recognize the need for many more laboratories to collaborate, as the number of systems and pathways involved in most cancers is large and complex. Each laboratory may have some expertise in a number of special techniques, but outstanding knowledge and expertise in only one to three. Thus, the following is proposed:
- At least 20 (and probably more) laboratories must be selected for their known expertise in both specific techniques and pathways. Labs may be self-selected and included given their qualifications. Collaborators may be selected across not only academe, but industry and government as well. A specific disease to work against is selected in the next step.
- A roadmap (such as the abbreviated outline below) must first be created, discussed and augmented by all parties according to their expertise, to be used as the guide for the overall project. The roadmap will be used by the project administrator to schedule work in each unit and to ensure timely and complete communication among the appropriate parties. Each laboratory will then finalize its contribution(s) as to techniques and pathways. When timelines are estimated (realizing that, in drug discovery, they must be fluid, to a point) and participants agree on their roles and the scheme, work may begin. In some areas, the work may begin immediately; in others, target fragments must be generated and pathways elucidated as fully as possible before the work begins. In each case, it is expected that the individual laboratories will be best able to handle problems in their own areas, consulting with others as needed. A valuable benefit of all these specialized laboratories is that each will be responsible for evaluating the mountain of research papers in its own area, especially the many highly obscure tracts in computational biology.
- When the preliminary work is completed, and the knowledge gained is applied to generating the proper conditions (for molecules, fragments, genes, pathways and systems), and especially the inter-correlations among these, the proposed “treatments” may be tested first in several in vitro systems, then in a small number of rodents (or any animal known to respond to the given disease much as humans do), and finally in human volunteers. (That is why so much attention is given to toxicology up-front, and after every tweak.)
What follows is but the briefest of outlines and will be enhanced and refined throughout the project.
Epigenetic effects/fragment generation and characterization –> gene expression studies in target –> data corrections to allow for unobserved variables –> effects of previously undiscovered genes and “apparently” non-coding RNAs –> effects of miRNAs & lncRNAs –> copy number associations with gene expression –> data reduction through statistical methods –> further gene analysis to identify driver/controller genes –> recalculations –> extensive review and definition of pathways and systems with emphasis on structures and intercorrelations –> predictive modeling of complex systems –> crosstalk between signaling pathways –> biological feedback to effects of the proto-drug –> evaluation of efficacy
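To make the “data reduction through statistical methods” step concrete, here is a minimal sketch using principal components analysis on a synthetic gene-expression matrix. PCA is only one standard choice among many, not a prescription from the outline, and the data here are random numbers standing in for real measurements.

```python
# Toy sketch of statistical data reduction on a gene-expression matrix.
# PCA via the singular value decomposition: project high-dimensional
# per-sample gene measurements onto a few principal components.
import numpy as np

rng = np.random.default_rng(0)
expr = rng.normal(size=(30, 200))   # 30 samples x 200 genes (synthetic)
expr -= expr.mean(axis=0)           # center each gene across samples

# Right singular vectors (rows of Vt) are the principal axes.
U, s, Vt = np.linalg.svd(expr, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

k = 5                               # keep the top-5 components
reduced = expr @ Vt[:k].T           # (30, 5): each sample in 5 dimensions
print(reduced.shape)
```

In practice the number of retained components would be chosen from the variance-explained profile, and the reduced coordinates would feed the downstream driver-gene and pathway analyses.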
It is apparent that this is only the briefest summary of what is necessary, but certainly not sufficient. Also, the order of the steps needs to be examined and modified by input from the appropriate laboratories. Even more pertinent is the insertion of toxicology studies at several points in the map, the first insertion to be as early in the process as possible.
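The roadmap above, with toxicology inserted at several points, can be sketched as a dependency graph that a project administrator could use to schedule work across laboratories. The step names are abbreviated from the outline; the linear chain and the checkpoint positions are illustrative assumptions, since the actual dependencies would be set by the participating laboratories.

```python
# Sketch of the roadmap as a dependency graph, with hypothetical
# toxicology checkpoints inserted after two of the steps.
from graphlib import TopologicalSorter

steps = [
    "fragment generation/characterization",
    "gene expression studies",
    "correction for unobserved variables",
    "miRNA/lncRNA effects",
    "copy number associations",
    "statistical data reduction",
    "driver gene identification",
    "pathway/system definition",
    "predictive modeling",
    "signaling crosstalk",
    "proto-drug feedback",
    "efficacy evaluation",
]

# Linear chain: each step depends on its predecessor.
graph = {later: {earlier} for earlier, later in zip(steps, steps[1:])}

# Insert toxicology checkpoints (illustrative anchor points).
for anchor in ("fragment generation/characterization", "predictive modeling"):
    graph[f"toxicology check after {anchor}"] = {anchor}

# Any valid schedule respects the dependencies.
order = list(TopologicalSorter(graph).static_order())
print(order[0])
```

A real project plan would replace the chain with the true dependency structure, letting independent branches (and their toxicology checks) run in parallel across laboratories.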
The notion of massive collaborations across laboratories and industries, with government included, is not new. The massive efforts developing the atomic bomb during WW II effectively demonstrated what could be done by a large number of dedicated and brilliant individuals. It badly needs to be done again.
Now, however, it needs to be done to attack a massively complex problem that is far more complicated than any “physical” system. The question now is, are we ready to do this?
John Wass is a statistician based in Chicago, IL. He may be reached at editor@ScientificComputing.com.