AS WE ALL KNOW, pharmaceutical drug development is both risky and rewarding. Part of its inherent risk could probably be hedged, with considerable therapeutic benefit (and to the joy of the stockholders). That would require a rational approach to the engineering of the molecular agent. Common sense (the least common of the senses) tells us that a compound with the right target affinity and a tightly controlled selectivity should be better equipped to withstand the long-term attrition along the drug-discovery pipeline. Yet, in spite of claims to the contrary, drug discovery as we know it today can hardly be called rational. High-throughput screening and serendipity clearly dominate the scene and have the upper hand when it comes to identifying promising chemical leads. This may well be the culprit for the state of disarray that plagues the pharmaceutical industry. Just think of the go/no-go decisions made after hundreds of millions (if not billions) of dollars have been spent evaluating drug candidates, or of the “patent cliffs” that often throw companies into a state of panic and trigger painful “corrective decisions”. There are, of course, compelling justifications for placing chance and randomness at the methodological forefront: humans are highly complex multiscale systems with insufficient or inadequate annotation, and unfathomable side effects with life-threatening consequences can and do occur.
This state of affairs (on which we hopefully all agree) raises the question: assuming the lack of systems-level annotation on humans could be overcome to validate new targets in some disease-related contexts, what hurdles still prevent drug design from being rational? The answer is simple yet painful: our lack of understanding of the biophysical fundamentals that govern drug-target associations and their specificity. To add substance to this discussion, let us focus on water-soluble proteins as drug targets, and assume that the therapeutic benefit arises from selective inhibition of a biological function carried out by the targeted protein. In this context, hard as it is to admit, we must come to terms with our poor understanding of the physics of protein-ligand interactions. This is so mainly because we stubbornly ignore the fact that we are not actually targeting proteins but the interfaces they make with water, or more generally with the aqueous medium. Thus, if it is ever going to be rational, drug design should purposely target the protein epistructure rather than the protein structure. Properly understood, the interfacial features defining the epistructure can steer new and successful molecular engineering. A case in point is the concept of the dehydron. A dehydron is a structural deficiency in the form of a solvent-exposed backbone hydrogen bond that generates epistructural (interfacial) tension. This tension, in turn, can be relieved by a purposely engineered drug or ligand that displaces interfacial water upon binding to the target protein. Furthermore, since the dehydron pattern is even more distinctive than the protein structure itself, targeting dehydrons with the so-called “wrapping technology” affords an unprecedented level of control over drug specificity.
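To make the dehydron notion concrete, here is a minimal sketch of how an under-wrapped backbone hydrogen bond might be flagged computationally. It assumes the commonly cited operational definition from the wrapping literature: count the nonpolar (carbonaceous) groups inside two desolvation spheres of roughly 6.5 Å centered at the alpha-carbons of the hydrogen-bonded residues, and call the bond a dehydron when that count falls below a threshold of about 19. The `Atom` record, the exact radius and threshold, and the toy coordinates are illustrative assumptions, not the author's actual implementation.

```python
from dataclasses import dataclass
import math

DESOLVATION_RADIUS = 6.5  # Angstroms; typical sphere radius in the wrapping literature
WRAPPING_THRESHOLD = 19   # typical cutoff: fewer nonpolar groups => under-wrapped bond

@dataclass
class Atom:
    element: str
    nonpolar: bool   # True for a carbonaceous (CHn) group not bonded to N or O
    xyz: tuple       # (x, y, z) coordinates in Angstroms

def dist(a, b):
    """Euclidean distance between two coordinate triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def wrapping(ca_i, ca_j, atoms):
    """Count nonpolar groups inside either desolvation sphere, i.e. within
    DESOLVATION_RADIUS of the alpha-carbon of residue i or residue j."""
    return sum(
        1 for a in atoms
        if a.nonpolar and (dist(a.xyz, ca_i) <= DESOLVATION_RADIUS
                           or dist(a.xyz, ca_j) <= DESOLVATION_RADIUS)
    )

def is_dehydron(ca_i, ca_j, atoms, threshold=WRAPPING_THRESHOLD):
    """A solvent-exposed backbone hydrogen bond is a dehydron when its
    wrapping falls below the threshold."""
    return wrapping(ca_i, ca_j, atoms) < threshold

# Toy example: two alpha-carbons 5 Angstroms apart, surrounded by one
# nearby nonpolar group, one polar atom, and one distant nonpolar group.
ca_i, ca_j = (0.0, 0.0, 0.0), (5.0, 0.0, 0.0)
protein_atoms = [
    Atom("C", True,  (1.0, 0.0, 0.0)),   # nonpolar, inside a sphere: counts
    Atom("O", False, (1.0, 1.0, 0.0)),   # polar: never counts
    Atom("C", True,  (20.0, 0.0, 0.0)),  # nonpolar but outside both spheres
]
```

A real implementation would extract backbone hydrogen bonds and carbonaceous groups from a parsed PDB structure; the point of the sketch is only that "wrapping" reduces to a local count around the bond, so dehydron detection needs nothing beyond coordinates.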
Understanding what needs to be targeted will surely reinforce the chemical link in the drug-discovery pipeline, but it is not likely to get the industry out of the usual quagmires. Solving the fundamental problem of identifying the molecular feature worth targeting brings about another problem: how do we expand and parse chemical space to generate leads capable of interacting optimally with the intended molecular features? This problem was addressed briefly by the author in this lecture.
The generation of a suitable chemical space from which to draw therapeutically relevant designs has hampered success more often than we care to admit. Early on, researchers turned to scaffolds optimized by Mother Nature over billions of years of evolutionary tinkering. These natural products were abruptly deemed obsolete when the paradigm of combinatorial chemistry gripped the industry. Be that as it may, combinatorial chemistry did not live up to its promises. This should not come as a surprise: it hardly seems wise to throw away billions of years of evolutionary experimentation together with the vast empirical knowledge thus accrued and embodied in natural designs. To enrich chemical space, and partly to recreate the unfathomable diversity of natural-product solutions, researchers are now inclined to exploit fragment-based lead discovery. Within this paradigm, chemical diversity is achieved by recombining fragments through synthesized covalent linkages. A major hurdle to making successful drugs this way is that fragment-based designs cannot be fine-tuned to the level of sophistication required to satisfy the stereospecificity of biological matter.
Now that we know what molecular features to target, a new paradigm is badly needed in the chemical arena to generate scaffolds that fulfill the new targeting roles. Yet, the novel chemical space must enable the emerging drugs to effectively meet the stereospecificity demands of the partnering biological molecules. This may prove quite a daunting task but certainly one worth pursuing, especially if the pharmaceutical industry ever feels the need to be rescued by fundamental science.
Review on the wrapping technology in drug design by Harvard Professor George Demetri
Critical Survey on Wrapping Technology in Nature Reviews Drug Discovery
Ariel Fernandez: Transformative Concepts for Drug Design: Target Wrapping. Springer-Verlag, Berlin, Heidelberg (2010). ISBN 978-3-642-11791-6