Life scientists increasingly rely on advanced computation to drive their research. Two prominent examples of this trend will be presented this summer at the ISC High Performance conference, which will feature a five-day technical program focusing on HPC technologies, their application in scientific fields and their adoption in commercial environments. On Tuesday, July 14, Manuel Peitsch, VP of Biological Systems Research at Philip Morris International and Chairman of the Executive Board of the Swiss Institute of Bioinformatics, will chair a session on computational methods used to understand health and disease, while on Wednesday, July 15, Juelich Supercomputing Centre director Thomas Lippert will direct a session on the work being done under the EU's Human Brain Project (HBP). In this article, Manuel Peitsch and Thomas Lippert provide exclusive previews of what each of these sessions will cover.
SUPERCOMPUTING AND THE HUMAN BRAIN PROJECT — A 10-YEAR QUEST
Thomas Lippert, Ph.D.
The goal of the FET Flagship Human Brain Project (HBP), funded by the European Commission, is to develop and leverage information and communication technology to foster a global collaboration on understanding the human brain through models and data-intensive simulations on supercomputers. These models offer the prospect not only of a new path toward understanding the human brain, but also of completely new computing and robotics technologies.
The ISC High Performance conference (ISC 2015) offers a discussion platform that will accompany the HBP throughout its duration. This year's HBP session aims to report on the first 18 months of this global collaborative effort and to highlight some of the underlying technologies, such as the Web-accessible portal and the information and technology platforms, as well as aspects of the neural network simulations employed by the project and beyond. We especially look forward to focusing on the design of the high performance computing platform, which will make a major contribution to further improving the simulation of neural networks.
Launched in October 2013, the HBP is one of the two European flagship projects foreseen to run for a 10-year period. It aims to create a European neuroscience-driven infrastructure for simulation and big-data-aided modeling and research. The HBP research infrastructure will be based on a federation of supercomputers contributing to the specific requirements of neuroscience in a complementary manner. It will encompass a variety of simulation services created by the HBP, ranging from simulations at the molecular level, through the synaptic and neuronal levels, up to cognitive and robotics models. The services and infrastructures of the HBP will successively include more European partners and will be made available step-by-step to the pan-European neuroscience community.
Accordingly, this year's HBP session is divided thematically into three sections: the HPC platform, simulation activities and neuromorphic computing approaches. It will be complemented by an external speaker from the new Cortical Learning Center (CLC) at IBM Research, who will present approaches being pursued outside the project.
One of the talks is dedicated to the Collaboratory, a service-oriented Web portal serving as the central, unified access point to the six HBP platforms (neuroinformatics, brain simulation, HPC, medical informatics, neuromorphic computing and neurorobotics) for researchers inside the flagship project. Another talk will highlight the UNICORE middleware, which plays an important role in the HPC platform as an interface technology that creates a uniform view of compute and data resources. The design philosophy and the implementation details used both in the creation of the HBP's HPC platform and in the integration of its components will also be addressed.
Brain simulation is a main objective of the HBP. The project aims to reconstruct and simulate biologically detailed multi-level models of the brain that display emergent structures and behaviors, at levels of detail appropriate to the state of current knowledge and data, the computational power available for simulation, and the demands of researchers. A specific talk will focus on two aspects of brain simulation: the model space of relevant neurosimulations, and how this model space leads to different computational requirements.
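As a rough illustration of why different points in that model space place very different demands on a machine, the following back-of-envelope Python sketch compares memory footprints for a point-neuron network and a morphologically detailed network. Every constant in it is an illustrative assumption, not an HBP figure.

```python
# Back-of-envelope memory estimates for two points in the "model space".
# All constants below are illustrative assumptions, not HBP figures.

def point_neuron_memory(n_neurons, synapses_per_neuron,
                        bytes_per_neuron=1_000, bytes_per_synapse=50):
    """Memory (bytes) for a point-neuron network, where synapses dominate."""
    return (n_neurons * bytes_per_neuron
            + n_neurons * synapses_per_neuron * bytes_per_synapse)

def detailed_neuron_memory(n_neurons, compartments_per_neuron,
                           bytes_per_compartment=500):
    """Memory (bytes) for multi-compartment models, where morphology dominates."""
    return n_neurons * compartments_per_neuron * bytes_per_compartment

if __name__ == "__main__":
    # ~1e9 point neurons with ~1e4 synapses each (cortex-like connectivity)
    print(f"point-neuron network: {point_neuron_memory(10**9, 10**4) / 1e12:.0f} TB")
    # ~1e7 detailed neurons with ~1e3 compartments each
    print(f"detailed network:     {detailed_neuron_memory(10**7, 10**3) / 1e12:.0f} TB")
```

Even with such crude assumptions, the synapse-dominated point-neuron network lands in the hundreds of terabytes, far beyond any single workstation, which is one reason the design of the HPC platform matters so much to the simulation effort.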
Existing neuroscientific simulation codes are large, scalable software packages that are used successfully by a vast community. Including advanced numerical methods to simulate effects beyond the scope of the currently used ones requires minimally invasive designs that demand only small changes to the existing code and reuse the present communication routines to the greatest extent possible. The inclusion of advanced numerical schemes for the integration of systems of ordinary differential equations in the NEST software package, without harming its scalability, will also be presented.
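To make the idea of a minimally invasive integrator swap concrete, here is a small Python sketch. It is not NEST code; the neuron model, parameters and `update` interface are invented for illustration. Only the solver passed to a single update call changes, while the surrounding loop, standing in for the unchanged network and communication code, stays the same.

```python
import numpy as np

# Two interchangeable solvers for the leaky membrane equation
#     dV/dt = (-(V - E_L) + R * I) / tau
# Swapping them requires touching only the `step` argument of update().

def euler_step(V, dt, tau, E_L, R, I):
    """Plain forward Euler step."""
    return V + dt * (-(V - E_L) + R * I) / tau

def exp_euler_step(V, dt, tau, E_L, R, I):
    """Exponential Euler step: exact for the linear leak, stable for large dt."""
    V_inf = E_L + R * I                      # steady state for constant input
    return V_inf + (V - V_inf) * np.exp(-dt / tau)

def update(V, dt, I, step=euler_step, tau=10.0, E_L=-70.0, R=1.0):
    """Single call site used by the (unchanged) simulation loop."""
    return step(V, dt, tau, E_L, R, I)

# Stand-in for the existing simulation/communication loop: only the
# `step` keyword changes when a new numerical scheme is introduced.
V = -70.0
for _ in range(1000):
    V = update(V, dt=0.1, I=2.0, step=exp_euler_step)
print(f"membrane potential after 100 ms: {V:.2f} mV")
```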
With respect to neuromorphic computing approaches, the session will highlight two large-scale, complementary neuromorphic computing systems. The status of construction and commissioning of both phase-1 systems will be summarized, and an overview of the roadmap until 2023 will be given.
The HBP session will end with an overview of the new IBM CLC, its motivation and its technology, which is based on the Hierarchical Temporal Memory cortical model. Early applications and hardware plans will also be discussed.
• Session: Supercomputing & Human Brain Project – Following Brain Research & ICT on 10-Year Quest | Wednesday, July 15, 2015, 8:00 a.m. Chaired by Thomas Lippert http://bit.ly/1JQg7c8
UNDERSTANDING HEALTH AND DISEASE
Manuel Peitsch, Ph.D.
The session on “Computational Approaches to Understanding Health and Disease” at the ISC High Performance 2015 conference will address computational aspects of systems biology. The session will also provide a practical example of the application of systems biology in the field of toxicology and show how experimental approaches can be integrated with sophisticated computational methods to evaluate the toxicological profile of a potentially reduced-risk alternative to cigarettes.
Systems biology is a highly multidisciplinary approach that considers biological systems as a whole and combines large-scale molecular measurements with advanced computational methods. It aims to create knowledge of the dynamic interactions between the many diverse molecular and cellular entities of a biological system, and of how interfering with these interactions, for instance through genome mutations, drugs or other chemicals, leads to adverse reactions and disease. This knowledge is captured in richly annotated biological network models, which are represented as graphs of nodes (entities) and edges (relationships).
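As a minimal sketch of what such a graph representation can look like in practice, the following Python fragment uses the networkx library to build a tiny annotated network. The node names, relationship types and annotations are invented placeholders, not entries from any curated resource.

```python
import networkx as nx

# A toy "richly annotated" biological network: nodes are entities,
# edges are relationships, and both carry annotations.
G = nx.DiGraph()

G.add_node("TNF",   kind="protein",    compartment="extracellular")
G.add_node("NFKB1", kind="protein",    compartment="cytoplasm")
G.add_node("IL6",   kind="transcript", compartment="nucleus")

# Edges capture the nature of the interaction plus a provenance note.
G.add_edge("TNF", "NFKB1", relation="activates", evidence="literature-curated")
G.add_edge("NFKB1", "IL6", relation="increases expression of",
           evidence="literature-curated")

# Simple queries over the model
for source, target, data in G.edges(data=True):
    print(f"{source} -[{data['relation']}]-> {target}")
```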
Systems biology will thus elucidate the fundamental rules and, ultimately, the laws that describe and explain the emergent properties of biological systems: in other words, life itself. Like rules and laws in physics, biological network models may well be the description of complex systems that will enable our understanding of therapeutic interventions and how they influence disease progression.
The application of systems biology, therefore, leads to new approaches in pharmacology, toxicology and diagnostics, as well as drug development. It promises to pave the way for the development of a more precise and personalized approach to medicine, both from a therapeutic and a prevention perspective.
To reach these ambitious goals, the first objective of systems biology is to elucidate the relevant biological networks and to understand how perturbing their normal function leads to disease. The second objective is to develop mathematical models with predictive power. Towards this end, both experimental and computational methods are needed, and employing them has led to great progress over the last decade.
Firstly, identifying the entities in a network and elucidating their interactions (their connectivity) requires highly accurate large-scale molecular measurement ('omics') methods, which permit the quantification of most components of a biological system in relation to physiological observations gathered at the cell, tissue and organ level. Consequently, key measurement technologies have been developed that enable the systematic and comprehensive interrogation of a biological system by producing large sets of quantitative data on all major classes of biological molecules, including proteins, gene transcripts and metabolites.
Secondly, the complexity, diversity and sheer amount of data produced by these 'omics' methods require sophisticated HPC-enabled data analysis methods that yield the knowledge needed to build and refine the biological network models. These representations require not only a wealth of quantitative and dynamic data about the entities and their interactions, but also dictionaries describing the entities, a specialized computer language to capture the nature of their interactions, and mathematical descriptions of their dynamic behaviors. Clearly, integrated approaches to HPC-enabled big data analysis still represent a challenge, and much work is needed to develop more efficient methods. Nevertheless, the combination of recent developments in these areas is yielding novel computational methods that quantify the degree to which healthy biological networks are perturbed by active substances, such as toxic chemicals. Furthermore, we are seeing the emergence of mathematical models aimed at predicting the effect of substances on skin sensitization, drug-induced liver injury and embryonic development.
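To give a flavor of how such perturbation might be quantified, the following toy Python sketch scores one network node from the expression changes of its downstream genes. It is a deliberately simplified illustration with made-up gene names and values, not the published methodology used in these analyses.

```python
import numpy as np

# Signed edges from a hypothetical upstream node: +1 if the node increases
# the downstream gene, -1 if it decreases it. Gene names are placeholders.
downstream_signs = {"GENE_A": +1, "GENE_B": +1, "GENE_C": -1}

# Measured log2 fold-changes (treatment vs. control); illustrative values.
log2_fold_change = {"GENE_A": 1.8, "GENE_B": 0.9, "GENE_C": -1.2}

def perturbation_score(signs, lfc):
    """Sign-weighted mean fold-change: > 0 suggests the node is activated."""
    contributions = [signs[gene] * lfc[gene] for gene in signs]
    return float(np.mean(contributions))

print(f"perturbation score: "
      f"{perturbation_score(downstream_signs, log2_fold_change):+.2f}")
```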
The past decade has witnessed massive developments in measurement technologies, as well as in HPC and scientific computing. These developments will continue, providing systems biology with increasingly accurate quantitative molecular and physiological data, biological networks and predictive models of biological mechanisms involved in human diseases. With the gradual development of reliable and validated methods for molecular measurements in the clinical setting, systems biology will likely enable the enhancement of such disease models with personal molecular data and, hence, is likely to play a major role in personalized medicine.
Indeed, the "personalization" of disease-relevant biological network models would enable the selection of the most effective drug, or combination of drugs, for the treatment of certain diseases. Therefore, by placing the mechanistic understanding of adverse effects and disease at the center of drug discovery, medicine and toxicology, systems biology plays a major role in driving the paradigm shift in biomedical sciences from "correlation" to "causation."
For instance, the development of novel molecular diagnostics increasingly relies on systems biology. Not only does it enable the identification of molecules associated with disease to unprecedented levels of comprehensiveness, but it also drives the selection of the most diagnostic combination of molecules to be measured. This selection process will increasingly be based on the understanding of disease-causing mechanisms.
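As a hedged sketch of what selecting a "most diagnostic combination of molecules" can look like computationally, the Python fragment below uses scikit-learn to pick a small panel of features and cross-validate a simple classifier. The data are synthetic stand-ins, and in practice the selection would be guided by mechanistic, network-based knowledge rather than statistics alone.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for an omics profile: 200 samples x 500 measured molecules.
X, y = make_classification(n_samples=200, n_features=500, n_informative=10,
                           random_state=0)

panel = make_pipeline(
    SelectKBest(f_classif, k=10),        # keep the 10 most discriminative molecules
    LogisticRegression(max_iter=1000),   # simple diagnostic classifier
)

scores = cross_val_score(panel, X, y, cv=5)
print(f"cross-validated accuracy of a 10-molecule panel: {scores.mean():.2f}")
```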
In systems toxicology, the quantitative analysis of large networks of molecular and physiological changes enables the toxicological evaluation of substances and products with unprecedented precision. For example, Philip Morris International is developing a range of potentially reduced-risk alternatives to cigarettes. To conduct the non-clinical assessment of these potentially reduced-risk products and determine whether they indeed lower the risk of developing disease, we have implemented a systems toxicology-based approach. HPC enables our data analysis processes and the building of biological network models, which are used to compare, mechanism by mechanism, the biological impact of novel product candidates with that of conventional cigarettes.
While systems biology is starting to yield practical applications in diagnostics, drug discovery and toxicology, much research is needed to leverage this approach fully and to increase its range of applications. The computational enablers that need the most attention are computational approaches to big data analysis and HPC systems optimized for the complexity of long-term, multi-level simulations. While still in their infancy, multi-level mathematical models of organs, such as the heart and brain, hold the promise of enabling future drug discovery. Their development, however, requires unprecedented efforts, not only in experimental data collection, but also in computer science and scientific computing. It is clearly through the collaboration between computer scientists, scientific computing specialists and biologists that such developments will come to fruition.
• Session: Computational Approaches to Understanding Health & Disease | Tuesday, July 14, 2015, 1:45 p.m. Chaired by Manuel C. Peitsch http://bit.ly/1MFydLP
For more information about ISC 2015, visit http://www.scientificcomputing.com/ISC.