In two new studies, researchers from across the country, spearheaded by Duke University faculty, have begun to design the framework on which to build the emerging field of nanoinformatics.
Nanoinformatics is, as the name implies, the combination of nanoscale research and informatics. It attempts to determine which information is relevant to the field and then develop effective ways to collect, validate, store, share, analyze, model and apply that information — with the ultimate goal of helping scientists gain new insights into human health, the environment and more.
In the first paper, published recently in the Beilstein Journal of Nanotechnology, researchers begin the conversation about how to standardize the way nanotechnology data are curated.
Because the field is young yet extremely diverse, data are collected and reported in different ways in different studies, making it difficult to compare apples to apples. Silver nanoparticles in a Florida swamp could behave entirely differently than in the Amazon River. And even when two studies both look at effects in humans, slight variations in body temperature or blood pH, or nanoparticles only a few nanometers larger, can give different results. For future studies to combine multiple datasets to explore more complex questions, researchers must agree on what they need to know when curating nanomaterial data.
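To make the comparison problem concrete, here is a minimal sketch of what a standardized curation record might capture. The field names, units and values are illustrative assumptions for this example, not a schema proposed in the paper:

```python
from dataclasses import dataclass

@dataclass
class NanomaterialRecord:
    """Hypothetical curation record; fields and units are illustrative only."""
    material: str            # e.g. "Ag nanoparticle"
    core_diameter_nm: float  # primary particle size
    coating: str             # surface functionalization
    medium: str              # test system, e.g. "swamp water", "blood plasma"
    temperature_c: float     # system temperature
    ph: float                # medium pH
    endpoint: str            # what was measured
    value: float             # measured result
    units: str               # units of the result

# Two studies of "the same" particle become comparable only when the
# contextual fields (medium, temperature, pH, size) are recorded alongside
# the measurement itself:
a = NanomaterialRecord("Ag nanoparticle", 20.0, "citrate", "swamp water",
                       28.0, 6.2, "dissolution rate", 0.04, "1/h")
b = NanomaterialRecord("Ag nanoparticle", 24.0, "PVP", "river water",
                       22.0, 7.1, "dissolution rate", 0.01, "1/h")
```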
“We chose curation as the focus of this first paper because there are so many disparate efforts that are all over the road in terms of their missions, and the only thing they all have in common is that somehow they have to enter data into their resources,” says Christine Hendren, a research scientist at Duke and executive director of the Center for the Environmental Implications of NanoTechnology (CEINT). “So we chose that as the kernel of this effort to be as broad as possible in defining a baseline for the nanoinformatics community.”
The paper is the first in a series of six that will explore what people mean — their vocabulary, definitions, assumptions, research environments, etc. — when they talk about gathering data on nanomaterials in digital form. And to get everyone on the same page, the researchers are seeking input from all stakeholders, including those conducting basic research, studying environmental implications, harnessing nanomaterial properties for applications, developing products, and writing government regulations.
The daunting task is being undertaken by the Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Program Nanotechnology Working Group (NCIP NanoWG) led by a diverse team of nanomaterial data stakeholders. If successful, not only will these disparate interests be able to combine their data, but the project will also highlight which data are missing and help drive the research priorities of the field.
In the second paper, published in Science of The Total Environment, Hendren and her colleagues at CEINT propose a new, standardized way of studying the properties of nanomaterials.
“If we’re going to move the field forward, we have to be able to agree on what measurements are going to be useful, which systems they should be measured in and what data gets reported, so that we can make comparisons,” says Hendren.
The proposed strategy uses functional assays — relatively simple tests carried out in standardized, well-described environments — to measure nanomaterial behavior in actual systems.
For some time, the nanomaterial research community has been trying to use measured nanomaterial properties to predict outcomes. For example, what size and composition of a nanoparticle is most likely to cause cancer? The problem, argues Mark Wiesner, director of CEINT, is that this question is far too complex to answer.
“Environmental researchers use a parameter called biological oxygen demand to predict how much oxygen a body of water needs to support its ecosystem,” explains Wiesner. “What we’re basically trying to do with nanomaterials is the equivalent of trying to predict the oxygen level in a lake by taking an inventory of every living organism, mathematically map all of their living mechanisms and interactions, add up all of the oxygen each would take, and use that number as an estimate. But that’s obviously ridiculous and impossible. So instead, you take a jar of water, shake it up, see how much oxygen is taken and extrapolate that. Our functional assay paper is saying do that for nanomaterials.”
The paper makes suggestions as to what nanomaterials’ “jar of water” should be. It identifies what parameters should be noted when studying a specific environmental system, like digestive fluids or wastewater, so that they can be compared down the road.
It also suggests two meaningful processes for nanoparticles that should be measured by functional assays: attachment efficiency (whether particles stick to surfaces) and dissolution rate (how quickly they release ions).
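The second of these lends itself to a simple worked sketch. Assuming first-order release kinetics toward an equilibrium ion concentration, an assumption made for this example rather than a prescription of the paper, a dissolution rate constant could be estimated from a time series of ion measurements like so (all numbers are made up for illustration):

```python
import numpy as np

# Hypothetical functional-assay readout: dissolved ion concentration over time.
# Assuming first-order release toward an equilibrium concentration C_eq,
#   C(t) = C_eq * (1 - exp(-k * t)),
# the rate constant k can be estimated by linearizing:
#   ln(1 - C/C_eq) = -k * t.

t_hours = np.array([0.5, 1.0, 2.0, 4.0, 8.0])     # sampling times
c_ppb   = np.array([3.1, 5.8, 10.2, 15.9, 19.7])  # measured dissolved ions
c_eq    = 21.0                                    # assumed equilibrium value

y = np.log(1.0 - c_ppb / c_eq)
slope, _ = np.polyfit(t_hours, y, 1)              # slope of the line = -k
print(f"estimated dissolution rate constant: {-slope:.3f} per hour")
```

Reported with the contextual parameters described above (medium, temperature, pH), a single number like this rate constant is what makes results from different labs and different environmental systems directly comparable.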
In describing how a nanoinformatics approach informs the implementation of a functional assay testing strategy, Hendren says, “We’re trying to anticipate what we want to ask the data down the road. If we’re banking all of this comparable data while doing our near-term research projects, we should eventually be able to support more mechanistic investigations to make predictions about how untested nanomaterials will behave in a given scenario.”
Both research papers were supported by the National Science Foundation and the Environmental Protection Agency (DBI-1266252 and EF-0830093), and the paper on data curation was additionally supported by the National Institutes of Health (ES017552-01A2).
Release Date: August 18, 2015
Source: Duke University