Image analysis is a tried-and-true prerequisite for scientific publications.
Image analysis is of growing importance in science, and trends can be observed at every layer of image acquisition. Quantifiable and reproducible data is a prerequisite for scientific publications, and today it is no longer sufficient to acquire aesthetically pleasing images with a microscope. To produce powerful scientific results, scientists must extract as much information as they can from an image, and software-integrated analysis tools help them retrieve that information quickly and conveniently.
Image analysis can also do more than extract measurements: it can improve the quality of an image and its data. Image restoration by deconvolution, for example, improves both the signal-to-noise ratio and the resolution of an image.
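As a rough illustration of the idea, the sketch below applies Richardson-Lucy deconvolution with scikit-image, assuming a synthetic Gaussian point spread function stands in for the microscope's real, measured PSF.

```python
# Minimal deconvolution sketch: blur an example image with a hypothetical
# Gaussian PSF, add noise, then restore it with Richardson-Lucy.
import numpy as np
from scipy.signal import convolve2d
from skimage import data, restoration

image = data.camera() / 255.0                      # example image in [0, 1]

# Hypothetical Gaussian PSF standing in for the optics' real PSF
x = np.arange(-3, 4)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 2.0)
psf /= psf.sum()

blurred = convolve2d(image, psf, mode="same")      # simulate the blur
noisy = blurred + 0.01 * np.random.standard_normal(blurred.shape)

# Richardson-Lucy iterations recover contrast and effective resolution
restored = restoration.richardson_lucy(np.clip(noisy, 0, 1), psf, 30)
```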
With technological advances in the sensitivity of camera sensors and confocal detectors, ever larger data sets contain a wealth of information that researchers want to extract, and faster capture rates help keep pace with biological research. Microscopy systems also use real-time feedback from the biological specimen to steer an experiment's course, for example by adjusting stage positioning and/or the temporal sampling rate.
“Image analysis must follow suit and track thousands of objects, objects that come in and out of the field of view, or move with the objects,” says Laura Sysko, software product manager, Nikon Instruments. “It must also intelligently process changes in shape and intensity and deal with cells that split into generations of daughter cells.”
In step with the improvements in acquisition devices, the flow of data to analyze is becoming larger and more precise. Image processing is a key element for analyzing this data; it must run fast and handle large data, whether 3-D data sets, stitched 2-D images, time series or multi-channel images.
“Another consequence of this is the need to compare the results of analyses coming from different modalities, such as electron microscopy, computed tomography, optical microscopy and more,” says Pascal Doux, software product manager, FEI. Efficient techniques are now required to run multi-modality registrations, extract information from each modality and fuse the data.
To manage this increased data volume, it becomes critical to implement processing solutions that support multicore CPU parallelization. “Dedicated GPU (graphical processing unit) computation also brings a great performance gain,” says Doux.
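As a simple illustration of how such parallelization might look in practice, the sketch below distributes per-slice filtering of a hypothetical 3-D stack across CPU cores with Python's standard library; it is a generic pattern, not any vendor's implementation, and GPU acceleration would follow the same tiling idea with a library such as CuPy.

```python
# Minimal multicore sketch: process independent z-slices in parallel.
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from scipy import ndimage

def process_slice(z_slice):
    """Denoise one 2-D slice; any per-tile filter or measurement fits here."""
    return ndimage.median_filter(z_slice, size=3)

if __name__ == "__main__":
    # Hypothetical 3-D stack (z, y, x); in practice this is the acquired data
    stack = np.random.rand(64, 512, 512).astype(np.float32)

    with ProcessPoolExecutor() as pool:            # one worker per CPU core
        filtered = np.stack(list(pool.map(process_slice, stack)))
```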
Another trend is the automation of large-scale experiments. “Image analysis carried out simultaneously with acquisition can increase throughput and data quality,” says Constantin Kappel, product manager for high content screening, Leica Microsystems. High-content screening implemented into microscopy software standardizes biological applications for rapid and reproducible results with high-quality statistics. Automation also helps minimize the number of required user interactions without sacrificing transparency and flexibility.
Enhancements for reproducible science
Life science has evolved from a descriptive to a quantitative discipline. As such, documentation and quantitative description of biophysical processes are becoming more important. Image analysis helps achieve reproducible and reliable results. With this trend, data is becoming more comparable even as it gets more complex, which promotes multidisciplinary cooperation between life scientists, physicists and computer scientists.
The performance of image analysis has improved over the past few years. Commercial and free image analysis offerings provide more user-friendly interfaces, and image analysis is now approachable and accessible in most common microscopy software. There are many options for novices to analyze single images, time-lapse data sets and 3-D volumes over time.
“Companies and freeware sources offer canned pre-made analysis routines to help researchers have a starting point for analysis. However, a more popular approach is the toolbox offering,” says Sysko.
This environment consolidates many commonly used image processing tools, such as contrast and detection enhancement, multiple binary mask generation, logical and arithmetic operations and measurement. According to Sysko, a consolidated toolbox option, such as Nikon's NIS-Elements General Analysis module, allows researchers to build custom and creative routines, preview them on their data and then batch them across larger data sets or a high-throughput project.
“These analyses can also be tied into the acquisition itself by adding in conditional decision-tree events to determine the length and sequence of the imaging experiment,” says Sysko.
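The following sketch approximates the kind of routine such a toolbox lets researchers assemble, using generic scikit-image calls (contrast enhancement, Otsu thresholding, binary clean-up and object measurement) rather than any vendor's actual module.

```python
# Minimal toolbox-style routine: enhance, threshold, clean up, measure.
from skimage import data, exposure, filters, measure, morphology

image = data.human_mitosis()                        # example fluorescence-like image

enhanced = exposure.equalize_adapthist(image)       # contrast enhancement
mask = enhanced > filters.threshold_otsu(enhanced)  # binary mask generation
mask = morphology.remove_small_objects(mask, min_size=20)  # remove noise objects

labels = measure.label(mask)                        # one label per detected object
props = measure.regionprops_table(labels, intensity_image=image,
                                  properties=("area", "mean_intensity"))
print(len(props["area"]), "objects measured")
```

Once such a routine previews correctly on a single image, the same steps can be looped over an entire data set in batch, which is the workflow the toolbox approach is meant to enable.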
Leica Microsystems has also implemented a workflow-oriented procedure that makes image analysis available to non-experienced users. According to Petra Haas, product manager for LAS X confocal software, Leica Microsystems, versatile analysis wizards within LAS X guide the user step by step through the different tools, from applying filters, thresholding and binary image processing to measurements and classification. Open software interfaces also allow use of the data in third-party analysis tools such as Huygens deconvolution software, ImageJ, Fiji, CellProfiler and KNIME.
“The integration of Leica HCS A (High Content Screening Automation) into Leica Microsystems’ confocal and widefield systems supports the screening of a large number of samples under various conditions for robust statistics,” says Kappel. Immediate image analysis eliminates the need to manually sift through a large number of specimens, where rare events, such as a dividing cell among many others, can easily be missed.
“Computer Aided Microscopy (CAM) allows these events to continuously stream to external storage devices where they are analyzed simultaneously during image acquisition,” says Kappel. In conjunction with CAM, Leica HCS A can respond to feedback from the analysis software about an event detected during acquisition.
High-content screening is a growing discipline in life science research, as the number of complex experiments is increasing, and statistically relevant data is needed to further scientific discoveries. “Research is automated by means of large-scale screening experiments,” says Kappel. “This allows a higher throughput of experiments and higher data quality.”
Automated analysis during acquisition saves time, since scientists don’t need to manually sift through data sets, and fewer rare events are missed. It also saves on lab equipment costs, as experiments don’t need to be repeated as often.
Some techniques based on statistical data analysis have also appeared in recent years. These approaches bring more efficient solutions to traditional image processing problems such as classification, pattern matching or tracking. “Superpixel algorithms allow you to simplify images with understandable region partitioning, giving access to well-known fast graph computations,” says Doux. “Some descriptors, like SIFT or FREAK, bring some help for extracting relevant information independently from the scale and orientation of the features.”
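As a rough sketch of the superpixel idea Doux describes, the snippet below uses scikit-image's SLIC implementation to partition an example image into homogeneous regions; descriptor extraction such as SIFT or FREAK would be a separate step, typically done with OpenCV or skimage.feature.

```python
# Minimal superpixel sketch: partition an image into ~400 SLIC regions
# and replace each region by its mean color for a simplified, graph-friendly view.
from skimage import color, data, segmentation

image = data.astronaut()                            # example RGB image

segments = segmentation.slic(image, n_segments=400, compactness=10,
                             start_label=1)

simplified = color.label2rgb(segments, image, kind="avg")
```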
On the hardware side, some GPUs now offer a large amount of memory that can be used for processing, and CPU parallelization is commonly used in image processing algorithms, saving substantial processing time on most computers.
All these enhancements help manage the increase in data and modalities, both in performance and in data correlation. Better statistical data analysis improves image segmentation and may lead to better study of complex structures, especially in the life sciences. “Thus, a better understanding of challenging issues can be done, for example, segmenting textured phases in materials science or detecting specific cells inside a complex life science sample,” says Doux.
Further enhancing image acquisition
To more accurately model the true physiological state, therapeutic and disease progression research is shifting towards an in vivo model in whole organisms. Pre-clinical trial and drug safety studies are also incorporating in vivo studies earlier in the process to evaluate toxicity and efficacy.
“Multiphoton systems, such as Nikon’s A1R MP+ confocal microscope, are challenged with motion artifacts when penetrating deep into tissue,” says Sysko. “When an organism shifts, breathes, muscles contract, blood flow or slow drift occurs, real-time image analysis must incorporate and calculate these sources of motion.”
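One generic way such motion could be estimated, shown here only as a hedged sketch and not as Nikon's actual method, is frame-to-frame drift measurement by phase cross-correlation, after which the measured shift is applied in reverse to realign the frame.

```python
# Minimal drift-correction sketch using phase cross-correlation.
import numpy as np
from scipy import ndimage
from skimage import data
from skimage.registration import phase_cross_correlation

reference = data.camera().astype(float)

# Hypothetical drifted frame: the reference shifted by a known offset
drifted = ndimage.shift(reference, shift=(5.0, -3.0))

# Estimate the drift with subpixel precision, then shift the frame back
shift, error, _ = phase_cross_correlation(reference, drifted, upsample_factor=10)
corrected = ndimage.shift(drifted, shift=shift)

print("estimated drift (rows, cols):", shift)       # approximately (-5, 3)
```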
Along with the goal of imaging normal physiological states, label-free techniques are developing and becoming more prevalent. Reliable brightfield image analysis techniques are therefore in demand to analyze data sets without the benefit of high-contrast fluorescence.
While some solutions already support the processing of large image data, the biggest challenge of image analysis is still preparing for what Sebastian Rhodes, Carl Zeiss, calls the “data avalanche.”
“It all started with high-speed and high-resolution imaging systems,” says Rhodes. “Huge amounts of image data can nowadays be acquired easily within a single experiment. And that will lead to even more data after the image analysis.”
These data and images must be put into the correct context for researchers to find the answers they are looking for. Maintaining back-traceability of the data derived from image analysis within these big data sets is a needed enhancement.
There’s also a growing need for personalized image analysis. “The microscopy software needs to have open interfaces to third-party and open source elements, so that every researcher can define their project and the information in the image which is relevant to their scientific question,” says Haas. Further enhancements can be expected for the seamless integration of third-party and open software tools into microscopy software.
Image analysis in the future
The potential for image analysis is huge, and its importance will continue to grow. But to keep researchers from getting lost in all the possible ways to analyze images, it will become crucial to choose the analysis algorithm that is correct for the scientific question to be answered.
“If there are no standards or guidelines put into place, tons of ‘data garbage’ will be produced easily considering the available computation power: Just test enough different ways to analyze your image data until you get what you would like to see,” says Rhodes. If vendors can enhance and expand the capabilities of image analysis while providing guidelines and standards for the scientific community, the future will be bright.
Overall, image analysis is widely used and, with its convenient access for researchers, has become indispensable for quantitative and reliable results. The future possibilities for image analysis lie in setting up predictive models of biophysical processes and development.