New Computational Tool Harnesses Big Data, Deep Learning to Reveal Dark Matter of the Transcriptome
A research team at Children’s Hospital of Philadelphia (CHOP) has developed an innovative computational tool offering researchers an efficient method for detecting the different ways RNA is pieced together (spliced) when copied from DNA. Because variations in how RNA is spliced play crucial roles in many diseases, this new analytical tool will provide greater capabilities for discovering…
Open Source Software Helps Researchers Extract Key Insights From Huge Sensor Datasets
Professor Andreas Schütze and his team of experts in measurement and sensor technology at Saarland University have released a free data processing tool called simply Dave—a MATLAB toolbox that allows rapid evaluation of signals, pattern recognition and data visualization when processing huge datasets. The free software enables very large volumes of data, such as those…
NCSA Reveals Promising Diagnostics for Detecting Latent Tuberculosis
Small Babies, Big Data
The first week of a newborn’s life is a time of rapid biological change as the baby adapts to living outside the womb, suddenly exposed to new bacteria and viruses. Yet surprisingly little is known about these early changes. An international research study co-led by Boston Children’s Hospital has pioneered a technique to get huge…
New Method of Scoring Protein Interactions Mines Large Data Sets From a Fresh Angle
Researchers from the Stowers Institute for Medical Research have created a novel way to define individual protein associations in a quick, efficient and informative way. These findings, published in the March 8, 2019, issue of Nature Communications, show how the topological scoring (TopS) algorithm, created by Stowers researchers, can identify proteins that associate with one another by combining data sets. The approach is…
TACC Assists in Massive Data Collection Effort in Lung Development to Help Premature Babies
In 2016, over a dozen scientists and engineers toured a neonatal intensive care unit, the section of the hospital that specializes in the care of ill or premature newborn infants. The researchers had come together from all around the country, and brought with them a wide variety of expertise. Visiting the newborns helped put into…
Big Data Harvesting Tool Will Deliver Smart Farming
Researchers from across Norwich Research Park have launched a new system for organising vast datasets on climate and crops. CropSight is a scalable and open-source information management system that can be used to maintain and collate important crop performance and microclimate information. Big data captured by diverse technologies known collectively as the Internet of Things…
Chemical Data Mining Boosts Search for New Organic Semiconductors
Producing traditional solar cells made of silicon is very energy intensive. On top of that, they are rigid and brittle. Organic semiconductor materials, on the other hand, are flexible and lightweight. They would be a promising alternative, if only their efficiency and stability were on par with traditional cells. Together with his team, Karsten Reuter,…
Supercomputing Effort Reveals Antibody Secrets
Using sophisticated gene sequencing and computing techniques, researchers at Vanderbilt University Medical Center (VUMC) and the San Diego Supercomputer Center have achieved a first-of-its-kind glimpse into how the body’s immune system gears up to fight off infection. Their findings, published this week in the journal Nature, could aid development of “rational vaccine design,” as well as…
AI and Big Data Provide the First Global Maps on Key Vegetation Traits
Citizen Science Projects Have a Surprising New Partner — the Computer
For more than a decade, citizen science projects have helped researchers harness the power of thousands of volunteers who sort through datasets too large for a small research team. Previously, this data generally couldn’t be processed by computers because the work required skills that only humans could accomplish. Now, machine learning…
Modeling Uncertain Terrain With Supercomputers
Many areas of science and engineering try to predict how an object will respond to a stimulus—how earthquakes propagate through the Earth or how a tumor will respond to treatment. This is difficult even when you know exactly what the object is made of, but how about when the object’s structure is unknown? The class…
Researchers Call for Big Data Infrastructure to Support Future of Personalized Medicine
Researcher Wins Machine-Learning Competition With Code That Sorts Through Simulated Telescope Data
A new telescope will take a sequence of high-resolution snapshots with the world’s largest digital camera, covering the entire visible night sky every few days—and repeating the process for an entire decade. That presents a big data challenge: What’s the best way to rapidly and automatically identify and categorize all of the stars, galaxies, and…
New Technology for Machine Translation Now Available
Deep Learning Software Speeds Up Drug Discovery
Next Generation Photonic Memory Devices Are Light-written, Ultrafast and Energy Efficient
Light is the most energy-efficient way of moving information. Yet light has one big limitation: it is difficult to store. Data centers therefore rely primarily on magnetic hard drives, in which information is transferred at an energy cost that is now soaring. Researchers at the Institute of Photonic…
NCSA Brings Dark Energy Survey Data to Science Community into 2021
After scanning in depth about a quarter of the southern skies for six years and cataloguing hundreds of millions of distant galaxies, the Dark Energy Survey (DES) finished taking data January 9, 2019. The National Center for Supercomputing Applications (NCSA) at the University of Illinois will continue refining and serving this data for use by scientists into 2021. The…
Creating a ‘Virtual Seismologist’
Understanding earthquakes is a challenging problem—not only because they are potentially dangerous but also because they are complicated phenomena that are difficult to study. Interpreting the massive, often convoluted data sets that are recorded by earthquake monitoring networks is a herculean task for seismologists, but the effort involved in producing accurate analyses could significantly improve…
Lilly Expands Deal to Analyze Patient Data From Smartphones and Connected Sensors
Big Data Used to Predict the Future
AI Capable of Outlining in a Single Chart Information From Thousands of Scientific Papers
NIMS and the Toyota Technological Institute at Chicago have jointly developed a Computer-Aided Material Design (CAMaD) system capable of extracting information related to fabrication processes and material structures and properties (factors vital to material design) and organizing and visualizing the relationship between them. The use of this system enables information from thousands of scientific and technical articles…