The space program in the mid-20th century accelerated the switch from analog to digital systems for high-speed data acquisition and monitoring.
But systems recording today’s physical and electrical phenomena must meet a new set of data acquisition and logging challenges, and they would be unrecognizable to those early computer pioneers. Many chart recorders and data-logging systems are now digital: paper is gone, replaced by digital processors, memory and communications that link them to the ever-connected world. This close relationship to modern computing power invites a direct comparison between the technology growth in data acquisition instrumentation and that of the more familiar computer processor.
Four years before Neil Armstrong took his “giant leap” for mankind, Gordon Moore predicted another giant leap: the number of transistors on semiconductor-based circuits would double every two years. The exponential processor performance increases predicted by Moore’s law have affected most industries, enabling more powerful and complex systems. By harnessing these advancements, modern data-logging systems have more than kept pace with the devices they must test, showcasing innovation in both hardware and software technology.
Pushing the limits of logging systems
A variety of applications and industries need more intelligence in their data-logging systems. The automotive, transportation and electric utility industries are already deploying high-performance data-logging systems.
Vehicles designed today require thousands of sensors and processors and extensive programming. With more intelligent vehicles come more parameters, both physical and electrical, to test and monitor. In addition, test engineers require intelligent and rugged logging systems for use within the vehicles they’re testing. For instance, engineers at Integrated Test & Measurement (ITM), Milford, Ohio, needed a high-performance, flexible in-vehicle test solution to determine the vibration levels of an on-highway vocational vehicle’s exhaust system during operation. They built a high-speed vibration logging solution around National Instruments’ (Austin, Texas) standalone NI CompactDAQ system programmed with LabVIEW system design software, with a wireless interface accessible from a laptop or mobile device. The 1.33 GHz dual-core Intel Core i7 processor within the standalone NI CompactDAQ system enabled capabilities such as advanced signal processing, high-speed streaming at more than 6 MB/s to nonvolatile storage for all 28 simultaneously sampled accelerometer inputs, and Wi-Fi connectivity. In addition, with the latest version of the Data Dashboard for LabVIEW, engineers at ITM can build a custom user interface and directly interact with and control their vibration logging system from an iPad.
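For engineers sketching a similar streaming logger in software, the loop below shows the shape of continuous acquisition to disk using NI’s nidaqmx Python API instead of LabVIEW’s graphical environment. It is a minimal sketch, not ITM’s code: the channel string, sample rate, loop count and file name are placeholders.

```python
# A minimal sketch, assuming NI's nidaqmx Python API; the channel string,
# sample rate and file name are placeholders, not ITM's configuration.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 25_600                     # assumed per-channel sample rate, Hz
CHUNK = RATE // 10                # read 100 ms of data per loop iteration
CHANNELS = "cDAQ1Mod1/ai0:3"      # placeholder: extend to all 28 inputs

with nidaqmx.Task() as task, open("vibration.bin", "wb") as log:
    task.ai_channels.add_ai_accel_chan(CHANNELS)      # IEPE accelerometer inputs
    task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(600):          # bounded here for illustration (~1 minute)
        data = task.read(number_of_samples_per_channel=CHUNK)
        np.asarray(data, dtype=np.float32).tofile(log)   # stream raw samples to disk
```

In a real system the write would typically go to a separate consumer loop so disk latency never stalls the acquisition, which is the pattern the CompactDAQ controller’s nonvolatile storage and dual-core processor make practical.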
Also pushing the limits of traditional data-logging systems is the electric utility industry. As the electrical grid changes, utilities are investing substantial resources to make it smarter through the integration of more measurement systems and devices, such as power quality analyzers. A typical power quality analyzer acquires and analyzes the three voltages of the power network to calculate the voltage quality defined in international standards. Voltage quality is described by many parameters: frequency, voltage level variation, flicker, three-phase system unbalance, harmonic spectra, total harmonic distortion and signaling voltage levels. With the amount of analysis and high-speed measurement this application requires, a traditional logging system can’t provide the needed horsepower. Engineers at ELCOM International, an electronics component manufacturer based in India, used NI’s LabVIEW and CompactRIO to create a flexible, high-performance power quality analyzer. Within this analyzer, the processor handled tasks such as advanced floating-point processing, high-speed streaming to disk and network connectivity. CompactRIO’s embedded acquisition system, which pairs an embedded processor with a field-programmable gate array (FPGA), provided an additional processing unit that performed custom I/O timing and synchronization along with any high-speed digital processing the application needed.
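To make one of those voltage-quality calculations concrete, the sketch below estimates total harmonic distortion (THD) from a block of sampled phase voltage: THD is the ratio of the combined harmonic amplitudes to the fundamental amplitude. The 50 Hz fundamental, sample rate and 40-harmonic limit are illustrative assumptions, not ELCOM’s implementation.

```python
# Illustrative THD estimate from one block of sampled phase voltage.
# Fundamental frequency, sample rate and harmonic count are assumptions.
import numpy as np

def thd(samples: np.ndarray, sample_rate: float, fundamental: float = 50.0,
        n_harmonics: int = 40) -> float:
    """THD = sqrt(sum of harmonic amplitudes squared) / fundamental amplitude."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    def amplitude(f: float) -> float:
        # Take the strongest bin near the target frequency.
        band = (freqs > f - 5) & (freqs < f + 5)
        return spectrum[band].max()

    v1 = amplitude(fundamental)
    harmonics = [amplitude(k * fundamental) for k in range(2, n_harmonics + 1)]
    return np.sqrt(sum(h ** 2 for h in harmonics)) / v1

# Example: a 50 Hz wave with 5% third-harmonic content -> THD of about 0.05
t = np.arange(0, 0.2, 1 / 25_600)
v = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 150 * t)
print(f"THD = {thd(v, 25_600):.3f}")
```

Running this same calculation continuously on all three phases, alongside flicker, unbalance and the other parameters, is what pushes the workload beyond a traditional logger and onto the processor-plus-FPGA architecture described above.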
Next-generation data logging
Over the past decade, digital storage capacity has increased exponentially while the corresponding cost has plummeted. As Moore’s law continues to hold, yielding more powerful, less expensive and smaller processors that consume less energy, future data acquisition and logging systems will leverage this technology to grow more intelligent and feature-rich.
Data-logging systems have become more decentralized, with processing elements moving closer to the sensor and signal. Because of this change, remote DAQ systems and loggers are more integrated into the decision-making process. They no longer just collect data.
There are many examples of high-performance logging systems that integrate the latest silicon and IP from companies such as ARM, Intel and Xilinx. Most of these systems use a processor-only architecture, while some incorporate a heterogeneous computing architecture that combines a processor with programmable logic.
More intelligence is important in many data-logging scenarios. In asset and structural monitoring applications, traditional data-logging systems log every data point to disk, even when nothing substantial is happening with the physical phenomenon being measured. The result is megabytes, and potentially gigabytes, of data that must be sifted through and analyzed offline. With more intelligence, systems can instead monitor a machine or structure continuously and quickly adapt when certain conditions are met. With more processing power, signal processing functionality can be embedded in the data-logging system itself, on multicore processors or programmable logic. These more advanced systems can analyze the data inline and, in turn, deliver more meaningful results faster, which eliminates the need to transfer or store large amounts of data.
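As a rough illustration of this condition-based approach (not any particular vendor’s implementation), the sketch below keeps a short pre-trigger history in memory and writes to disk only when a threshold is crossed. The threshold, block size and read_block() data source are hypothetical stand-ins for real acquisition hardware.

```python
# Illustrative sketch of condition-based logging; the threshold, block size
# and read_block() data source are hypothetical assumptions.
from collections import deque
import numpy as np

THRESHOLD = 4.0                  # assumed trigger level, e.g. in g
history = deque(maxlen=50)       # pre-trigger history kept in memory

def read_block() -> np.ndarray:
    """Placeholder for one block of samples from the acquisition hardware."""
    return np.random.default_rng().normal(0.0, 1.0, 1024)

with open("events.bin", "ab") as log:
    for _ in range(10_000):      # bounded here for illustration
        block = read_block()
        history.append(block)
        if np.abs(block).max() > THRESHOLD:
            # Event detected: flush the pre-trigger history plus this block,
            # instead of logging every block unconditionally.
            for buffered in history:
                buffered.astype(np.float32).tofile(log)
            history.clear()
```

The payoff is exactly the one described above: hours of uneventful monitoring produce almost no stored data, while every event is captured with the context that preceded it.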
With data-logging systems featuring more intelligence and processing, the software they run is a primary way for vendors to differentiate themselves. Traditional data-logging software consists of turnkey tools that engineers use to configure the system and get to the measurements quickly. The downside of turnkey tools is that they tend to be less flexible. At the other end of the spectrum, engineers and scientists can use a text-based or graphical programming approach to program the processors within these systems. Programming tools offer the most customization for these data-logging systems, including a wider range of signal processing and the ability to embed any type of intelligence, but they have a steeper learning curve than turnkey tools.