Riding the Wave of Technological Change
The gap between stand-alone instruments and PCs is closing quickly
|Virtual instrumentation in the late 1980s – Over the last two decades, computing size, speed, and power have advanced tremendously. Virtual instrumentation began with the fundamental concept of connecting instruments to standard computers to leverage the power, flexibility, and display capabilities of the PC.|
Rapid adoption of the PC during the past 20 years has given rise to a new way for scientists to measure and automate the world around them — virtual instrumentation. Today, virtual instrumentation is coming of age with engineers and scientists using “virtual instruments” in literally hundreds of thousands of applications around the globe, resulting in faster application development, higher quality products, and lower costs.
A virtual instrument consists of an industry-standard computer or workstation equipped with powerful off-the-shelf application software, cost-effective hardware such as plug-in boards, and driver software — which together perform the functions of traditional instruments. These plug-in boards are used for digital communication, analog-to-digital conversion, digital-to-analog conversion, and digital I/O. The PC and integrated circuit technology have experienced significant advances in the last two decades, but it is software that makes it possible for users to create their own virtual instruments, providing better ways to innovate and significantly reduce cost. With virtual instruments, engineers and scientists are no longer limited by traditional fixed-function instruments, but can build measurement and automation systems that suit their needs exactly.
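The analog-to-digital and digital-to-analog conversion these boards perform can be illustrated with a short sketch. The code below is a textbook model of an ideal N-bit converter, not tied to any particular board or vendor driver; the function names and the ±10 V default range are illustrative assumptions.

```python
def adc_convert(voltage, bits=12, v_min=-10.0, v_max=10.0):
    """Model an ideal N-bit ADC: map a voltage in [v_min, v_max] to an integer code."""
    levels = 2 ** bits
    lsb = (v_max - v_min) / levels          # size of one quantization step (volts)
    clamped = min(max(voltage, v_min), v_max)
    code = int((clamped - v_min) / lsb)
    return min(code, levels - 1)            # full-scale voltage maps to the top code

def dac_convert(code, bits=12, v_min=-10.0, v_max=10.0):
    """Inverse operation: map a digital code back to the voltage at that step."""
    lsb = (v_max - v_min) / (2 ** bits)
    return v_min + code * lsb

code = adc_convert(1.0)                     # +1 V on a 12-bit, +/-10 V input
print(code)                                 # 2252 counts
print(round(dac_convert(code), 4))          # 0.9961 V reconstructed
```

The quantization step (one LSB) shrinks as resolution grows, which is why the move from 12-bit to 16-bit and 24-bit converters mentioned later translates directly into finer measurement precision.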
|LabVIEW circa 1986 – With the introduction of powerful software applications such as LabVIEW 1.0 in 1986, the PC could be exploited to create user-defined measurement and automation systems.|
Moving to a more flexible, cost-effective system
Virtual instruments represent a fundamental shift from traditional hardware-centered instrumentation systems to software-centered systems that exploit the computing power, productivity, display, and connectivity capabilities of popular desktop computers. In the last two decades, PC performance has improved by a factor of 10,000 while prices have decreased dramatically, making PCs powerful, reliable, and cost-effective. These advancements not only make virtual instrumentation a low-cost and flexible solution, but also deliver productivity gains unmatched by stand-alone, proprietary systems.
Traditional instruments such as stand-alone oscilloscopes and waveform generators are powerful, yet expensive, and designed to perform a fixed set of specific tasks defined by the vendor. The user generally cannot extend or customize them. The knobs and buttons on the instrument, the built-in circuitry, and the functions available to the user are specific to the nature of the instrument. In addition, special technology and costly components must be developed to build these instruments, making them very expensive and slow to adapt.
Because they are PC-based, virtual instruments inherently take advantage of the benefits from the latest technology incorporated into off-the-shelf PCs. Advances in technology and performance are quickly closing the gap between stand-alone instruments and PCs. Powerful processors such as the Pentium 4, operating systems and technologies such as Microsoft Windows XP, .NET, Apple’s Mac OS X, high performance buses such as PCI Express, and networking capabilities deliver unmatched price/performance results and flexibility.
In research and design, engineers and scientists demand rapid development and prototyping capabilities. Using virtual instruments, they can develop a program, take measurements to test a prototype, and analyze results in a fraction of the time required to build tests with traditional instruments. When flexibility is a requirement, a scalable open platform is essential — from the desktop, to embedded systems, to distributed networks. Engineers and scientists can easily modify or expand virtual instrumentation systems to adapt to specific needs simply by reconfiguring an existing device.
This modularity is a result of the wide variety of low-cost plug-in hardware available for virtual instrumentation systems. Data acquisition boards provide accurate, reliable measurements ideal for a wide range of applications in the laboratory, with most featuring analog and digital I/O, and timing and triggering capabilities, on multiple buses such as PCMCIA, PCI, and PXI. From digital multimeters, to high-speed digitizers, to RF measurement devices, there is a wide range of modular computer-based devices that deliver data acquisition capabilities at a significantly lower cost than that of dedicated devices.
As integrated circuit technology advances, off-the-shelf components become less expensive and more powerful, and so do the boards that use them. With these advances in technology come increases in data acquisition rates, measurement accuracy, and precision.
|LabVIEW circa 2003 – Software has empowered engineers and scientists to develop their own customized measurement and automation systems for a variety of applications on multiple platforms. Commercially available technologies have driven the innovation in virtual instrumentation, which has moved beyond the PC to leverage PDAs and FPGAs.|
Software — the cornerstone of virtual instrumentation
Thomas Edison is widely recognized for innovations such as the phonograph and the incandescent light bulb, which he created working with small teams. Over the years, however, R&D has become the domain of large projects requiring hundreds, if not thousands, of engineers and scientists working together, often without fully understanding the complete scope of their project. With the emerging productivity advantages of PCs, software, and virtual instrumentation, we are able to return to small teams as a way for companies to best create breakthrough innovations and improve productivity.
Virtual instrumentation has led to a simpler way of looking at measurement systems. Instead of using several stand-alone instruments for multiple measurement types and performing rudimentary analysis by hand, engineers and scientists now can quickly and cost-effectively create a system equipped with embedded analysis software and reuse a single measurement device for a multitude of applications. Powerful, off-the-shelf software makes this possible. This software automates the entire process, delivering an easy way to acquire, analyze, and present data from a personal computer without sacrificing performance or functionality.
With LabVIEW, for example, scientists design custom virtual instruments by creating a graphical user interface through which they can operate the application program, control hardware, analyze acquired data, and display results. The similarity between standard flow charts and the data-flow programming nature of LabVIEW shortens the learning curve associated with traditional, text-based languages.
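LabVIEW programs themselves are graphical block diagrams, so they cannot be reproduced in text. As a rough analogy only, the sketch below mimics the dataflow idea in Python: each "node" is a function, each "wire" a named value, and a node fires as soon as every wire it depends on carries data, regardless of the order in which nodes are listed. All names here are illustrative, not LabVIEW constructs.

```python
def run_dataflow(nodes, inputs):
    """nodes: list of (output_name, function, input_names).
    Fires each node once all of its inputs are available, mimicking
    the data-driven execution order of a block diagram."""
    values = dict(inputs)
    pending = list(nodes)
    while pending:
        for node in pending:
            out, fn, deps = node
            if all(d in values for d in deps):   # all input wires have data
                values[out] = fn(*[values[d] for d in deps])
                pending.remove(node)
                break
        else:
            raise RuntimeError("deadlock: some node's inputs never arrive")
    return values

# Scale, then average, wired like a two-node block diagram; note the nodes
# are listed "backwards" but still execute in dependency order:
graph = [
    ("mean",   lambda xs: sum(xs) / len(xs),      ["scaled"]),
    ("scaled", lambda xs, g: [x * g for x in xs], ["samples", "gain"]),
]
result = run_dataflow(graph, {"samples": [1.0, 2.0, 3.0], "gain": 2.0})
print(result["mean"])   # 4.0
```

The point of the analogy is the one the article makes: execution order falls out of how the data flows, much like reading a flow chart, rather than from the line-by-line sequencing of a text-based language.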
Although LabVIEW includes ready-to-run libraries required for most applications, it also is an open development environment based on its unique graphical programming paradigm. Standardization of software relies greatly on its ability to work well with other software, measurement and control hardware, and open standards, which define interoperability between multiple vendors. By selecting software that meets these criteria, scientists ensure that their company and applications take advantage of products offered by several suppliers. A large number of third-party software and hardware vendors develop and maintain hundreds of libraries and instrument drivers to help scientists easily use their products with industry standard development environments, thereby protecting their investment.
Moving beyond the PC
By leveraging the PC and commercial technologies, virtual instrumentation continues to deliver dramatic improvements to measurement and automation. Today, virtual instrumentation also is extending beyond the PC to take advantage of the newest technological innovations. For example, real-time and embedded control have long been the domain of specialized development. Now, advancements in industry-standard technologies such as PDAs and Field Programmable Gate Arrays (FPGAs), including more reliable operating systems, more powerful silicon, and computer-based real-time engineering tools, are introducing new levels of control and determinism to virtual instrumentation. This presents new opportunities for scientists to take on increasingly sophisticated real-time and embedded development. Software can scale from development on the PC to real-time and embedded applications.
Open platforms such as PXI (PCI eXtensions for Instrumentation) make it simple to integrate measurement devices from different vendors into a single system that is easy to modify or expand as new technologies emerge or your application needs change. With a PXI system, you can quickly integrate common measurements such as machine vision, motion control, and data acquisition to create multifunction systems without spending valuable engineering hours making the hardware work together. The open PXI platform combines industry-standard technologies such as CompactPCI and Windows operating systems with built-in timing and triggering to provide a rugged, more expandable system than desktop PCs, while leveraging the same development software tools.
In addition, the Internet has ushered in a new age of data sharing and has spurred new networking and remote computing capabilities of virtual instrumentation that are simply not possible with their stand-alone proprietary counterparts. Virtual instrumentation takes advantage of the Internet so scientists can easily publish data to the Web directly from the measurement control device and read data on a handheld personal digital assistant or even on a mobile phone. Through virtual instrumentation, scientists can use the power of the Internet to control instruments remotely or collaborate on projects with colleagues in separate offices or countries.
Advancements in sensor technology also promise new dimensions to virtual instrumentation. A proposed sensor standard, IEEE P1451.4, defines new “smart” analog sensors that contain an embedded memory chip with standardized transducer electronic data sheets (TEDS) that store sensor information and parameters for self-identification and self-description. The sensors include serial digital links for accessing this information for plug and play operation. Using these smart sensors, scientists and engineers can take advantage of improved system configuration and diagnostics, reduced downtime, and improved sensor data management. Engineers and scientists can combine data acquisition and signal conditioning hardware to create measurement systems that communicate with both the analog and digital portions of smart TEDS sensors, read and manage TEDS data, and create and reprogram sensors.
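The self-description idea behind TEDS can be sketched in a few lines. Note that real IEEE P1451.4 TEDS data is bit-packed according to standardized templates; the fixed byte layout, field names, and field sizes below are simplified assumptions for illustration only.

```python
import struct

# Hypothetical, simplified TEDS-like record (the real IEEE P1451.4 format
# is bit-packed per standardized templates). Assumed layout, little-endian:
# manufacturer ID (u16), model number (u16), serial number (u32),
# sensitivity in mV per engineering unit (f32).
TEDS_FORMAT = "<HHIf"

def pack_teds(manufacturer, model, serial, sensitivity_mv):
    """Build the binary record a smart sensor's memory chip might hold."""
    return struct.pack(TEDS_FORMAT, manufacturer, model, serial, sensitivity_mv)

def parse_teds(blob):
    """What a plug-and-play measurement system does on connection:
    read the record and self-configure from it."""
    manufacturer, model, serial, sens = struct.unpack(TEDS_FORMAT, blob)
    return {"manufacturer": manufacturer, "model": model,
            "serial": serial, "sensitivity_mV": sens}

blob = pack_teds(manufacturer=17, model=402, serial=123456, sensitivity_mv=99.8)
info = parse_teds(blob)
print(info["serial"], round(info["sensitivity_mV"], 1))   # 123456 99.8
```

Because the sensor carries its own identity and scaling parameters, the acquisition system can apply the correct sensitivity automatically instead of requiring the user to key it in, which is the configuration and diagnostics benefit described above.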
Sensor vendors are also exploring how to expand plug and play capabilities to legacy sensors. Through a proposed online database of sensor vendors’ model data, users could download TEDS binary files, or Virtual TEDS. With Virtual TEDS, they could take advantage of new sensor technology with their traditional computer-based hardware, providing a smooth transition to the next generation of measurement and automation systems.
Advances in commercial technology, from PCs, FPGAs, and ADCs/DACs to real-time operating systems and sensors, will continue driving virtual instrumentation to new heights. Leveraging commercial technologies, the hallmark of virtual instrumentation, saves valuable development and integration time while reducing costs over traditional instrumentation solutions. No one can predict exactly where the future will take virtual instrumentation, but one thing is clear — the PC and its related technologies will be at the center, and engineers and scientists will be more successful as a result.
Tim Dehne is Senior Vice President, R&D, at National Instruments. He may be contacted at email@example.com.