If the decision has been made to go with off-the-shelf software functionality, it is important to remember that many data analysis programs do not offer data acquisition. This necessitates the purchase of a separate data acquisition program, and the integration of this program with the data analysis and presentation program.
Another scenario dictating the separation of data acquisition and data analysis is when budget constraints do not allow for the purchase of an integrated data acquisition and analysis program. The low-cost option in this case is to procure a program capable only of acquiring data, storing data, and exchanging these data with application programs through standard data exchange formats such as DDE. These data-acquisition-only programs have limited functionality and correspondingly low costs. They are often supplied with data acquisition hardware and bundled into the hardware price.
A data-acquisition-only program can interface to a spreadsheet program via DDE or OLE. The spreadsheet program then can display these data in either a tabular or graphical format. Data also can be analyzed using the standard functions available with the spreadsheet program. This approach can be used for any type of data analysis and presentation program.
Drawbacks to this approach include slow rates of data exchange and a lack of tight integration between the programs. Upgrades to one of the programs may cause difficulties with the other despite the presence of industry-standard data exchange formats. Finally, vendors of data-acquisition-only software tend to be small compared with vendors of data acquisition/analysis/display software, which heightens the risk of obsolescence.
If these drawbacks are not critical, then it is possible to select the best data analysis and presentation program without regard to data acquisition functionality. If a user must have an integrated data acquisition/analysis/presentation solution, then selection of the best available software must be made from a more limited range of offerings.
The task of acquiring data acquisition software consists of six primary steps, ideally followed in the order given below:
1) Define the required data acquisition, analysis, and presentation features.
2) Determine if custom, semi-custom, or non-custom software contains the required features.
3) Select the custom, semi-custom, or off-the-shelf software option based on the application and the available expertise.
4) If the custom option is selected, purchase the required tools and locate the right programmer(s) and the right application expert.
5) If the semi-custom option is selected, purchase the development environment and locate the right configuration expert and the right application expert. This would ideally be an application expert with configuration training and/or experience.
6) If the non-custom option is selected, purchase the off-the-shelf program and locate the right application expert. Have this application expert learn how to use the program.
If the custom option is selected, features are only limited by the ability of the programmer and the resources available. If the off-the-shelf, non-custom option is selected, a careful study must be made of available products to determine the software program best suited for the application.
The first step is to determine what data needs to be collected, and what type of analysis and presentation is required for the data. Options range from simple plotting of an analog variable versus time to extremely sophisticated mathematical analysis of multiple analog and discrete values. Data analysis techniques can be divided into three main areas as shown in Figure 6-4. Each of these types of data analysis software will be examined in detail in sections to follow, starting with the most widely used application: process monitoring.
Process monitoring is the simplest and most widely used of the data analysis techniques. Typical application examples include:
Monitoring the setpoint, the output, and the process variable versus time to aid in PID loop tuning;
Plotting one analog variable versus a second analog variable;
Alarming variables when alarm points are reached, or when a variable exhibits certain behaviors such as a rapid change in value;
Filtering variables to eliminate noise problems;
Plotting historical data, including overlaying current data on historical data for analysis; and
Comparing sets of measured variables to desired results (whether manually entered or automatically generated).
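Several of these monitoring functions reduce to a few lines of code. The sketch below is illustrative only; the readings, window size, and alarm threshold are arbitrary assumptions, not values from any particular package. It shows a simple moving-average filter for noise suppression and a rate-of-change alarm:

```python
def moving_average(samples, window=5):
    """Simple moving-average filter to suppress noise."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def rate_alarm(samples, max_delta):
    """Return indices where the value changes faster than max_delta per sample."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > max_delta]

readings = [10.0, 10.1, 10.0, 14.5, 10.2, 10.1]
print(rate_alarm(readings, 2.0))  # -> [3, 4]: the jump up and the jump back
```

In practice the alarm logic would run against live data as each sample arrives, rather than over a stored list.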
As its name implies, process monitoring is typically used in process industries to analyze the batch and continuous production of various products. At the most basic level, software-based systems replace functions previously performed by panel instruments. This is done by creating a numerical and/or a graphical representation of the panel instrument. For example, a display of analog data is an emulation of a digital panel meter. Bar-graph meters become bar-graph displays. A chart recorder is depicted as a graph with different colored trend lines replacing multi-colored inks. If correctly implemented, these techniques allow an operator only familiar with panel instruments to become instantly familiar with data displayed on a CRT.
Panel instrument replacement is a basic function of virtually all process monitoring software. The software also adds many other capabilities that build on these basic functions. Displayed data can be stored for later analysis or record-keeping purposes. New methods of data visualization can be implemented to create physical representations of plant processes.
For example, the liquid in a vessel can be shown to rise and fall based on the value of the corresponding analog variable. Liquids and gases can be shown to fill not only vessels, but also transfer lines. The status of various on/off devices can be shown by changing the color of the item, or by animating the display. A pump rotor can be shown to rotate if the pump is on. The range of display options is limited only by the imagination. The main drawback of this flexibility is the possible incoherence of the data display. Good screen design is required to ensure clear understanding by plant operating personnel.
Another powerful technique available with process monitoring software is the emulation of specialized laboratory instruments such as oscilloscopes. The software can be used along with the PC and the display to create a virtual instrument capable of equaling or outperforming its real-world counterpart. In addition, the virtual instrument can simultaneously emulate many different types of instruments, resulting in substantial cost savings.
The process monitoring features described above emulate the functions of panel and laboratory instruments. Sophisticated mathematical data manipulation software opens up an entirely new realm of data analysis and presentation.
In-depth analysis of data often requires the derivation of mathematical relationships, either among data or between data and other parameters. Many software programs are available to perform these types of functions. Most of these programs are not capable of data acquisition and only perform data analysis and presentation. These data analysis and presentation programs are designed to work in conjunction with data acquisition programs. The cost of these programs is usually less than for programs combining data acquisition with data analysis and presentation. They tend to be widely used not only for the analysis of collected data, but also for off-line analysis of entered or stored data.
This class of software programs can perform calculations on either real-time data (concurrent processing) or stored data (post processing). Multiple channels of data can be plotted against time, frequency, or other data. Zooming is supported along multiple axes. Plotted waveforms can be analyzed to yield minimum, maximum, and average values. Successive waveforms can be averaged to reduce the effects of noise.
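Assuming waveforms are held as plain lists of samples, the waveform statistics and noise-reducing averaging described above can be sketched as follows (an illustration, not any vendor's implementation):

```python
def waveform_stats(wave):
    """Minimum, maximum, and average value of one waveform."""
    return min(wave), max(wave), sum(wave) / len(wave)

def average_waveforms(waves):
    """Point-by-point average of successive waveforms (noise reduction)."""
    n = len(waves)
    return [sum(samples) / n for samples in zip(*waves)]

w1 = [1.0, 2.0, 3.0]
w2 = [3.0, 2.0, 1.0]
print(waveform_stats(w1))           # -> (1.0, 3.0, 2.0)
print(average_waveforms([w1, w2]))  # -> [2.0, 2.0, 2.0]
```

Averaging works because random noise tends to cancel across repeated acquisitions while the underlying signal reinforces.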
Data can be stored to disk or other storage medium, either as raw data or in a processed format using data compression. For example, an analog variable might be sampled once a second and analyzed for variability. Only variations greater than one percent would trigger data storage.
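This exception-based storage scheme can be sketched in a few lines; the one-percent deadband and the sample values below are illustrative assumptions:

```python
def deadband_store(samples, threshold=0.01):
    """Store a sample only when it differs from the last stored value
    by more than the threshold fraction (here one percent)."""
    stored = [samples[0]]              # always keep the first sample
    for value in samples[1:]:
        last = stored[-1]
        if last == 0 or abs(value - last) / abs(last) > threshold:
            stored.append(value)
    return stored

readings = [100.0, 100.5, 100.9, 102.0, 102.5]
print(deadband_store(readings))  # -> [100.0, 102.0]
```

Only the 102.0 reading exceeds the one-percent deadband relative to the last stored value, so the other small variations are discarded, compressing the stored record.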
Figure 6-4: Primary Data Analysis Techniques
Process monitoring: Analog and discrete data are collected and displayed in a graphical format, usually on a graph versus time or another variable.
Mathematical data manipulation: Data are analyzed using various high-level mathematical techniques such as fast Fourier transform (FFT), curve fitting, filtering, etc.
Statistical analysis: Data are analyzed using various statistical techniques including X-bar and R-bar charts, frequency histograms, Chi-square analysis, etc.
Most of these programs support standard mathematical calculations of arithmetic, trigonometric, calculus, logic, correlation, and basic statistical functions. These functions can be performed on single data points or a group of data points (waveforms).
One of the most popular mathematical data manipulation techniques is fast Fourier transform (FFT). FFT is used to convert time-domain data to the frequency domain. This allows frequency spectrum analysis of data. One leading program calculates the amplitude and phase for multiple channels of data. The difference in phase angle is calculated to provide the transfer function, impedance, and frequency response. The power spectral density is specified along with the relative power for specified frequency bands. These calculated data can then be displayed on graphs in either linear or log format.
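The time-to-frequency conversion can be illustrated with a direct DFT written in plain Python; a real data acquisition package would use an optimized FFT routine, and the 1 kHz test signal below is an arbitrary example:

```python
import cmath
import math

def dft_spectrum(samples, sample_rate):
    """Direct DFT: returns (frequency, amplitude, phase) for each bin up to
    the Nyquist frequency. Illustrative only; production software would use
    an optimized FFT (this direct form is O(n^2))."""
    n = len(samples)
    spectrum = []
    for k in range(n // 2 + 1):
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        scale = 1.0 if k in (0, n // 2) else 2.0   # one-sided amplitude scaling
        spectrum.append((k * sample_rate / n, scale * abs(s) / n, cmath.phase(s)))
    return spectrum

# Eight samples of a 1 kHz sine sampled at 8 kHz
wave = [math.sin(2 * math.pi * 1000 * t / 8000) for t in range(8)]
spec = dft_spectrum(wave, 8000)
peak = max(spec, key=lambda b: b[1])
print(peak[0], round(peak[1], 3))  # the 1000 Hz bin, amplitude ~1
```

The spectrum correctly concentrates all the signal energy in the 1 kHz bin, which is the essence of converting time-domain data to the frequency domain.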
Another popular mathematical data manipulation technique is curve fitting. Software programs are available to plot data on two (curves) or three axes (shapes), and to analyze these curves and shapes to derive mathematical expressions. Typical curve-fitting mathematical functions are linear, polynomial least squares up to the tenth order, cubic spline, and point-to-point. User-defined expressions can be compared to curves to determine the degree of fit. Curve-fitting equations, both derived and user-defined, can be ranked in terms of best fit.
For each set of data points the standard deviation and the correlation coefficient can be calculated in addition to the curve-fitted equation. Equations can be stored, and the recalled equations can be plotted against real-time or other stored data. Most curve-fitting software programs can also generate curves and shapes from tabular data.
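The basic machinery of least-squares fitting is straightforward. The sketch below fits a straight line (commercial packages extend this to tenth-order polynomials and splines) to hypothetical calibration-style data and reports the correlation coefficient as a goodness-of-fit measure:

```python
import math

def linear_fit(xs, ys):
    """Least-squares straight-line fit y = a*x + b, returning the slope,
    intercept, and correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r

# Hypothetical, nearly linear calibration data
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.1, 2.0, 4.1, 6.0, 8.1]
a, b, r = linear_fit(x, y)
print(round(a, 3), round(b, 3), round(r, 4))
```

An r value close to 1 indicates that the fitted line explains nearly all of the variation in the data, which is how ranking by "best fit" is typically done.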
Some common applications for curve-fitting software include the generation of calibration curves for flowmeters, optimization of process control parameters, and the generation of logistic dose response curves (concentration versus time and dose) for pharmacological research. Built-in kinetic equation functions can be used to study reactions and to describe biochemical responses. Kinetic functions also can be used to study the disappearance of a reactant (decay) and the appearance of a product (formation) over time. These kinetic models can determine the actual order of the formation or the decay.
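As an illustration of the kinetic analysis described above, a first-order decay C(t) = C0*exp(-k*t) can be linearized by taking logarithms, so the rate constant k falls out of an ordinary least-squares line through (t, ln C). The data below are synthetic, and the helper is a sketch rather than any package's built-in kinetic function:

```python
import math

def decay_rate(times, concentrations):
    """Estimate the first-order rate constant k and initial value C0 for
    C(t) = C0 * exp(-k*t) via a least-squares line through (t, ln C)."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    mt, ml = sum(times) / n, sum(logs) / n
    stt = sum((t - mt) ** 2 for t in times)
    stl = sum((t - mt) * (l - ml) for t, l in zip(times, logs))
    slope = stl / stt
    return -slope, math.exp(ml - slope * mt)   # k, C0

t = [0.0, 1.0, 2.0, 3.0]
c = [math.exp(-0.5 * x) for x in t]   # exact first-order decay with k = 0.5
k, c0 = decay_rate(t, c)
print(round(k, 3), round(c0, 3))
```

The same linearization applied to product formation data (with the sign reversed) recovers the formation rate constant.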
Another related type of mathematical data manipulation is peak fitting. Peak-fitting software provides nonlinear peak separation and analysis, primarily for spectroscopy, chromatography, and electrophoresis applications. Overlapping peaks can be located and separated to aid in analysis. Noisy data can be processed using advanced data smoothing and manipulation techniques. Deconvolution techniques can be used to remove the data smearing often induced by data acquisition instruments.
Functions specific to spectroscopy applications include the identification of different nonlinear spectral line shapes, including the Gaussian, the Lorentzian, and the Voigt. Information reported for each peak includes amplitude, area, center, and width data. Overall area is determined by integrating the peak equations.
Chromatography applications are supported by equations such as the Exponentially-Modified Gaussian, the Haarhoff-Van der Linde, the NLC, and the Giddings. Data reported include column efficiency, resolution, first moment, second moment, and center as well as peak width at base and half maximum, and asymmetry at base and 10% of maximum. Electrophoresis applications can be analyzed by separation of up to 100 peaks, even if they overlap.
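Commercial peak-fitting packages fit nonlinear line shapes such as the Gaussian and the Voigt to separate overlapping peaks; the much simpler sketch below merely measures the reported quantities (amplitude, center, width at half maximum, and area) directly from sampled data of a single synthetic Gaussian peak:

```python
import math

def peak_parameters(xs, ys):
    """Report amplitude, center, full width at half maximum, and
    trapezoidal area for a single baseline-corrected peak."""
    amplitude = max(ys)
    center = xs[ys.index(amplitude)]
    half = amplitude / 2.0
    above = [x for x, y in zip(xs, ys) if y >= half]
    width = above[-1] - above[0]           # crude FWHM from sampled points
    area = sum((ys[i] + ys[i + 1]) / 2.0 * (xs[i + 1] - xs[i])
               for i in range(len(xs) - 1))
    return amplitude, center, width, area

# Synthetic Gaussian peak centered at x = 5 with unit sigma
xs = [i * 0.1 for i in range(101)]
ys = [math.exp(-((x - 5.0) ** 2) / 2.0) for x in xs]
amp, center, width, area = peak_parameters(xs, ys)
print(amp, center, round(width, 2), round(area, 3))
```

For a unit Gaussian the true FWHM is about 2.35 and the true area is about 2.507; the sampled estimates land close to these, which is the kind of cross-check a fitted line shape makes exact.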
New software programs are constantly being released to address specific types of mathematical analysis applications. Another area where data acquisition software is widely used is statistical analysis.
Statistical analysis software is widely used in industrial applications for quality acceptance, quality control, and statistical process control (SPC). These types of software programs analyze data to derive information with respect to the statistical relationships between or among data points. The most common supported functions include the calculation of the mean, the standard deviation, the range, the moving average, the moving range, and the cumulative sum. Other supported statistical functions include Chi-square analysis, Pareto analysis, distribution analysis, non-parametric analysis, correlation, single variable regression, multiple linear regression, and random number generation.
Many different types of control charts can be generated including X-bar, R-bar, histograms, and probability plots. These functions and charts are used in a wide variety of industrial applications.
A typical SPC application would be to measure the weight of a box of cereal. The actual weight would be compared to the desired weight and also to the boundary limit high and low weights on an X-bar chart. The software program would produce alarms to show a single unacceptably low or high weight; unacceptable weights could also be anticipated by alarming certain conditions such as successive weights trending towards a boundary limit, successive erratic weights alternating between high and low limits, or a number of sequential measurements close to a high or a low limit. Anticipation of an alarm condition could allow plant operations personnel to correct a problem before it occurs, thus avoiding unacceptable product.
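The cereal-weight example can be sketched as follows; the three-sigma control limits, the run length for the trend rule, and the weights themselves are illustrative assumptions, not values from any particular SPC package:

```python
import statistics

def xbar_limits(weights, sigma_mult=3):
    """Lower control limit, center line, and upper control limit
    for individual weight measurements."""
    mean = statistics.mean(weights)
    sd = statistics.stdev(weights)
    return mean - sigma_mult * sd, mean, mean + sigma_mult * sd

def trend_alarm(weights, run=5):
    """Alarm when `run` successive weights move steadily in one direction,
    anticipating a drift toward a control limit before it is crossed."""
    for i in range(len(weights) - run + 1):
        window = weights[i:i + run]
        diffs = [b - a for a, b in zip(window, window[1:])]
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            return True
    return False

baseline = [500.1, 499.8, 500.3, 499.9, 500.0, 500.2, 499.7, 500.1]
lcl, center, ucl = xbar_limits(baseline)
print(trend_alarm([500.0, 500.2, 500.5, 500.9, 501.4]))  # -> True: steady upward drift
```

Each in-control weight falls between lcl and ucl; the trend rule fires on the steadily rising sequence even though no single weight has yet crossed a limit, which is what allows the problem to be corrected before unacceptable product is made.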
Many of these software programs were originally written for applications not involving data acquisition. Data were expected to be entered manually or imported from other software programs. Support of data interfaces is often minimal. Fortunately, there are hardware and software methods available to address this issue.
Once a decision has been made as to what type of data analysis and presentation is required, the next step is to ensure a compatible interface with data acquisition instruments. Ideally, software selection would be made first based on the features and options offered by the selected software vendor. The data acquisition hardware would then be selected based on compatibility with the software.
This approach is not always viable for two reasons. One, the data acquisition hardware may already exist or may already have been purchased. Two, the type of data to be acquired may be specialized in nature and require the use of certain instruments with a limited number of data interface options.
Software programs acquire data by interfacing to data acquisition hardware, either plug-in cards or standalone systems. Communication with plug-in cards is via the computer bus, and communication with standalone systems is via industry-standard interfaces such as RS-232, RS-422/485, and IEEE-488. The selected software must be compatible with the data acquisition hardware, or a bridge must be used.
The bridge may be either a software or a hardware solution. A hardware solution accepts inputs from field devices and instruments and converts these inputs into outputs compatible with the selected software. A software solution processes input data and makes these data available to the data presentation and analysis program via a standard dynamic data transfer format such as DDE.
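A minimal software bridge of this kind boils down to parsing incoming frames into tag/value pairs that a presentation program can consume. The "TAG=VALUE" frame format below is hypothetical; a real bridge would read these frames from a serial port and publish the parsed values via DDE or a similar mechanism:

```python
def parse_frames(raw):
    """Parse ASCII frames of the (hypothetical) form 'TAG=VALUE',
    separated by CR/LF, into a tag -> float dictionary, skipping
    malformed frames rather than failing."""
    values = {}
    for frame in raw.replace("\r", "\n").split("\n"):
        frame = frame.strip()
        if "=" not in frame:
            continue
        tag, _, text = frame.partition("=")
        try:
            values[tag.strip()] = float(text)
        except ValueError:
            continue   # ignore garbled frames
    return values

stream = "TIC101=72.5\r\nFIC202=13.8\r\nBADFRAME\r\nPIC303=1.02\r\n"
print(parse_frames(stream))  # -> {'TIC101': 72.5, 'FIC202': 13.8, 'PIC303': 1.02}
```

Tolerating malformed frames is deliberate: serial links are noisy, and a bridge that drops a bad frame is more useful than one that halts on it.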