Figure 1. A typical Data Acquisition System.
Since the system is integrated with National Instruments products, it offers direct control of all hardware on the DAQ board from the LabVIEW software. LabVIEW is the emerging standard for visual-programming-based instrumentation control. LabVIEW programs are built from a set of graphical icons (a language called "G") connected with "wires". The combination of a DAQ board and LabVIEW software makes a virtual instrument, or vi. A vi performs like an instrument and is programmable in software, with the added flexibility of logging the data being measured.
Traditionally, measurements are made on stand-alone instruments of various types: oscilloscopes, multimeters, counters, etc. However, the need to record measurements and process the collected data for visualization has become increasingly important.
There are several ways in which data can be exchanged between instruments and a computer. Many instruments have a serial port that can exchange data with a computer or another instrument. A GPIB (General Purpose Interface Bus) board allows instruments to transfer data in a parallel format and gives each instrument an identity within a network of instruments. All HP instruments and PCs in the EE Undergraduate Laboratories are equipped with GPIB interfaces.
Another way to measure signals and transfer the data into a computer is to use a Data Acquisition (DAQ) board. A typical commercial DAQ card contains an ADC and a DAC that allow input and output of analog signals, in addition to digital input/output channels.
In the following overview we will attempt to explain various aspects of a DAQ card and DAQ system used in the EE Undergraduate Lab.
The data is acquired by an ADC using a process called sampling. Sampling an analog signal involves taking values of the signal at discrete times. The rate at which the signal is sampled is known as the sampling frequency. This process generates a sequence of signal values at regular time intervals, as shown in the following figure.
Figure 2. Sampling Process
The sampling frequency determines the quality of the converted analog signal: a higher sampling frequency gives a more faithful conversion. The minimum sampling frequency required to represent the signal is twice the maximum frequency of the analog signal under test (this is called the Nyquist rate). An example of sampling is shown in the following figure. If the sampling frequency is less than twice the frequency of the input signal, the sampled data appears as a signal of lower frequency (this is called aliasing).
Figure 3. Effects of sampling and aliasing due to undersampling
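The aliasing effect described above can be sketched numerically. The following minimal Python example (illustrative only, assuming an ideal sampler) computes the apparent frequency a sampled sine wave will appear to have:

```python
# A sine at frequency f sampled at rate fs is indistinguishable from
# a sine at |f - fs * round(f / fs)|, the "apparent" frequency.

def alias_frequency(f, fs):
    """Apparent frequency (Hz) of a sine at f Hz sampled at fs Hz."""
    return abs(f - fs * round(f / fs))

print(alias_frequency(100, 1000))   # below Nyquist: reproduced faithfully as 100 Hz
print(alias_frequency(900, 1000))   # above Nyquist: aliases down to 100 Hz
```

Note that the 100 Hz and 900 Hz inputs produce identical samples at a 1 kHz rate, which is exactly why an anti-aliasing filter or a sufficiently high sampling frequency is needed.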
Once the signal has been sampled, the analog samples must be converted into a digital code. This process, called analog-to-digital conversion, is shown in Figure 4.
Figure 4. Analog to Digital Conversion for a 3-bit ADC
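To make the 3-bit conversion concrete, here is a small Python sketch (an idealized model, not the board's actual converter) that maps an input voltage to one of the 2^3 = 8 output codes:

```python
def adc_code(voltage, full_scale=10.0, bits=3):
    """Idealized ADC: map a voltage in [0, full_scale] to an integer code."""
    levels = 2 ** bits                 # 8 codes for a 3-bit ADC
    lsb = full_scale / levels          # width of one code step
    code = int(voltage // lsb)         # truncate down to the step below
    return min(code, levels - 1)       # the top code covers the last step

print(adc_code(0.0))   # code 0
print(adc_code(6.6))   # 6.6 V falls in the step from 6.25 V to 7.5 V: code 5
```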
Commercially available boards have different sampling frequencies. The DAQ boards in the EE Undergraduate Lab have a 12-bit ADC with a sampling rate of up to 40 kHz. There are also three high-speed DAQ boards with a 12-bit ADC and a sampling rate of up to 100 kHz.
Most boards also have a multiplexer that acts like a switch between the different channels and the ADC. Therefore, with one ADC it is possible to have a multichannel-input DAQ board. All boards in the EE Undergraduate Lab have 16 analog input channels, which makes it possible to acquire up to 16 analog signals in parallel (however, the sampling frequency is divided by the number of channels scanned).
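The per-channel rate trade-off mentioned above is simple arithmetic; a short Python sketch (illustrative only, using the lab boards' figures) makes it explicit:

```python
def per_channel_rate(aggregate_rate_hz, num_channels):
    """Multiplexed boards share one ADC, so the board's aggregate
    sampling rate is divided among the channels being scanned."""
    return aggregate_rate_hz / num_channels

# A 40 kHz board scanning all 16 channels samples each one at only 2.5 kHz.
print(per_channel_rate(40_000, 16))   # 2500.0
```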
The precision of the analog input signal converted into digital format depends on the number of bits the ADC uses. The resolution of the converted signal is a function of the number of bits used to represent the digital data. The higher the resolution, the larger the number of divisions the voltage range is broken into, and therefore the smaller the detectable voltage change. An 8-bit ADC gives 256 levels (2^8), compared with a 12-bit ADC, which has 4096 levels (2^12). Hence, a 12-bit ADC can detect smaller increments of the input signal than an 8-bit ADC. All DAQ boards in the EE Undergraduate Lab have a resolution of 12 bits. The LSB, or least significant bit, is defined as the minimum voltage increment an ADC can resolve; hence, the LSB varies with the operating input voltage range of the ADC. Figure 5 illustrates the resolution of a 3-bit ADC, where FS stands for full scale. If the full scale of the input signal is 10 V, then the LSB for a 3-bit ADC corresponds to 10/2^3 = 1.25 V. That is not very good! For a 12-bit ADC, however, the LSB is 10/2^12 = 10/4096 = 2.44 mV. If one needs to detect smaller changes, one has to use a higher-resolution ADC. Clearly, resolution is an important characteristic of a DAQ board.
Figure 5. Resolution of ADC, X axis is analog input
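The LSB values quoted above follow directly from the full scale and the bit count; a brief Python check (assuming the 10 V full-scale range used in the text):

```python
def lsb(full_scale_v, bits):
    """Smallest voltage step an ADC with the given bit count can resolve."""
    return full_scale_v / 2 ** bits

print(lsb(10, 3))    # 1.25 V for a 3-bit ADC
print(lsb(10, 12))   # about 0.00244 V (2.44 mV) for a 12-bit ADC
```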
Ideally, if the voltage applied to the input of an ADC is increased linearly, we would expect the digital codes to increment linearly, as shown in Figure 6.
Figure 6. Transfer characteristic of an ideal ADC.
A perfect DAQ board would have no non-linearity, but most commercially available boards display some. This deviation is specified as differential non-linearity. The following figure (Figure 7) shows the effect of non-linearity.
Figure 7. Differential Non-linearity of ADC
On a typical board, the analog signal is first selected by a multiplexer and then amplified before it is converted by the ADC. The amplifier between the multiplexer and the ADC must be able to track the output of the multiplexer; otherwise the ADC will convert a signal that is still in transition from the previous channel's value to the current channel's value. Poor settling time is a major problem because it varies with the sampling rate and the gain of the DAQ board.
Data Transfers to the computer
Typically, DAQ boards are installed in a PC on a high-speed data bus such as PCI. Depending on the speed of the PC's motherboard, data transfers between the microprocessor and memory can occur at 20 MHz to 40 MHz. To improve transfer rates, bus mastering (allowing the DAQ board to transfer data directly to memory) is implemented, as shown in Figures 8 and 9.
Figure 8. Data transfer without bus mastering (conventional)
Figure 9. Data transfer with bus mastering (used in expensive DAQ boards)
As you may now conclude, sampling frequency and resolution are very important factors in determining the performance of a DAQ card. In addition to sampling speed, however, other factors can affect the functionality of a DAQ system.
Digital to Analog Converter
Multifunction boards also have on-board digital-to-analog converters (DACs). A DAC generates an analog output from a digital input, which allows the board to produce analog signals, both dc and ac. Like the ADC, a DAC's performance is limited by the number of samples it can process and by the number of bits used to convert the digital code into an analog signal.
Figure 10. Sine wave generation from a 3-bit digital code.
Figure 10 shows how a sinusoidal waveform is generated by a 3-bit DAC. A DAC with a small settling time and a high slew rate (the rate at which an amplifier can respond to a change in input) can generate high-frequency signals, because little time is needed to accurately move the output to a new voltage level.
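As an illustration of the staircase output in Figure 10, here is a Python sketch (an idealized model, with arbitrary sample count and amplitude) that quantizes one period of a sine wave to the 8 output levels of a 3-bit DAC:

```python
import math

def quantized_sine(n_samples=16, bits=3, amplitude=1.0):
    """One sine period snapped to the 2**bits evenly spaced output
    levels spanning [-amplitude, +amplitude], as a coarse DAC emits."""
    levels = 2 ** bits
    out = []
    for i in range(n_samples):
        x = amplitude * math.sin(2 * math.pi * i / n_samples)
        # Pick the nearest of the `levels` output codes, then map back to volts.
        code = round((x + amplitude) / (2 * amplitude) * (levels - 1))
        out.append(code / (levels - 1) * 2 * amplitude - amplitude)
    return out

wave = quantized_sine()
print(sorted(set(wave)))   # at most 8 distinct output levels
```

With only 8 levels the staircase is coarse; a 12-bit DAC (4096 levels) would track the sine far more closely.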
Using high-performance DAQ cards, fast computers, and data-processing software like LabVIEW, one can achieve performance similar to that of expensive bench-top instruments. Virtual instruments (vis) can therefore control an output, process the input signals, and log the data.
A DAQ system in the EE Undergraduate Lab is shown below:
Figure 11. A typical DAQ card and accessories.
The DAQ system shown above is an E-series board made by National Instruments. This board has a sampling frequency of 100 kHz, a 12-bit ADC, a counter, a timer, digital inputs/outputs, and a DAC. For detailed specs, please see the board specifications.
The following figure shows a complete DAQ system with LabVIEW. The driver software is a lower-level driver that interfaces the LabVIEW software with the DAQ boards; as a LabVIEW user, one does not have to worry about configuring and controlling the components within the DAQ boards. LabVIEW identifies each board by a device number, so one can have as many devices as the computer can accept in its expansion slots. LabVIEW can also combine and display inputs from various sources, such as the serial and parallel ports, data acquisition board(s), and GPIB boards, on a single interface, as shown in the figure below.
Figure 12. LabVIEW Software and DAQ system
LabVIEW is programmed with a set of icons that represent the controls and functions available in the software's menus. This style of programming is called visual programming, and National Instruments calls it G. The user interface, called a vi, consists of two parts: a front panel and a diagram. This is similar to an instrument, where the front panel holds the input and output controls and displays the data, while the circuitry resides on the circuit board. Similarly, you can place buttons, indicators, and graphing and display functions on the front panel, as shown below:
Figure 13. Example of different front panel vis
One can configure a vi to include functions and graphs that are fully customizable. In the example shown below, a temperature-monitoring system acquires data from a thermometer and plots it on a strip-chart recorder. The vi also calculates the mean and standard deviation of the data and plots them. Alarms have been set up so that if the temperature falls below or rises above a set value, the alarm turns on.
Figure 14. Temperature System vi front panel
The diagram of such a vi can look very complex, but surprisingly it is not very difficult to learn; there is no syntax to be learned in LabVIEW. Please read "My first vi" to understand how a simple data acquisition can be done in LabVIEW. The diagram of the vi whose front panel is shown above appears in Figure 15.
Figure 15. Temperature System Demo diagram
When data acquisition is performed, the software needs configuration information such as the device number, the input channels, the sampling rate, and the number of samples to acquire.
LabVIEW is often used to perform system simulations, since it contains many commonly used filter, digital signal processing, and statistical functions. LabVIEW code is compiled and runs almost as fast as C or Matlab, so one can perform a complete simulation within a vi. The following figure shows various functions available in the LabVIEW Functions menu.
Figure 16. Trigonometric Functions
The examples shown below are available from the Functions menu.
Figure Statistical Functions
Figure 17. Regression functions
In addition to data input/output, LabVIEW can access serial ports, parallel ports, and GPIB cards to read data from instruments that have a GPIB interface. As you can see, the possibilities of the "virtual instrument" are almost limitless, and they expand the measurement capabilities of the EE Undergraduate Laboratory considerably. User manuals and detailed vi reference manuals are available in the EE Undergraduate Lab for an in-depth understanding of different aspects of LabVIEW.
1. "LabVIEW - Graphical Programming for Instruments, User Manual", National Instruments.
2. L.K. Wells, "LabVIEW Student Edition User's Guide", National Instruments, 1995.
3. "Data Acquisition Basics Manual", National Instruments, 1996.