Chapter 2 Analog Input
Instrumentation Amplifier (NI-PGIA)
The NI programmable gain instrumentation amplifier (NI-PGIA) is a
measurement and instrument class amplifier that guarantees minimum
settling times at all gains. The NI-PGIA can amplify or attenuate an AI
signal to ensure that you use the maximum resolution of the ADC.
E Series devices use the NI-PGIA to deliver full 16- and 12-bit accuracy
when sampling multiple channels at high gains and fast rates. E Series
devices can sample channels in any order at the maximum conversion rate,
and you can individually program each channel with a different input
polarity and range, as discussed in the Input Polarity and Range section.
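For illustration only (this sketch is not from this manual), per-channel ranges can be set when channels are created with the NI-DAQmx C API, which supports E Series devices; the device name Dev1 and the terminal configuration are assumptions. A narrower range causes the NI-PGIA to apply a higher gain, so a small signal still spans most of the ADC codes.

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[2];
        int32      read = 0;

        DAQmxCreateTask("", &task);

        /* ai0 measures a large signal: full +/-10 V range (low NI-PGIA gain). */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);

        /* ai1 measures a small signal: +/-50 mV range (high NI-PGIA gain). */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai1", "", DAQmx_Val_RSE,
                                 -0.05, 0.05, DAQmx_Val_Volts, NULL);

        /* Read one sample from each channel; each uses its own range. */
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                           data, 2, &read, NULL);
        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }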
A/D Converter
The analog-to-digital converter (ADC) digitizes the AI signal by
converting the analog voltage into a digital number.
AI FIFO
A large first-in-first-out (FIFO) buffer holds data during A/D conversions
to ensure that no data is lost. E Series devices can handle multiple A/D
conversion operations with DMA, interrupts, or programmed I/O.
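As a hypothetical sketch (not from this manual), the following NI-DAQmx C fragment configures a hardware-timed, continuously buffered acquisition; the driver selects a transfer mechanism such as DMA to move samples from the onboard AI FIFO into a host buffer between reads. The device name Dev1, the sample rate, and the buffer sizes are assumptions.

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[1000];
        int32      read = 0;
        int32      i;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_RSE,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);

        /* Hardware-timed, continuous acquisition at 10 kS/s. The driver
           drains the AI FIFO into a host buffer so no data is lost. */
        DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);
        DAQmxStartTask(task);

        for (i = 0; i < 10; i++) {
            DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               data, 1000, &read, NULL);
            printf("Read %d samples\n", (int)read);
        }

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }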
Analog Trigger
Refer to the Analog Input Triggering section for information about the trigger circuitry of E Series devices.
AI Timing Signals
Refer to the Analog Input Timing Signals section for information about the analog input timing signals available on E Series devices.
Input Polarity and Range
You can individually program the input range of each AI channel on your
E Series device. Input range refers to the set of input voltages that an analog
input channel can digitize with the specified accuracy.
The input range affects the resolution of the E Series device for an AI
channel. Resolution refers to the voltage of one ADC code. For example, a
16-bit ADC converts analog inputs into one of 65,536 (= 2^16) codes, that is, one of 65,536 possible digital values. These values are spread fairly evenly across the input range.
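As a concrete hypothetical case, on a +/-10 V input range a 16-bit converter resolves one code per 20 V / 65,536, or about 305 µV, as this short sketch computes:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical example: 16-bit ADC on a +/-10 V input range. */
        const double span_v = 10.0 - (-10.0);  /* total input span: 20 V */
        const double codes  = 65536.0;         /* 2^16 possible values   */

        /* Voltage of one ADC code: input span divided by code count. */
        printf("one code = %.1f uV\n", (span_v / codes) * 1e6);  /* ~305.2 */
        return 0;
    }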
