Attribute The precision (number of digits) for all numbers printed with the ASCII format.
Usage precision = format.asciiprecision -- Reads precision.
format.asciiprecision = precision -- Writes precision.
precision Precision setting (1 to 16).
Remarks • This attribute sets the precision (number of digits) for data printed with the print, printnumber, and printbuffer functions. The precision setting applies only to the ASCII format and must be a number from 1 to 16.
• Note that the precision is the total number of significant digits printed. One digit always appears to the left of the decimal point; be sure to include this digit when setting the precision (see the sketch following these remarks).
• The default (reset) precision is 6.
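The following sketch is not taken from the manual; it assumes the ASCII data format is selected and shows how the leading digit counts toward the precision (the exact exponent formatting, e versus E, may vary by firmware):
format.data = format.ASCII -- Precision applies to ASCII output only.
format.asciiprecision = 3 -- Three significant digits.
printnumber(2.54) -- Output: 2.54e+00 (the leading 2 counts as one digit).
format.asciiprecision = 6 -- Default precision.
printnumber(2.54) -- Output: 2.54000e+00 (six significant digits).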
Also see format.byteorder, format.data, printbuffer, printnumber
Example Sets the ASCII precision to 7 digits and prints a number:
format.asciiprecision = 7
print(2.5)
Output: 2.500000E+00
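A further sketch (not from the manual): lowering the precision before transferring a large reading buffer reduces the amount of ASCII data sent. It assumes readings are already stored in smua.nvbuffer1:
rb1 = smua.nvbuffer1 -- Reading buffer assumed to hold stored measurements.
format.data = format.ASCII -- Precision applies to ASCII output only.
format.asciiprecision = 4 -- Four significant digits per reading.
printbuffer(1, rb1.n, rb1) -- Print all stored readings at the reduced precision.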