display temperature value = (sample temperature + temperature difference δ)/10. The
displayed temperature then equals, or is close to, the real temperature. This
parameter is signed (positive or negative). Unit: 0.1℃; default value: 0.
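The display formula above can be sketched in Python (function and variable names are illustrative, not taken from the device):

```python
def display_temperature(sample_temp: int, delta: int) -> float:
    """Compute the displayed temperature in degrees C.

    Both sample_temp and delta are in units of 0.1 C; delta is signed.
    """
    return (sample_temp + delta) / 10

# A sample value of 550 with delta = 0 displays as 55.0 C.
```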
The control cycle ranges from 0.5 s to 200 s, with a minimum resolution of 0.1 s. The
value written is the real control cycle multiplied by 10; i.e. a 0.5 s control cycle is
written as 5, and a 200 s control cycle as 2000.
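The seconds-to-register conversion above can be sketched as follows (the function name is illustrative):

```python
def control_cycle_register(seconds: float) -> int:
    """Convert a control cycle in seconds to the written register value (x10).

    Valid range per the manual: 0.5 s to 200 s, in 0.1 s steps.
    """
    if not 0.5 <= seconds <= 200.0:
        raise ValueError("control cycle must be between 0.5 s and 200 s")
    return round(seconds * 10)

# control_cycle_register(0.5) -> 5, control_cycle_register(200) -> 2000
```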
If the displayed temperature differs from the actual ambient temperature, the user can
write in the known temperature value. At the moment the value is written, the
temperature difference δ is calculated and saved:
temperature difference δ = adjusted ambient temperature value − sampled
temperature value. Unit: 0.1℃.
E.g.: under heat-balance conditions, the user measures the ambient temperature as
60.0℃ with a mercury thermometer, while the display shows 55.0℃ (corresponding
sample temperature 550) and δ = 0. The user then writes 600 to this parameter;
δ is recalculated as 600 − 550 = 50 (5℃), so the
display temperature = (sample temperature + temperature difference δ)/10 = 60.0℃.
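The worked example above can be reproduced in a short sketch (names are illustrative; values in 0.1 ℃ units as in the manual):

```python
def recalibrate(sample_temp: int, written_temp: int) -> int:
    """delta = written ambient temperature - sampled temperature (0.1 C units)."""
    return written_temp - sample_temp

# User writes 60.0 C (600) while the sampled temperature is 55.0 C (550):
delta = recalibrate(550, 600)       # delta becomes 50, i.e. 5.0 C
display = (550 + delta) / 10        # displayed temperature after calibration
```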
**Note: when writing the adjusted temperature value, make sure it equals the actual
ambient temperature. This value is critical: if it is wrong, the temperature
difference δ will be wrong, which in turn corrupts the displayed temperature.
The output during auto-tuning uses % as the unit: 100 represents 100% of full-scale
output, and 80 represents 80% of full-scale output.
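Interpreting the auto-tune output value can be sketched as a simple percent-to-fraction conversion (assuming, as stated above, that the register holds a whole-number percentage of full scale):

```python
def full_scale_fraction(percent: int) -> float:
    """Convert an auto-tune output value (0-100, % of full scale) to a fraction."""
    if not 0 <= percent <= 100:
        raise ValueError("auto-tune output must be between 0 and 100 %")
    return percent / 100

# full_scale_fraction(100) -> 1.0 (full output); full_scale_fraction(80) -> 0.8
```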