I am using a PR33-34 (labeled PR33-14). The LSB conversion factor listed on the NCD web page is 0.0007346 V/lsb. After reverse engineering the front end of the PR33-34, the calculated value appears to be 0.000741 V/lsb. Then comes the empirical data: I connected a power supply and a calibrated DMM to one channel of the PR33-34 and stepped the voltage 1 volt at a time up to 24 V.
(PR33-34, Channel 4, 16-bit sample size, Gain = 1)
| Input (V) | Raw Hex | Raw Dec | Conversion Factor (V/lsb) |
|-----------|---------|---------|---------------------------|
| 1         | 0x0497  | 1175    | 0.000851                  |
| 5         | 0x199E  | 6558    | 0.000762                  |
| 10        | 0x33F8  | 13304   | 0.000752                  |
| 15        | 0x4E4F  | 20047   | 0.000748                  |
| 20        | 0x68B3  | 26803   | 0.000746                  |
| 24        | 0x7DCC  | 32204   | 0.000745                  |
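For what it's worth, a least-squares line fit over these six points suggests the drifting per-point factor may just be a fixed offset folded into the division (dividing volts by raw counts point-by-point includes any offset, which dominates at 1 V). A minimal sketch, using only the measurements from the table above:

```python
# Fit volts = slope * raw + offset to the measured points
# (PR33-34, channel 4, gain = 1) by ordinary least squares.
volts = [1, 5, 10, 15, 20, 24]
raw = [0x0497, 0x199E, 0x33F8, 0x4E4F, 0x68B3, 0x7DCC]

n = len(raw)
sx, sy = sum(raw), sum(volts)
sxy = sum(x * y for x, y in zip(raw, volts))
sxx = sum(x * x for x in raw)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # V per lsb
offset = (sy - slope * sx) / n                     # V

print(f"slope  = {slope:.6f} V/lsb")  # ~0.000741, matching the calculated value
print(f"offset = {offset:.3f} V")     # a small positive offset, ~0.14 V
```

The fitted slope comes out at roughly 0.000741 V/lsb, so the low-end "nonlinearity" in the per-point factors may largely be this offset rather than a gain error.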
So a number of questions have come to mind:
- Have I made a mistake when calculating the conversion factor? 0.0007346 stated vs. 0.000741 calculated.
- Is it normal for the low end of the input range to be nonlinear (with regard to the conversion factor)?
- Should one include a separate calibration factor for each channel of the PR33-XX board?
- Is there a user's guide available for the PR33 series that documents the various resistor values used to set the front-end gain? My board is clearly mislabeled (as a PR33-14 when it is really a PR33-34); it took me a while to figure this out. I noticed this issue in another post as well.
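To make the per-channel calibration question concrete, here is a hypothetical sketch of what I have in mind: store a (slope, offset) pair per channel from a two-point calibration and apply it when converting raw counts to volts. All names here are my own, not anything from the NCD API, and the calibration points are the 5 V and 20 V readings from the table above.

```python
# Hypothetical per-channel calibration: derive (slope, offset) from two
# known points and apply it per channel. Not an NCD API; names are made up.
def two_point_cal(raw_lo, v_lo, raw_hi, v_hi):
    """Return (slope, offset) such that volts = slope * raw + offset."""
    slope = (v_hi - v_lo) / (raw_hi - raw_lo)
    offset = v_lo - slope * raw_lo
    return slope, offset

# Channel 4, calibrated on the 5 V and 20 V points measured above.
cal = {4: two_point_cal(0x199E, 5.0, 0x68B3, 20.0)}

def raw_to_volts(channel, raw):
    slope, offset = cal[channel]
    return slope * raw + offset

# Sanity check against the 15 V reading; result lands close to 15.0
print(raw_to_volts(4, 0x4E4F))
```

If each channel really does need its own factor, something like this per-channel table seems easier to maintain than a single board-wide constant.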
Any insight on these questions would be appreciated.