PR33-16 : Confusing Baseline Values

Hello all,

I’m trying to get my PR33-16 working, and I have a small C# script that can poll the two chips and pull the readings from them.

Thing is, the readings coming out have no correlation to reality. I’m trying to read two 24 V loop-powered 4-20 mA transducers and six independently powered 4-20 mA transducers. However, even with everything else powered off, I’m getting random values between -43 and 34 mA.

My conversion calculation follows the documentation, and is transcribed below:

            dataConv = dataIn[3];
            dataConv = (short)((int)dataConv << 8);
            dataConv += dataIn[4];
            double result = dataConv * 0.001380;

As for configuration, I’m using a gain of 2, continuous conversion mode (I tried one-shot mode too, but that didn’t change anything), and a sample rate of 15 samples/sec (16 bits). My round-robin measurement time is 100 ms per channel, plus a 1000 ms delay between iterations.
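For reference, here is how I’m building the configuration byte. I’m assuming the on-board ADCs are MCP342x-family parts (that’s an assumption on my part, not something from the PR33-16 docs; the 15 samples/sec at 16 bits and gain-of-2 combination happens to match that datasheet):

```csharp
using System;

// Assumed MCP342x-family configuration register layout:
//   bits 6-5: channel select
//   bit  4:   conversion mode (1 = continuous, 0 = one-shot)
//   bits 3-2: sample rate (0b10 = 16-bit, 15 samples/sec)
//   bits 1-0: PGA gain (0b01 = x2)
byte Mcp342xConfig(int channel, bool continuous, int rateBits, int gainBits)
{
    return (byte)(((channel & 0x03) << 5)
                | ((continuous ? 1 : 0) << 4)
                | ((rateBits & 0x03) << 2)
                | (gainBits & 0x03));
}

// My settings: channel 0, continuous, 16-bit/15 SPS, gain x2.
byte cfg = Mcp342xConfig(0, true, 0b10, 0b01);
Console.WriteLine($"config byte: 0x{cfg:X2}");
```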

If anyone has some advice, I am all ears. I’m open to inputting calibration figures, but with a spread of literally the whole range, something else must be the matter.

Update: I figured out the issue.

Turns out, I was misinterpreting the I2C responses: I was using the checksum as the second data byte, and disregarding the first data byte as some sort of flag. The proper return data looks like:
0xAA 0x03 [Data packet 1] [Data packet 2] [Configuration echo] [Checksum]
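For anyone who hits the same thing, here’s a sketch of how I’m decoding it now. This assumes the receive buffer holds the full six-byte frame starting at the 0xAA header; the function names and the 0.001380 mA-per-count scale are just from my own code, not from the PR33-16 docs:

```csharp
using System;

// Pull the signed 16-bit reading out of one response frame:
// 0xAA 0x03 [Data 1] [Data 2] [Configuration echo] [Checksum]
// Data 1 is the high byte and Data 2 the low byte.
short ParseRaw(byte[] frame)
{
    if (frame.Length < 6 || frame[0] != 0xAA)
        throw new ArgumentException("unexpected frame header");
    // The cast to short sign-extends negative readings correctly.
    return (short)((frame[2] << 8) | frame[3]);
}

// Scale factor from my conversion above (assumed mA per count).
double ToMilliamps(short raw) => raw * 0.001380;

byte[] example = { 0xAA, 0x03, 0x12, 0x34, 0x18, 0x00 };
short raw = ParseRaw(example);
Console.WriteLine($"raw = {raw}, current = {ToMilliamps(raw):F3} mA");
```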