1-Channel Amp Sensor Calibration Value Issue

Hello,

We have a 1-Channel Wireless Current (Amp) Sensor modified with a Setra 5-500 Amp CT Clamp.

We were attempting to change the sensor's reporting delay from the default 10 min to 2 min (using Node Red). Unfortunately, the only option that changes the delay also requires entering a new calibration factor for the sensor (side note: this should really be a separate function, IMO).

Previously this has not been an issue, and we have been able to recalibrate the sensor with a new calibration factor.

However, with this one, we were not able to correctly back-calculate the calibration factor based on "actual" vs. "sensor" amps. See the table below. As you can see, we were trying to get the sensor to read about 83 amps, but there was a massive change in sensor readings from a change of only a few thousandths in the calibration factor.

[image: table of calibration factors vs. sensor readings]

What is the issue here?

At a factor of 0.0185 the sensor reported 67 amps, but at a factor of 0.022 it reported 112 amps. No matter what we tried, we couldn't get it to report 83 amps.
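For reference, our back-calculation assumed the reported value scales proportionally with the calibration factor, i.e. new factor = old factor × actual ÷ reported. A minimal sketch (the 83 A target is ours; function name is just illustrative) shows why we could not converge: the two readings imply two different factors, so the response is clearly not proportional here.

```python
# Sketch: back-calculating a calibration factor, assuming the sensor
# reading scales proportionally with the calibration factor.
def back_calculate(old_factor, actual_amps, reported_amps):
    """Factor that would make the sensor report actual_amps,
    given it reported reported_amps at old_factor."""
    return old_factor * actual_amps / reported_amps

# Using the two readings from this post (target 83 A):
print(back_calculate(0.0185, 83, 67))   # ≈ 0.0229
print(back_calculate(0.022, 83, 112))   # ≈ 0.0163  (disagrees with the above)
```

If the sensor really were proportional in the factor, both readings would back-calculate to the same value; they do not, which matches the erratic behavior described above.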

The calibration value doesn't need to be set (it's optional).

This is what we see:

Is this a recent Node Red NCD Package Change?

Regardless, what is the issue with our approach to calibrating the sensor so that it reads the correct value? We are using this approach:

What Node Red library are you using?
Set the calibration value to 87.

Running Node Red 1.2.9

And NCD Packages below.

This was the original Node Red Instance as installed on the Original IoT Edge Computer.

We have other edge computers with updated Node Red/NCD palettes, but this is the one we have deployed.

We will need to try and configure this in a few days when we are on site.

Also, why 87? A calibration value of 0.021 is already too high based on our tests.

We want to start with a known value and work our way through.

Hi Bhaskar, we attempted this calibration again with TWO different edge computers: (1) a new Robustel gateway running the latest version of Node Red/NCD Enterprise palette, and (2) an older NCD edge computer running the Wireless Sensor palette.

No matter what we tried, we were not able to home in on a good value:

1: Using the older edge computer

| Actual Amps | NCD Current Monitor | Calibration Factor |
|---|---|---|
| 93.2 | 12125.856 | 606 |
| 88.2 | 9505.856 | 606 |
| 82 | 2607 | 0.56 |
| 88.1 | 2630 | 0.56 |
| 165.1 | 84.009 | 0.018 |
| 164 | 224.91 | 0.02 |

From this, it appears the cal factor should be between 0.018 and 0.02 (similar to the original post).

2: Using Robustel Gateway (Calibration Factors seem to be 10x that of the older edge computer calibration)

| Actual Amps | NCD Current Monitor | Calibration Factor |
|---|---|---|
| 164.2 | 9078.272 | 606 |
| 165 | 0.872 | 10.96 |
| 163.7 | 1.375 | 10.96 |
| 163 | 1.417 | 10 |
| 164.5 | 44.467 | 1150 |
| 86 | 576.631 | 4254 |
| 163.5 | 0 | 634.45 |
| 163.1 | 0 | 634 |

From this round of testing there was no clear trend emerging.

What could the issue be? This sensor was working perfectly until we tried changing the sensor reporting time (which reset the cal factor and led to this situation).

What else can we do?

What's the current sensor firmware version?

It says Firmware V 2

It gets a bit tricky to test this one because it was custom built.
Let's try this using XCTU:

  1. Set the modem PAN ID to 7BCD.
  2. Send this command to read the calibration value: 7E 00 13 10 00 00 00 00 00 00 00 FF FF FF FE 00 00 F4 02 00 00 0D F1
  3. Send this command to set the calibration value to 500: 7E 00 15 10 00 00 00 00 00 00 00 FF FF FF FE 00 00 F4 01 00 00 0D C3 50 DF
  4. Read the current and share the value.
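For anyone who wants to sanity-check these frames before pasting them into XCTU, here is a small sketch that validates the start delimiter, length field, and checksum of an XBee API frame (checksum = 0xFF minus the low byte of the sum of the frame-data bytes). Both frames above check out; note also that the last two payload bytes of the set command, C3 50, decode to 50000, i.e. 500 × 100.

```python
# Sketch: verifying XBee API frames before sending them from XCTU.
# Frame layout: 0x7E start delimiter, 16-bit length, frame data, checksum,
# where checksum = 0xFF - (sum of frame-data bytes & 0xFF).
def verify_frame(hex_string):
    frame = bytes.fromhex(hex_string.replace(" ", ""))
    assert frame[0] == 0x7E, "missing start delimiter"
    length = (frame[1] << 8) | frame[2]
    data, checksum = frame[3:3 + length], frame[3 + length]
    assert len(data) == length, "length field mismatch"
    return (0xFF - (sum(data) & 0xFF)) == checksum

read_cal = "7E 00 13 10 00 00 00 00 00 00 00 FF FF FF FE 00 00 F4 02 00 00 0D F1"
set_cal  = "7E 00 15 10 00 00 00 00 00 00 00 FF FF FF FE 00 00 F4 01 00 00 0D C3 50 DF"
print(verify_frame(read_cal), verify_frame(set_cal))  # True True
```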

We looked into adding this to Node Red, but it might take a bit.

Hey, OK, we are attempting this now, but we are a bit confused:

Do we connect the 1-channel amp sensor directly to our computer to do this? If so, where is the USB port to connect it?

Since this is a battery-powered sensor that sleeps and wakes every x minutes, XCTU does not "see it", so we're not sure how to send/receive these commands.

  1. Go to XCTU and open the modem (you will need a USB modem).
  2. Change the PAN ID to 7BCD.
  3. Open the XCTU terminal.
  4. [screenshot]
  5. Put the sensor in CFG mode and you will get a 28-byte-long packet.
  6. Click on the + sign.
  7. [screenshot]
  8. Paste the commands I sent here, then click "Send selected frame" one by one and share the response the device sends.

Jacob just pushed a Node Red change where you can configure this using Node Red. Pull the temp branch.


Enable this and send 50000 as the calibration value.

Awesome, thanks for the explanation and for making a temp fork of the code.

We used the Node Red Temp Fork, and here is what we found:

[Screenshot 2024-08-29 at 4.32.58 PM]

[Screenshot 2024-08-29 at 4.31.00 PM]

Basically, when we put in a number like 50,000, we get an absurdly high amp reading.

When we put in ~1.6, we are getting close, but it is still bouncing around.

This is the behaviour we were seeing originally (screenshot from the original post):
[image]

We don't have extra time to play with this at the moment, so we will continue tomorrow. For now, it's very clear that calibrating this device (which has a 400 amp CT clamp) requires a VERY low calibration value (around 1-2, versus the default of 60,600).

What would the reason be for this?

We will continue to play with this a bit.

Question: with this temp fork, are there decimal constraints? In other words, how many decimal places will it accept? It seems to accept at least one.

Thank you as always

OK, so let's set the calibration value to 500 (don't multiply by 100) and measure the current.

Silly question, but does the sensor return an amperage in AMPS or MILLIAMPS?

Because when we received it, it was in amps, but I know other sensors use milliamps.

For the CT you have, the calibration value is 28927.

The sensor sends data in mA, but Node Red converts it to amps.

Hi @Bhaskar, I think I figured out what is going on after conducting more tests:

| Calibration Value | Real Amps | NCD Sensor Amps | New Calculated Calibration Value | Sensor ÷ Actual Ratio | Comments |
|---|---|---|---|---|---|
| 500 | 45.6 | 12488.56 | 1.826 | 273.872 | Reset to 500 cal value |
| 500 | 45.5 | 12493.955 | 1.821 | 274.592 | Reset to 500 cal value |
| 28927 | 45.6 | 6530.432 | 201.988 | 143.211 | Tried default value per Bhaskar (28927) |
| 28927 | 45.6 | 10065.344 | 131.051 | 220.731 | Tried again, no recalibration, just reading. Sensor is reporting inconsistent data. |
| 28927 | 45.5 | 1659.776 | 792.986 | 36.479 | Tried again, no recalibration, just reading. Sensor is reporting inconsistent data. |
| 28927 | 45.4 | 8865.28 | 148.138 | 195.270 | Tried again, no recalibration, just reading. Sensor is reporting inconsistent data. |
| 28927 | 45.5 | 6823.104 | 192.900 | 149.958 | Tried again, no recalibration, just reading. Sensor is reporting inconsistent data. |
| 28927 | 45.6 | 2990.336 | 441.111 | 65.578 | Tried again, no recalibration, just reading. Sensor is reporting inconsistent data. |
| 1.82 | 45.6 | 32.476 | 2.555 | 0.712 | Tried to go to 1.82 as per the initial two readings |
| 1.82 | 45.5 | 32.456 | 2.551 | 0.713 | Consistent |
| 1.82 | 45.6 | 32.605 | 2.545 | 0.715 | Consistent |
| 1.82 | 45.6 | 32.933 | 2.520 | 0.722 | Consistent |
| 2.55 | 45.7 | 57.599 | 2.023 | 1.260 | Trying 2.55 |
| 2.55 | 45.6 | 57.716 | 2.015 | 1.266 | Consistent |
| 2.55 | 45.7 | 57.883 | 2.013 | 1.267 | Consistent |
| 2 | 45.7 | 57.777 | 1.582 | 1.264 | Trying 2 now as per the last calculation |
| 2 | 45.7 | 57.311 | 1.595 | 1.254 | Didn't change calibration |
| 2.0 | 45.7 | 57.317 | 1.595 | 1.254 | Trying 2.0 instead of just 2. Didn't change anything |
| 1.59 | 45.7 | 32.235 | 2.254164728 | 0.70536105 | Trying 1.59 |
| 1.59 | 45.7 | 32.501 | 2.235715824 | 0.711181619 | Now it's back to 32 amps |
| 1.8 | 45.5 | 32.287 | 2.536624648 | 0.709604396 | Just trying 1.8. Didn't seem to work. Seems to be ignoring decimals |
| 1 | 45.7 | 32.33 | 1.413547788 | 0.707439825 | Testing whether 1 (without a decimal) also reports 32 amps, to confirm the decimal theory. Seems to be the case! |
| 1 | 45.7 | 32.339 | 1.413154396 | 0.707636761 | Reports 32 amps; decimals ignored |
| 3 | 45.7 | 81.92 | 1.673583984 | 1.792560175 | Trying a whole number to test my theory |
| 3 | 45.6 | 83.096 | 1.64628863 | 1.822280702 | Consistent reading |
| 10 | 45.7 | 258.767 | 1.766067543 | 5.662297593 | Trying 10 to check scaling. No decimals |
| 10.9 | 45.7 | 253.292 | 1.966623502 | 5.54249453 | Trying 10.9 to see if .9 makes a difference. It doesn't. |

I believe the issue is that the calibration is IGNORING decimals. Based on my testing, the sensor responds linearly when using whole-number calibration values:

| Cal Factor | NCD Sensor Output | Actual Amps |
|---|---|---|
| 1 | 32.3 | 45.6 |
| 2 | 57.7 | 45.6 |
| 3 | 83.1 | 45.6 |
| 10 | 258.7 | 45.6 |
| 500 | 12488.56 | 45.6 |
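As a sanity check on that linearity, a quick least-squares fit over the whole-number rows above (a pure-Python sketch; it assumes the linear trend holds all the way down) gives roughly 25 reported amps per unit of calibration factor, and a needed factor of about 1.5 to report the actual 45.6 A:

```python
# Sketch: least-squares fit of reported output vs. whole-number
# calibration factor, using the table above, then solving for the
# factor that would report the actual 45.6 A (assumes linearity).
cal    = [1, 2, 3, 10, 500]
output = [32.3, 57.7, 83.1, 258.7, 12488.56]

n = len(cal)
mx, my = sum(cal) / n, sum(output) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(cal, output))
         / sum((x - mx) ** 2 for x in cal))
intercept = my - slope * mx

target = 45.6
needed = (target - intercept) / slope
print(f"slope ≈ {slope:.2f} A per unit factor, needed factor ≈ {needed:.2f}")
```

This lands inside the 1.5-1.8 window estimated below, which is exactly the range the integer truncation makes unreachable.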

[image: plot of NCD sensor output vs. calibration factor]

Based on the data presented, it appears that:

  1. The optimal calibration factor should be between 1.5 and 1.8.
  2. However, the system seems to be ignoring decimal places in the calibration factor:
  • Any value between 1.0 and 1.9 is treated as if it were exactly 1.0.
  • When the value reaches 2.0, there's a sudden jump in the sensor's reported output.
  3. This behavior suggests a limitation in how the calibration factor is being processed:
  • It's likely that the software or Node-RED package is truncating the calibration factor to an integer.
  • This prevents the use of the precise calibration values needed for accurate measurements.
  4. Proposed solution:
  • The issue could potentially be resolved by modifying the software or Node-RED package.
  • The modification should allow decimal places in the calibration factor, enabling more precise calibration.
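To make the suspected truncation concrete, here is a minimal sketch. The `reading_for` function and its lookup table are hypothetical (the values are the readings observed above at an actual ~45.6 A); it just shows that truncating the entered factor to an integer reproduces both the plateau across 1.0-1.9 and the jump at 2.0.

```python
# Sketch of the suspected bug: the entered calibration factor being
# truncated to an integer somewhere (e.g. via int() or an integer
# register write) before it reaches the sensor.
# Hypothetical lookup: reported amps observed at each whole-number factor.
observed = {1: 32.3, 2: 57.7, 3: 83.1}

def reading_for(entered_factor):
    return observed[int(entered_factor)]  # decimals silently dropped

# Every factor in [1.0, 2.0) collapses onto the factor-1 reading,
# matching the table above:
for f in (1, 1.59, 1.8, 1.82):
    print(f, "->", reading_for(f))   # all -> 32.3
print(2.0, "->", reading_for(2.0))   # jump to 57.7
```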

What do you think? Could this be the case? If so, can NCD provide a fix by updating the temp branch?