measuring voltage of sensors
Posted: Sat Sep 29, 2018 10:25 pm
I would imagine others have thought of this, but I hadn't seen it suggested anywhere, so I thought I would pass it on to others.
There are instructions for how to calibrate the data logger to a reference voltage for a sensor using a high quality volt meter. I have a better solution, and you don't need anything you shouldn't already have to do it.
Say, for example, you want to calibrate a throttle position sensor for the min and max values of a linear scale display (you can also do this for mapping, of course). Hook your sensor up to the analog channel you intend to use for logging that sensor.
Then, when you prepare to set the channel up, first turn your data logger into a high quality, highly accurate volt meter. In the settings for that channel, simply select "volts" for units and "raw data" on the first page, then set whatever precision you want.
Then bring up dash mode, select the channel you have named for your sensor, and directly measure and record the precise voltage displayed at min and max - and at any other positions you want to measure.
Then go back into the analog channel settings, input the values you just recorded, change units to whatever you need (such as % for TPS), change raw data to linear or map as needed, and set any other parameters just as you otherwise would. I set up several analog sensors like this, and will be setting up additional sensors in the same manner in the near future. I found it much easier than using a multi-meter, and I didn't have to worry about my medium quality multi-meter being accurate enough for what I was doing.
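If it helps to see what that linear setting is doing with the min and max voltages you recorded, here is a rough sketch in Python of the scaling math. The 0.48 V and 4.52 V figures are purely hypothetical readings you might have written down from dash mode, not values from any particular sensor or logger.

# Hypothetical example: map a TPS voltage to percent throttle using the
# min/max voltages recorded from the data logger's dash mode display.

V_MIN = 0.48   # volts read at closed throttle (example value, not real data)
V_MAX = 4.52   # volts read at wide-open throttle (example value, not real data)

def tps_percent(volts: float) -> float:
    """Linearly scale a measured voltage to 0-100% throttle position."""
    pct = (volts - V_MIN) / (V_MAX - V_MIN) * 100.0
    # Clamp to 0-100% in case a reading drifts slightly past the
    # calibrated end points.
    return max(0.0, min(100.0, pct))

if __name__ == "__main__":
    for v in (0.48, 1.5, 2.5, 4.52):
        print(f"{v:.2f} V -> {tps_percent(v):.1f} %")

Running that prints 0 % at the recorded minimum voltage, 100 % at the recorded maximum, and 50 % at the halfway point, which is exactly what the logger's linear conversion gives you once you enter the two values.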
You could even take this one step further if you like: take your multi-meter and compare its readings to the data logger values. My bet is that the data logger is very accurate, so if there is any discrepancy, it's probably the multi-meter (unless you know the multi-meter has recently been calibrated and is accurate), and then you know, or at least have a good idea of, the accuracy of your multi-meter.