Let me explain the actual problem.
I am sensing a liquid level whose measured value is affected by changes in ambient temperature. The sensor is calibrated at a particular temperature, say TC, but the measured value deviates as the temperature changes. My aim is to find the error in the measurement caused by the temperature change and subtract it from the measured value. For this purpose I measure both the level and the temperature.
To observe the behavior with respect to temperature, I held the actual level constant, varied the temperature over the range of interest, and noted the measured value at each temperature. The measured value varies linearly with temperature at that level. I repeated the experiment at different levels and found that the slope of the straight line differs: the slope decreases as the level increases.
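To make the experiment concrete, here is a minimal sketch of how the per-level slopes could be extracted from such calibration runs. All names and numbers (TC = 25, the levels, the slopes) are illustrative assumptions, not my actual data:

```python
import numpy as np

TC = 25.0  # assumed calibration temperature (illustrative)
temps = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # temperature sweep

# Synthetic calibration data: measured value at each fixed true level,
# following measured = level + slope(level) * (T - TC),
# with the slope decreasing as the level increases.
calibration = {
    100.0: 100.0 + 0.50 * (temps - TC),
    200.0: 200.0 + 0.35 * (temps - TC),
    300.0: 300.0 + 0.20 * (temps - TC),
}

# Fit a straight line (measured value vs. temperature) for each level;
# the fitted slope is the temperature sensitivity at that level.
levels = np.array(sorted(calibration))
slopes = np.array([np.polyfit(temps, calibration[lvl], 1)[0] for lvl in levels])

for lvl, s in zip(levels, slopes):
    print(f"level {lvl:.0f}: slope {s:.3f}")
```

The pairs (level, slope) produced this way form the lookup table that any compensation scheme would interpolate at run time.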
In the real application I will measure both the level and the ambient temperature, from which I can calculate the temperature's deviation from TC. I then need to know the error caused by this deviation at that particular level. If the slopes from the experiments above were all the same, the correction would be a straightforward calculation, but the slope changes from level to level.
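One complication worth noting: the slope depends on the true level, but only the uncorrected measured level is available at run time. A common workaround is to interpolate the slope at the measured level and refine by fixed-point iteration. A sketch under assumed calibration numbers (not my real data):

```python
import numpy as np

TC = 25.0                                     # assumed calibration temperature
cal_levels = np.array([100.0, 200.0, 300.0])  # true levels used in calibration
cal_slopes = np.array([0.50, 0.35, 0.20])     # fitted slope at each level

def compensate(measured, temperature, iterations=3):
    """Subtract the temperature-induced error from a raw level reading.

    The slope depends on the (unknown) true level, so start by looking it
    up at the measured level, then refine with the corrected estimate.
    """
    level = measured
    for _ in range(iterations):
        slope = np.interp(level, cal_levels, cal_slopes)
        level = measured - slope * (temperature - TC)
    return level

# Example: a true level of 250 read at 45 deg C
true_slope = np.interp(250.0, cal_levels, cal_slopes)
raw = 250.0 + true_slope * (45.0 - TC)
corrected = compensate(raw, 45.0)
print(corrected)  # converges back toward 250
```

With a slope table this smooth the iteration settles within a few passes; whether that holds in practice depends on how strongly the real slope varies with level.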
Hope the issue is clear.
What would be a practical solution for this problem?