I will explain the actual problem.
I am sensing a liquid level whose measured value is affected by changes in ambient temperature. The sensor is calibrated at a particular temperature, say TC, but the measured value deviates as the temperature moves away from it. My aim is to find the error in the measurement caused by the temperature change and subtract it from the measured value. For this purpose I measure both the level and the temperature.
To observe the behavior with respect to temperature, I held the actual level constant, varied the temperature over the range of interest, and noted the measured value at each temperature. The measured value varies linearly with temperature for that level. I repeated the experiment at different levels and found that the slope of the straight line differs from level to level: the slope decreases as the level increases.
In the real application I will measure both the level and the ambient temperature, from which I can calculate the temperature's deviation from TC. I then need to know the error caused by this temperature change at that particular level. If the slopes from the experiments above were all the same, this would be a straightforward calculation; but the slope also changes with level.
I hope the issue is clear. What would be a practical solution to this problem?
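One practical approach, sketched below under assumptions not stated in the question: store the slope (error per degree) measured at each calibrated level, interpolate the slope for levels in between, and subtract slope × (T − TC) from the reading. Since the slope depends on the true level, which is unknown, the sketch starts from the measured level and iterates the correction a few times. The calibration numbers and TC here are hypothetical placeholders.

```python
T_C = 25.0  # calibration temperature (assumed value)

# Hypothetical calibration data: at each tested level, the slope of the
# measurement error vs. temperature. Slope decreases with level, as observed.
cal_levels = [10.0, 30.0, 50.0, 70.0, 90.0]
cal_slopes = [5.0, 4.0, 3.0, 2.0, 1.0]

def slope_at(level):
    """Linearly interpolate the calibration slope for an arbitrary level."""
    if level <= cal_levels[0]:
        return cal_slopes[0]
    if level >= cal_levels[-1]:
        return cal_slopes[-1]
    for i in range(len(cal_levels) - 1):
        l0, l1 = cal_levels[i], cal_levels[i + 1]
        if l0 <= level <= l1:
            s0, s1 = cal_slopes[i], cal_slopes[i + 1]
            return s0 + (s1 - s0) * (level - l0) / (l1 - l0)

def compensate(measured_level, temperature, iterations=3):
    """Subtract the temperature-induced error from the measured level.

    The slope depends on the true level, which we don't know yet, so we
    use the measured level as a first guess and refine it iteratively.
    """
    level = measured_level
    for _ in range(iterations):
        error = slope_at(level) * (temperature - T_C)
        level = measured_level - error
    return level
```

For example, if the true level were 40 (slope 3.5) and the temperature 30, the raw reading would be 40 + 3.5 × 5 = 57.5; a few iterations of `compensate(57.5, 30.0)` bring the estimate back close to 40.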
I have a problem here with linear equations.
I have a set of straight lines whose slopes vary linearly from 1 to 5 and whose y-intercepts vary from 0 to 100.
I have a point (x, y) lying on one of these lines, so its slope is between 1 and 5 and its y-intercept is between 0 and 100. Is there a method to find the equation of the line passing through (x, y), from which I can find that line's y-intercept?
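If the slope m is a linear function of the intercept b (which the stated ranges suggest: m runs 1 to 5 as b runs 0 to 100, so m = 1 + 0.04·b, an assumption on my part), then a single point does pin down the line: substituting into y = m·x + b gives y = (m0 + k·b)·x + b, which solves to b = (y − m0·x) / (1 + k·x). A minimal sketch:

```python
# Family of lines where slope is assumed linear in the intercept:
#   m = M0 + K * b, with M0 = 1 and K = (5 - 1) / 100 from the stated ranges.
M0 = 1.0
K = (5.0 - 1.0) / 100.0  # 0.04

def intercept_through(x, y):
    """Intercept b of the family line passing through (x, y).

    From y = (M0 + K*b)*x + b, solve for b:
        y - M0*x = b * (K*x + 1)  =>  b = (y - M0*x) / (1 + K*x)
    """
    return (y - M0 * x) / (1.0 + K * x)

def slope_through(x, y):
    """Slope of the same line, via the assumed linear coupling."""
    return M0 + K * intercept_through(x, y)
```

As a check: the line with b = 50 has m = 1 + 0.04 × 50 = 3, so it passes through (10, 80); the formula recovers b = (80 − 10) / (1 + 0.4) = 50. If the slope-intercept relationship in your data is not linear, the same substitution still works with whatever function m = f(b) you fit, though it may then need a numerical root-finder instead of a closed form.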