I have been given a problem where someone is driving a car whose speedometer is off by some constant c (e.g. if the speedometer shows 30, but the true speed is 45, then c=15)
They log their n drives, each time noting the distance traveled and the speedometer reading during that drive. (Assume the speed was constant throughout each drive.)
I am given the total time t driven over all the drives, but not the time for each individual drive.
I am supposed to find the error c given n, t, and d1, d2, ..., dn and s1, s2, ..., sn, where di is the distance traveled during drive i and si is the speedometer reading on that drive.
So far I've found that t = [d1/(s1+c)] + [d2/(s2+c)] + ... + [dn/(sn+c)], since each drive's time is its distance divided by its true speed si + c.
I'm wondering how to algebraically solve this equation for c
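If an algebraic solution isn't practical (clearing denominators gives a degree-n polynomial in c), one thought I had was to solve it numerically: the right-hand side is strictly decreasing in c wherever every si + c > 0, so there is at most one root and bisection should find it. A rough sketch of that idea (the sample d, s, t values are made up for illustration):

```python
def total_time(c, d, s):
    # Sum of per-drive times: distance divided by true speed (s_i + c).
    return sum(di / (si + c) for di, si in zip(d, s))

def solve_c(t, d, s, tol=1e-12):
    # total_time is strictly decreasing in c, so bracket the root:
    # near c = -min(s) the sum blows up, and it tends to 0 as c grows.
    lo = -min(s) + 1e-9          # keep every true speed s_i + c positive
    hi = 1.0
    while total_time(hi, d, s) > t:
        hi *= 2.0                # expand until total_time(hi) <= t
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total_time(mid, d, s) > t:
            lo = mid             # c is larger than mid
        else:
            hi = mid
    return (lo + hi) / 2

# Made-up check: two drives at shown speed 30 with c = 15 (true speed 45),
# distances 45 and 90, so the times are 1 and 2 hours, t = 3.
print(solve_c(3.0, [45, 90], [30, 30]))  # should be close to 15
```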
Any help is appreciated (or, if there is a better way to approach this problem, pointers in that direction would be appreciated too).