A given sensor has a specified linearity error of 1 % of the reading plus 0.1 % of the full-scale output (FSO). A second sensor with the same measurement range has a specified error of 0.5 % of the reading plus 0.2 % of FSO. For what range of values is the first sensor more accurate than the second one? If the second sensor had a measurement range twice that of the first one, for what range of values would it be the more accurate one?
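
As a sketch of the comparison, assume the two error contributions simply add, and let $x$ denote the reading and $\mathrm{FSO}$ the shared full-scale output. The first sensor is more accurate wherever its total error is the smaller of the two:

$$
0.01\,x + 0.001\,\mathrm{FSO} < 0.005\,x + 0.002\,\mathrm{FSO}
\;\Longrightarrow\;
x < 0.2\,\mathrm{FSO},
$$

that is, for readings below 20 % of full scale. If the second sensor's range is doubled, its full-scale term becomes $0.002 \times 2\,\mathrm{FSO} = 0.004\,\mathrm{FSO}$ referred to the first sensor's scale, and the same comparison gives

$$
0.005\,x + 0.004\,\mathrm{FSO} < 0.01\,x + 0.001\,\mathrm{FSO}
\;\Longrightarrow\;
x > 0.6\,\mathrm{FSO},
$$

so the second sensor would then be the more accurate one only for readings above 60 % of the first sensor's full scale.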