Let a, b, and c be three distinct one-digit integers. What is the maximum value of the sum of the roots of the equation (x-a)(x-b) + (x-b)(x-c) = 0?
\((x-a)(x-b)+(x-b)(x-c)=(x-b)(2x-a-c)=0\) gives the roots \(x=b\) and \(x=\frac{a+c}{2}\), so the sum of the roots is \(b+\frac{a+c}{2}\). This is a linear function of a, b, and c that needs to be maximized under constraints that we can write as
\(-9\leq a \leq 9\),
\(-9 \leq b \leq 9\), and
\(-9 \leq c \leq 9\),
since a, b, and c are one-digit integers. This is essentially a linear programming problem, and it is known that the maximum of a linear function of a, b, and c occurs at a corner point of the feasible region, here described by the above inequalities. The region is a cube with corner points (-9, -9, -9), (-9, -9, 9), (-9, 9, -9), (9, -9, -9), (9, 9, -9), (9, -9, 9), (-9, 9, 9), and (9, 9, 9). The unconstrained maximum would occur at (9, 9, 9), but a, b, and c must be distinct, so we use the three largest one-digit integers: 7, 8, and 9. Since b has coefficient 1 while a and c each have coefficient \(\frac{1}{2}\), the largest value of \(b+\frac{a+c}{2}\) is attained for b = 9 and {a, c} = {7, 8}, i.e. a = 7, b = 9, c = 8 or a = 8, b = 9, c = 7, and it equals \(\frac{15}{2}+9=16.5\).
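If you want to double-check the result, here is a quick brute-force sketch in Python (not part of the argument above, just a sanity check) that enumerates all ordered triples of distinct one-digit integers and maximizes the sum of the roots \(b+\frac{a+c}{2}\):

```python
from itertools import permutations

# Enumerate all ordered triples (a, b, c) of distinct one-digit integers
# (-9..9) and maximize the sum of the roots, b + (a + c) / 2.
best_value = float("-inf")
best_triples = []

for a, b, c in permutations(range(-9, 10), 3):
    root_sum = b + (a + c) / 2  # roots are x = b and x = (a + c) / 2
    if root_sum > best_value:
        best_value = root_sum
        best_triples = [(a, b, c)]
    elif root_sum == best_value:
        best_triples.append((a, b, c))

print(best_value)    # 16.5
print(best_triples)  # [(7, 9, 8), (8, 9, 7)]
```

The output confirms that the maximum is 16.5, attained exactly at (a, b, c) = (7, 9, 8) and (8, 9, 7).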