If f(x) is a function defined only for 0 ≤ x ≤ 1, and f(x) = ax + b for constants a and b where a < 0, then what is the range of f in terms of a and b? Express your answer in interval notation.
Since a < 0, f(x) = ax + b is a decreasing function of x, so over 0 ≤ x ≤ 1:
The largest value of f(x) occurs at x = 0, where f(0) = b.
The smallest value of f(x) occurs at x = 1, where f(1) = a + b.
Because f is linear (hence continuous), it takes every value between a + b and b, so the range is:
[a + b, b]
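As a quick numerical sanity check of this answer, here is a short sketch using hypothetical example values a = -2 and b = 3 (so the claimed range is [a + b, b] = [1, 3]); it samples f on [0, 1] and reports the smallest and largest values seen.

```python
# Sanity-check the range of f(x) = a*x + b on [0, 1] for a < 0.
# Example values a = -2, b = 3 are assumptions chosen for illustration;
# the claimed range is then [a + b, b] = [1, 3].
a, b = -2, 3

# Sample f at 1001 evenly spaced points in [0, 1].
values = [a * (k / 1000) + b for k in range(1001)]

print(min(values), max(values))  # smallest and largest sampled values
```

The minimum occurs at the sample x = 1 (giving a + b) and the maximum at x = 0 (giving b), matching the interval above.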