What first degree polynomial added to 3x³ − 2x² gives a polynomial for which both x − 1 and x + 1 are factors?
Let the first degree polynomial that has to be added to the polynomial 3x³ − 2x² be ax + b.
So, the polynomial is 3x³ − 2x² + ax + b.
First, check for the factor (x − 1).
The factor theorem says that for a polynomial p(x) and a number c, if p(c) = 0, then (x − c) is a factor of p(x).
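As a quick illustration of the theorem (a sketch, assuming sympy is available; the sample polynomial x² − 1 is not from the problem): since p(1) = 0 for p(x) = x² − 1, dividing by (x − 1) leaves zero remainder.

```python
from sympy import symbols, div

x = symbols('x')
p = x**2 - 1                             # sample polynomial with p(1) = 0

quotient, remainder = div(p, x - 1, x)   # divide p by (x - 1)
print(quotient, remainder)               # x + 1  0  -> zero remainder, so (x - 1) is a factor
```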
Thus, we must have:
3(1)³ − 2(1)² + a(1) + b = 0 for (x − 1) to be a factor of 3x³ − 2x² + ax + b.
⇒ 3 − 2 + a + b = 0
⇒ a + b = −1 …(1)
Similarly, we check for (x + 1).
(x + 1) = {x − (−1)}, so the factor theorem applies here with c = −1.
We must have:
3(−1)³ − 2(−1)² + a(−1) + b = 0 for (x + 1) to be a factor of 3x³ − 2x² + ax + b.
⇒ −3 − 2 − a + b = 0
⇒ b − a = 5 …(2)
Adding (1) and (2):
2b = 4
⇒ b = 2
Substituting b = 2 in (1): a + 2 = −1
⇒ a = −3
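As a cross-check on this arithmetic, equations (1) and (2) can also be solved symbolically; the sketch below assumes sympy and is only a verification, not part of the hand solution.

```python
from sympy import symbols, Eq, solve

a, b = symbols('a b')
# equation (1): a + b = -1, equation (2): b - a = 5
solution = solve([Eq(a + b, -1), Eq(b - a, 5)], [a, b])
print(solution)   # {a: -3, b: 2}
```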
Therefore, the first degree polynomial that has to be added to 3x³ − 2x² is −3x + 2, so that the resulting polynomial, 3x³ − 2x² − 3x + 2, has both (x − 1) and (x + 1) as factors.
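Finally, a minimal check (again assuming sympy) that the resulting polynomial 3x³ − 2x² − 3x + 2 really has both (x − 1) and (x + 1) as factors:

```python
from sympy import symbols, factor

x = symbols('x')
p = 3*x**3 - 2*x**2 - 3*x + 2        # 3x^3 - 2x^2 with -3x + 2 added

print(p.subs(x, 1), p.subs(x, -1))   # 0 0  -> x = 1 and x = -1 are both roots
print(factor(p))                     # (x - 1)*(x + 1)*(3*x - 2)
```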