
# Algebraic Identities


Let $a$ and $b$ with $a > b > 0$ be real numbers satisfying $a^3 + b^3 = a + b$. Find $\frac{(a + b)^2 - 1}{ab}$.

Jun 21, 2021

#1

Let a and b with a > b > 0 be real numbers satisfying $$a^3 + b^3 = a + b$$.
Find $$\dfrac{(a + b)^2 - 1}{ab}$$.

$$\text{Formula: } \boxed{a^3+b^3=(a+b)(a^2-ab+b^2)}$$

$$\begin{array}{|rcll|} \hline a^3+b^3 &=& (a+b)(a^2-ab+b^2) \quad | \quad a^3 + b^3 = a + b \\ (a+b) &=& (a+b)(a^2-ab+b^2) \quad | \quad : (a+b), \ \text{valid since } a+b>0 \\ a^2-ab+b^2 &=& \dfrac{a+b}{a+b} \\ a^2-ab+b^2 &=& 1 \\ a^2+b^2-ab &=& 1 \quad | \quad a^2+b^2=(a+b)^2-2ab \\ (a+b)^2-2ab-ab &=& 1 \\ (a+b)^2-3ab &=& 1 \\ (a+b)^2-1 &=& 3ab \quad | \quad :ab \\\\ \mathbf{ \dfrac{ (a+b)^2-1 }{ab} } &=& \mathbf{ 3 } \\ \hline \end{array}$$
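A quick numerical sanity check of the result. This sketch (not part of the original answer) parametrizes valid pairs by picking $b \in (0,1)$ and solving $a^2 - ab + b^2 = 1$ for the larger root $a$, which by the derivation above is equivalent to $a^3 + b^3 = a + b$ with $a > b > 0$; the quotient should come out to 3 every time.

```python
import math

def quotient(b):
    # Solve a^2 - a*b + (b^2 - 1) = 0 for the larger root a.
    # By the factoring above, a^2 - ab + b^2 = 1 forces a^3 + b^3 = a + b.
    a = (b + math.sqrt(4 - 3 * b**2)) / 2
    assert a > b > 0                               # required ordering
    assert math.isclose(a**3 + b**3, a + b)        # the given constraint
    return ((a + b)**2 - 1) / (a * b)

for b in (0.2, 0.5, 0.9):
    print(round(quotient(b), 10))
```

Any $b$ in $(0,1)$ works here, since $a > b$ requires $\sqrt{4 - 3b^2} > b$, i.e. $b < 1$.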
