Consider the ellipse $\frac{x^2}{a^2} + \frac{y^2}{b^2} = 1$, where $a > b > 0$. As a function of $a$ and $b$, find the radius of the smallest circle that contains the ellipse, is centered on the $y$-axis, and which intersects the ellipse only at $(0, b)$.

 Aug 13, 2022
 #1

The answer, for anyone interested:

Let $r$ be the radius of the circle. Then the center of the circle is $(0, b - r)$, so the equation of the circle is $x^2 + (y - (b - r))^2 = r^2$.
Solving for $x^2$ in $\frac{x^2}{a^2} + \frac{y^2}{b^2} = 1$, we get $x^2 = \frac{a^2(b^2 - y^2)}{b^2}$.
Substituting this into $x^2 + (y - (b - r))^2 = r^2$, we get $\frac{a^2(b^2 - y^2)}{b^2} + (y - (b - r))^2 = r^2$.
Clearing the denominator and collecting terms in $y$, this simplifies to $(a^2 - b^2)y^2 + (2b^3 - 2b^2 r)y + (2b^3 r - a^2 b^2 - b^4) = 0$.
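To double-check that algebra, here is a quick SymPy sketch (the symbol names follow the problem; the check itself is mine, not part of the original solution). It confirms that the substituted equation, once cleared of its denominator, is exactly the quadratic above up to an overall sign:

from sympy import symbols, expand

a, b, r, y = symbols('a b r y', positive=True)

# The substituted equation, moved to one side of the equals sign.
lhs = a**2*(b**2 - y**2)/b**2 + (y - (b - r))**2 - r**2

# The quadratic claimed above.
quadratic = (a**2 - b**2)*y**2 + (2*b**3 - 2*b**2*r)*y + (2*b**3*r - a**2*b**2 - b**4)

# b^2 * lhs equals -quadratic, so the sum expands to zero.
print(expand(b**2*lhs + quadratic))   # prints 0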
Since the ellipse and circle intersect at $(0, b)$, $y = b$ will be a root of the quadratic above. Thus, we can take out a factor of $y - b$, which gives us $(y - b)\left((a^2 - b^2)y + a^2 b + b^3 - 2b^2 r\right) = 0$.
The other root of $(y - b)\left((a^2 - b^2)y + a^2 b + b^3 - 2b^2 r\right) = 0$ is $y_1 = \frac{2b^2 r - a^2 b - b^3}{a^2 - b^2}$.
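The factorization and its roots can be checked the same way (again a SymPy sketch of my own, using the same symbols):

from sympy import symbols, expand, solve

a, b, r, y = symbols('a b r y', positive=True)

quadratic = (a**2 - b**2)*y**2 + (2*b**3 - 2*b**2*r)*y + (2*b**3*r - a**2*b**2 - b**4)
factored = (y - b)*((a**2 - b**2)*y + a**2*b + b**3 - 2*b**2*r)

print(expand(quadratic - factored))   # prints 0, so the factorization holds
print(solve(factored, y))             # the two roots: y = b and y = y1 as above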
If $y_1 < b$, then the ellipse and circle also intersect at a point whose $y$-coordinate is $y_1$ (or else the circle does not contain the ellipse). So we need $y_1 = \frac{2b^2 r - a^2 b - b^3}{a^2 - b^2} \ge b$.
Isolating $r$ (the inequality's direction is preserved, since $a^2 - b^2 > 0$), we find $r \ge \frac{a^2}{b}$.
Therefore, the smallest possible radius is $\frac{a^2}{b}$.
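As a final sanity check, substituting $r = \frac{a^2}{b}$ back into the quadratic should make $y = b$ a double root, i.e. the circle is tangent to the ellipse at $(0, b)$ and meets it nowhere else. A short SymPy sketch (the values $a = 2$, $b = 1$ are just an example of mine):

from sympy import symbols, factor

a, b, r, y = symbols('a b r y', positive=True)

quadratic = (a**2 - b**2)*y**2 + (2*b**3 - 2*b**2*r)*y + (2*b**3*r - a**2*b**2 - b**4)

# With r = a^2/b the quadratic collapses to (a^2 - b^2)*(y - b)**2:
print(factor(quadratic.subs(r, a**2/b)))   # (a - b)*(a + b)*(y - b)**2, up to ordering

# Concrete example: a = 2, b = 1 gives r = 4, a circle centered at (0, -3).
print((a**2/b).subs({a: 2, b: 1}))         # prints 4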

 Aug 18, 2022
edited by WorldEndSymphony  Aug 18, 2022
