
The least integral value of 'a' for which the equation x^2 - 2(a - 1)x + (2a + 1) = 0 has both the roots positive, is

A. 3
B. 4
C. 1
D. 5

Text Solution
Generated By DoubtnutGPT

The correct answer is: B


To find the least integral value of a for which the equation

x^2 - 2(a - 1)x + (2a + 1) = 0

has both roots positive, we will follow these steps:

Step 1: Identify the conditions for positive roots
For the quadratic equation x^2 + bx + c = 0 (leading coefficient 1, as in our case) to have both roots positive, we need to satisfy three conditions (a quick numerical check is sketched after the list):
1. The discriminant D must be greater than or equal to 0.
2. The x-coordinate of the vertex, -b/2, must be positive.
3. The value of the function at x = 0, which is c, must be positive.
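These conditions can be sanity-checked numerically. Below is a minimal Python sketch; the helper name both_roots_positive and the sample values of a are illustrative, not part of the original solution.

    def both_roots_positive(b, c):
        # Conditions for the monic quadratic x^2 + b*x + c = 0
        # to have two (possibly equal) positive real roots.
        disc = b * b - 4 * c   # discriminant must be >= 0
        vertex = -b / 2        # x-coordinate of the vertex must be > 0
        f0 = c                 # f(0) must be > 0
        return disc >= 0 and vertex > 0 and f0 > 0

    # For x^2 - 2(a-1)x + (2a+1) = 0: b = -2(a-1), c = 2a+1
    for a in (0, 1, 3, 4, 5):
        print(a, both_roots_positive(-2 * (a - 1), 2 * a + 1))
    # Only a = 4 and a = 5 print True.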

Step 2: Calculate the discriminant
The discriminant D is given by:

D = b^2 - 4ac

For our equation, the leading coefficient is 1, b = -2(a - 1), and c = 2a + 1 (here 'a' denotes the parameter from the question, not the leading coefficient).

Calculating D:

D = (-2(a - 1))^2 - 4 * 1 * (2a + 1)
D = 4(a - 1)^2 - 4(2a + 1)
D = 4((a - 1)^2 - (2a + 1))
D = 4(a^2 - 2a + 1 - 2a - 1)
D = 4(a^2 - 4a)
D = 4a(a - 4)

For the roots to be real, we need D >= 0:

4a(a - 4) >= 0

This gives us the intervals:

a <= 0 or a >= 4
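As a cross-check, the factorisation and the resulting inequality can be reproduced symbolically. This is a sketch assuming SymPy is available; it is not part of the original solution.

    import sympy as sp

    a = sp.symbols('a', real=True)
    D = (2 * (a - 1))**2 - 4 * (2 * a + 1)   # discriminant of x^2 - 2(a-1)x + (2a+1)
    print(sp.factor(D))                      # 4*a*(a - 4)
    # Solution set of D >= 0: the union of (-oo, 0] and [4, oo), i.e. a <= 0 or a >= 4
    print(sp.solve_univariate_inequality(D >= 0, a, relational=False))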

Step 3: Find the x-coordinate of the vertex
The x-coordinate of the vertex of a quadratic equation is given by:

x = -b/(2a)

which for our monic equation is -b/2. Here this becomes:

x = -(-2(a - 1))/2 = a - 1

For the vertex to be positive:

a - 1 > 0, i.e. a > 1

Step 4: Evaluate f(0)
Next, we need to ensure that f(0) > 0:

f(0) = 2a + 1 > 0
2a > -1, i.e. a > -1/2

Step 5: Combine the conditions
Now we combine the conditions derived from the three steps:

1. From the discriminant: a <= 0 or a >= 4
2. From the vertex: a > 1
3. From f(0): a > -1/2

The only range that satisfies all three conditions is:

a >= 4
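The intersection of the three conditions can also be checked with interval arithmetic (a minimal sketch using SymPy; illustrative only):

    import sympy as sp

    disc_ok   = sp.Union(sp.Interval(-sp.oo, 0), sp.Interval(4, sp.oo))  # a <= 0 or a >= 4
    vertex_ok = sp.Interval.open(1, sp.oo)                               # a > 1
    f0_ok     = sp.Interval.open(sp.Rational(-1, 2), sp.oo)              # a > -1/2
    print(disc_ok.intersect(vertex_ok).intersect(f0_ok))                 # Interval(4, oo)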

Conclusion
The least integral value of a that satisfies all conditions is:

4
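The answer can be verified by computing the roots directly. The sketch below is a small check, not part of the original solution:

    import math

    def roots(a):
        # Roots of x^2 - 2(a-1)x + (2a+1) = 0 via the quadratic formula.
        b = -2 * (a - 1)
        c = 2 * a + 1
        disc = b * b - 4 * c
        if disc < 0:
            return None            # complex roots, so they cannot both be positive reals
        r = math.sqrt(disc)
        return ((-b - r) / 2, (-b + r) / 2)

    print(roots(3))   # None: no real roots, so a = 3 fails
    print(roots(4))   # (3.0, 3.0): both roots positive, so a = 4 works
    print(roots(5))   # approximately (1.76, 6.24): both positive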
