Show that the equation 2(a^2 + b^2)x^2 + 2(a + b)x + 1 = 0 has no real roots when a ≠ b.

Solution

The discriminant of 2(a^2 + b^2)x^2 + 2(a + b)x + 1 = 0 is

D = [2(a + b)]^2 - 4 · 2(a^2 + b^2) · 1
  = 4(a + b)^2 - 8(a^2 + b^2)
  = 4{(a + b)^2 - 2(a^2 + b^2)}
  = 4(-a^2 - b^2 + 2ab)
  = -4(a^2 + b^2 - 2ab)
  = -4(a - b)^2.

Since a ≠ b, we have (a - b)^2 > 0, so D < 0 and the equation has no real roots.
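As a sanity check, here is a minimal SymPy sketch (assuming SymPy is installed; the variable names are illustrative, not from the source) that recovers the same discriminant symbolically:

```python
# Minimal sketch, assuming SymPy is available: verify D = -4*(a - b)**2.
import sympy as sp

a, b, x = sp.symbols('a b x', real=True)
quadratic = 2*(a**2 + b**2)*x**2 + 2*(a + b)*x + 1

# For a quadratic A*x**2 + B*x + C in x, discriminant() returns B**2 - 4*A*C.
D = sp.discriminant(quadratic, x)
print(sp.factor(D))  # -4*(a - b)**2, negative whenever a != b
```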



Knowledge Check

(Brief discriminant checks for each question follow the list.)

  • Suppose a, b ∈ R. If the equation x^2 - (2a + b)x + (2a^2 + b^2 - b + 1/2) = 0 has two real roots, then

    A. a = 1/2, b = 1
    B. a = 2, b = 1
    C. a = 2, b = 1
    D. a = 1/2, b = 1
  • If a, b ∈ R and the equation x^2 + (a - b)x - a - b + 1 = 0 has real roots for all b ∈ R, then a lies in the interval

    A. (1, ∞)
    B. (0, ∞)
    C. (-∞, 1)
    D. (-1, 1)
  • For non-zero, distinct real numbers a, b, c, the equation (a^2 + b^2)x^2 - 2b(a + c)x + b^2 + c^2 = 0 has non-zero real roots. One of these roots is also a root of the equation:

    A. (b^2 - c^2)x^2 + 2a(b - c)x - a^2 = 0
    B. (b^2 + c^2)x^2 - 2a(b + c)x + a^2 = 0
    C. a^2x^2 + a(c - b)x - bc = 0
    D. a^2x^2 - a(b - c)x + bc = 0
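Check for the first question (based on the statement as reconstructed above): two real roots require

    (2a + b)^2 - 4(2a^2 + b^2 - b + 1/2) ≥ 0
    ⇔ 4a^2 - 4ab + 3b^2 - 4b + 2 ≤ 0
    ⇔ (2a - b)^2 + 2(b - 1)^2 ≤ 0,

which forces b = 1 and 2a = b, i.e. a = 1/2 (option A).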
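Check for the second question: the discriminant in x is (a - b)^2 - 4(-a - b + 1) = b^2 + (4 - 2a)b + (a^2 + 4a - 4), an upward-opening quadratic in b. It is nonnegative for every b exactly when its own discriminant is nonpositive:

    (4 - 2a)^2 - 4(a^2 + 4a - 4) ≤ 0 ⇔ 32 - 32a ≤ 0 ⇔ a ≥ 1,

which matches option A, (1, ∞), among the listed intervals.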
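Check for the third question: the discriminant is

    4b^2(a + c)^2 - 4(a^2 + b^2)(b^2 + c^2) = -4(b^2 - ac)^2 ≤ 0,

so real roots force b^2 = ac, and the double root is x = b(a + c)/(a^2 + b^2) = b(a + c)/(a(a + c)) = b/a. Substituting x = b/a into option C gives a^2(b/a)^2 + a(c - b)(b/a) - bc = b^2 + bc - b^2 - bc = 0, so option C shares this root.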