
Show that if \(L[y]=x^{2} y^{\prime \prime}+\alpha x y^{\prime}+\beta y,\) then $$ L\left[(-x)^{r}\right]=(-x)^{r} F(r) $$ for all \(x<0,\) where \(F(r)=r(r-1)+\alpha r+\beta .\) Hence conclude that if \(r_{1} \neq r_{2}\) are roots of \(F(r)=0,\) then linearly independent solutions of \(L[y]=0\) for \(x<0\) are \((-x)^{r_{1}}\) and \((-x)^{r_{2}}\)

Short Answer

Question: Show that if \(L[y]=x^{2} y^{\prime \prime}+\alpha x y^{\prime}+\beta y\), then \(L[(-x)^r] = (-x)^r F(r)\) for all \(x<0\), where \(F(r)=r(r-1)+\alpha r+\beta\), and conclude that if \(r_1 \neq r_2\) are roots of \(F(r)=0\), then \((-x)^{r_1}\) and \((-x)^{r_2}\) are linearly independent solutions of \(L[y]=0\) for \(x<0\). Solution: Direct substitution shows \(L[(-x)^r] = (-x)^r F(r)\), so \((-x)^{r_1}\) and \((-x)^{r_2}\) solve \(L[y]=0\) whenever \(F(r_1)=F(r_2)=0\); their Wronskian, \((r_1-r_2)(-x)^{r_1+r_2-1}\), is nonzero for \(x<0\), so they are linearly independent.

Step by step solution

01

Differentiate \((-x)^r\) twice

First, we find the first and second derivatives of \((-x)^r\) with respect to \(x\). By the chain rule, $$ \frac{d}{dx}\left[(-x)^r\right] = r(-x)^{r-1}(-1) = -r(-x)^{r-1}, $$ and $$ \frac{d^2}{dx^2}\left[(-x)^r\right] = r(r-1)(-x)^{r-2}(-1)^2 = r(r-1)(-x)^{r-2}. $$
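The chain-rule computation above can be spot-checked numerically. This is a quick sketch, not part of the proof; the sample values \(r = 1.7\) and \(x = -2.0\) are arbitrary choices.

```python
def f(x, r):
    """(-x)^r, defined for x < 0 and real r."""
    return (-x) ** r

r, x, h = 1.7, -2.0, 1e-6

# Central finite difference vs. the chain-rule formula -r(-x)^(r-1).
numeric = (f(x + h, r) - f(x - h, r)) / (2 * h)
analytic = -r * (-x) ** (r - 1)

assert abs(numeric - analytic) < 1e-5
```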
02

Plug the derivatives into \(L[y]\)

We now plug these derivatives into the given equation for \(L[y]\), which is: $$ L[y] = x^2 y'' + \alpha x y' + \beta y. $$ Substituting the derivatives, we get: $$ L[(-x)^r] = x^2 r(r-1)(-x)^{r-2}(-1)^2 + \alpha x r(-x)^{r-1}(-1) + \beta(-x)^r. $$
03

Simplify the expression

Now we simplify. Since \(x<0\), we have \(x^2 = (-x)^2\), so \(x^2(-x)^{r-2} = (-x)^r\); likewise \(x(-1) = -x\), so \(\alpha x\, r(-x)^{r-1}(-1) = \alpha r(-x)^r\). Hence $$ L[(-x)^r] = (-x)^r \left(r(r-1) + \alpha r + \beta\right) = (-x)^r F(r), $$ where \(F(r) = r(r-1) + \alpha r + \beta\), which is exactly the required identity.
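The identity \(L[(-x)^r] = (-x)^r F(r)\) can also be verified numerically with finite differences. A minimal sketch; the parameter values \(\alpha = 1.5\), \(\beta = -0.75\), \(r = 0.5\), \(x = -1.3\) are arbitrary samples chosen for illustration.

```python
def check(alpha, beta, r, x, h=1e-5):
    """Compare L[(-x)^r], computed by finite differences, with (-x)^r F(r)."""
    y = lambda t: (-t) ** r
    y1 = (y(x + h) - y(x - h)) / (2 * h)            # first derivative
    y2 = (y(x + h) - 2 * y(x) + y(x - h)) / h ** 2  # second derivative
    L = x ** 2 * y2 + alpha * x * y1 + beta * y(x)  # L[y] at the point x
    F = r * (r - 1) + alpha * r + beta              # F(r)
    return L, y(x) * F

lhs, rhs = check(alpha=1.5, beta=-0.75, r=0.5, x=-1.3)
assert abs(lhs - rhs) < 1e-4
```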
04

Determine the roots of \(F(r)=0\)

To find the roots \(r_1\) and \(r_2\), we solve the equation \(F(r) = 0\), that is, $$ r(r-1) + \alpha r + \beta = r^2 + (\alpha - 1)r + \beta = 0. $$
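Since \(F(r) = r(r-1) + \alpha r + \beta = r^2 + (\alpha - 1)r + \beta\) is a quadratic, its roots follow from the quadratic formula. A small sketch; the values \(\alpha = 4\), \(\beta = 2\) are illustrative only, not from the problem.

```python
import cmath  # handles complex roots as well as real ones

def indicial_roots(alpha, beta):
    """Roots of F(r) = r(r-1) + alpha*r + beta = r^2 + (alpha-1)*r + beta."""
    disc = cmath.sqrt((alpha - 1) ** 2 - 4 * beta)
    return ((1 - alpha) + disc) / 2, ((1 - alpha) - disc) / 2

# Sample: alpha = 4, beta = 2 gives r^2 + 3r + 2 = 0, with roots -1 and -2.
r1, r2 = indicial_roots(4.0, 2.0)
F = lambda r: r * (r - 1) + 4.0 * r + 2.0
assert abs(F(r1)) < 1e-12 and abs(F(r2)) < 1e-12
```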
05

Verify linearly independent solutions

If \(r_1\) and \(r_2\) are distinct roots of \(F(r) = 0\), then by the identity above \(L[(-x)^{r_i}] = (-x)^{r_i} F(r_i) = 0\), so both functions are solutions of \(L[y]=0\). To show that they are linearly independent for \(x<0\), we use the Wronskian. Using the derivatives from Step 1, $$ W\left((-x)^{r_1}, (-x)^{r_2}\right) = \begin{vmatrix} (-x)^{r_1} & (-x)^{r_2} \\ -r_1(-x)^{r_1 - 1} & -r_2(-x)^{r_2-1} \end{vmatrix}. $$ Expanding the determinant, $$ W\left((-x)^{r_1}, (-x)^{r_2}\right) = (-x)^{r_1}\left(-r_2(-x)^{r_2-1}\right) - (-x)^{r_2}\left(-r_1(-x)^{r_1-1}\right) = (r_1 - r_2)(-x)^{r_1 + r_2 - 1}. $$ Since \(r_1 \neq r_2\), we have \(r_1 - r_2 \neq 0\), and \((-x)^{r_1+r_2-1} \neq 0\) for \(x < 0\), so \(W((-x)^{r_1}, (-x)^{r_2}) \neq 0\). This means that \((-x)^{r_1}\) and \((-x)^{r_2}\) are linearly independent solutions of \(L[y]=0\) for \(x < 0\).
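The closed form \(W = (r_1 - r_2)(-x)^{r_1 + r_2 - 1}\) can be checked against the \(2\times 2\) determinant at a sample point. A sketch only; the values \(r_1 = 2\), \(r_2 = -1\), \(x = -0.5\) are arbitrary choices with \(x < 0\).

```python
r1, r2, x = 2.0, -1.0, -0.5   # sample distinct roots and a point with x < 0

y1, y2 = (-x) ** r1, (-x) ** r2                  # the two candidate solutions
d1, d2 = -r1 * (-x) ** (r1 - 1), -r2 * (-x) ** (r2 - 1)  # their derivatives

W_det = y1 * d2 - y2 * d1                        # 2x2 Wronskian determinant
W_closed = (r1 - r2) * (-x) ** (r1 + r2 - 1)     # closed form from the text

assert abs(W_det - W_closed) < 1e-12
assert W_det != 0  # nonzero Wronskian => linear independence
```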


Most popular questions from this chapter

It can be shown that \(J_{0}\) has infinitely many zeros for \(x>0 .\) In particular, the first three zeros are approximately \(2.405, 5.520,\) and \(8.653\) (see Figure 5.8.1). Let \(\lambda_{j}, j=1,2,3, \ldots,\) denote the zeros of \(J_{0} ;\) it follows that $$ J_{0}\left(\lambda_{j} x\right)=\begin{cases} 1, & x=0 \\ 0, & x=1 \end{cases} $$ Verify that \(y=J_{0}(\lambda_{j} x)\) satisfies the differential equation $$ y^{\prime \prime}+\frac{1}{x} y^{\prime}+\lambda_{j}^{2} y=0, \quad x>0 $$ Hence show that $$ \int_{0}^{1} x J_{0}\left(\lambda_{i} x\right) J_{0}\left(\lambda_{j} x\right) d x=0 \quad \text { if } \quad \lambda_{i} \neq \lambda_{j} $$ This important property of \(J_{0}\left(\lambda_{i} x\right),\) known as the orthogonality property, is useful in solving boundary value problems. Hint: Write the differential equation for \(J_{0}(\lambda_{i} x)\). Multiply it by \(x J_{0}\left(\lambda_{j} x\right)\) and subtract it from \(x J_{0}\left(\lambda_{i} x\right)\) times the differential equation for \(J_{0}(\lambda_{j} x)\). Then integrate from 0 to \(1 .\)

Show that the given differential equation has a regular singular point at \(x=0,\) and determine two linearly independent solutions for \(x>0 .\) $$ x^{2} y^{\prime \prime}+x y^{\prime}+2 x y=0 $$

The definitions of an ordinary point and a regular singular point given in the preceding sections apply only if the point \(x_{0}\) is finite. In more advanced work in differential equations it is often necessary to discuss the point at infinity. This is done by making the change of variable \(\xi=1 / x\) and studying the resulting equation at \(\xi=0 .\) Show that for the differential equation \(P(x) y^{\prime \prime}+Q(x) y^{\prime}+R(x) y=0\) the point at infinity is an ordinary point if $$ \frac{1}{P(1 / \xi)}\left[\frac{2 P(1 / \xi)}{\xi}-\frac{Q(1 / \xi)}{\xi^{2}}\right] \quad \text { and } \quad \frac{R(1 / \xi)}{\xi^{4} P(1 / \xi)} $$ have Taylor series expansions about \(\xi=0 .\) Show also that the point at infinity is a regular singular point if at least one of the above functions does not have a Taylor series expansion, but both $$ \frac{\xi}{P(1 / \xi)}\left[\frac{2 P(1 / \xi)}{\xi}-\frac{Q(1 / \xi)}{\xi^{2}}\right] \quad \text { and } \quad \frac{R(1 / \xi)}{\xi^{2} P(1 / \xi)} $$ do have such expansions.

Find all the regular singular points of the given differential equation. Determine the indicial equation and the exponents at the singularity for each regular singular point. \(x^{2} y^{\prime \prime}+3(\sin x) y^{\prime}-2 y=0\)

Suppose that \(x^{r_{1}}\) and \(x^{r_{2}}\) are solutions of an Euler equation for \(x>0,\) where \(r_{1} \neq r_{2},\) and \(r_{1}\) is an integer. According to Eq. (24) the general solution in any interval not containing the origin is \(y=c_{1}|x|^{r_{1}}+c_{2}|x|^{r_{2}} .\) Show that the general solution can also be written as \(y=k_{1} x^{r_{1}}+k_{2}|x|^{r_{2}} .\) Hint: Show by a proper choice of constants that the expressions are identical for \(x>0,\) and by a different choice of constants that they are identical for \(x<0 .\)
