Sunday, 15 January 2012

Using Jacobi equation to find particular solutions of certain second-order ODE

The Jacobi equation for a functional, serving as a test for a weak local extremum, can be derived in quite different ways. The following geometrical approach is found in the book "Calculus of Variations" by L.E. Elsgoltz, 1958. It hinges on the question of whether or not the stationary path can be embedded in a one-parameter family of stationary paths (a field of paths). There are two options: either no two paths of the family intersect, or all paths of the family share one common point (but not more) in the given interval.
For example, \(y=C\sin{x}\) forms a family of the first type in \((\delta,a)\), where \(\delta>0\), \(a<\pi\), and a family of the second type in \((0,a)\), \(a<\pi\). In the interval \((0,a)\), \(a\ge \pi\), no such family can be constructed.
Suppose we have a one-parameter family of stationary paths $$y=y(x,C)$$ For example, we can fix one of the boundary points and use the gradient of the paths at this point as the parameter C.
The envelope of this family is found by eliminating C from the following system of equations:
$$y=y(x,C) \qquad \frac{\partial y(x,C)}{\partial C}=0\quad(*)$$
Along each path of the family \(\frac{\partial y(x,C)}{\partial C}\) is a function of only \(x\). Denote this function as \(u(x)\) for some given C. Then \(u_x'=\frac{\partial^2 y(x,C)}{\partial C \partial x}\).
\(y=y(x,C)\) are solutions of the Euler-Lagrange equation by assumption. Therefore:
$$F_{y}\left(x,y(x,C),y'(x,C)\right)-\frac{d}{dx}F_{y'}\left(x,y(x,C),y'(x,C)\right)=0$$
Differentiating this equality with respect to C and letting \(u=\frac{\partial y(x,C)}{\partial C}\) we obtain:
$$F_{yy}u+F_{yy'}u'-\frac{d}{dx}\left(F_{yy'}u+F_{y'y'}u' \right)=0$$
Rearranging we get:
$$\left(F_{yy}-\frac{d}{dx}F_{yy'}\right)u-\frac{d}{dx}\left(F_{y'y'}u' \right)=0$$
which is precisely the Jacobi equation.
Thus, if \(u\) has a zero somewhere in the interval, it follows from (*) above that this zero is a common point of the stationary path and the envelope, i.e. a point conjugate to the left end of the interval.
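To make the conjugate-point picture concrete: the family \(y=C\sin x\) mentioned above arises, for instance, as the extremals through the origin of the functional \(\int(y'^{2}-y^{2})\,dx\) (an assumed, illustrative choice, not taken from the derivation above). A quick sympy check confirms that \(u=\partial y/\partial C=\sin x\) solves the corresponding Jacobi equation and vanishes at \(x=\pi\), the conjugate point:

```python
import sympy as sp

x, C = sp.symbols('x C')

# Illustrative functional F = y'^2 - y^2 (an assumption, not from the post):
# its extremals through the origin are exactly the family y = C sin x.
y = C * sp.sin(x)
u = sp.diff(y, C)                        # u = sin x, obtained without solving anything

# For this F: P = F_{y'y'} = 2 and Q = F_{yy} - d/dx F_{yy'} = -2,
# so the Jacobi equation reads Q*u - d/dx(P*u') = 0.
jacobi = -2*u - sp.diff(2*sp.diff(u, x), x)

print(sp.simplify(jacobi))               # 0 — u solves the Jacobi equation
print(sp.solve(u, x))                    # roots of u include x = pi, the conjugate point
```

This matches the earlier remark: no field of extremals exists on \((0,a)\) once \(a\ge\pi\).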
At first it seemed to me that this proof serves only a theoretical purpose, as another way of deriving the Jacobi equation. However, the idea behind it can be used to find a solution of the Jacobi equation without actually solving the equation itself!
Consider the following example.
$$S[y]=\int_{0}^{1}\frac{y'^{2}}{y^{4}}dx \quad y(0)=1; \quad y(1)=\frac{1}{2}$$
Since the integrand does not depend explicitly on \(x\), the Euler-Lagrange equation has the first integral \(F-y'F_{y'}=-\frac{y'^{2}}{y^{4}}=\mathrm{const}\), whence \(\frac{y'}{y^{2}}=\mathrm{const}\) and the general solution is
$$y=\frac{1}{C_{1}x+C_{2}}$$
(we can suppress \(\pm\), assuming it is absorbed by the constant)
Now apply boundary conditions:
$$y(0)=1\qquad y(1)=\frac{1}{2}$$
If we only use the first condition, that is fix the left boundary, we get the one-parametric family:
$$y=\frac{1}{Cx+1}$$
C being the gradient at \(x=0\) with a reversed sign. Now, following the idea described above, we can find \(u(x)\):
$$u(x)=\frac{\partial y(x,C)}{\partial C}=-\frac{x}{\left(Cx+1\right)^{2}}$$
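As a sanity check (a sympy sketch, not part of the original derivation), one can verify symbolically that the family \(y=1/(Cx+1)\) satisfies the Euler-Lagrange equation and that differentiating it with respect to \(C\) reproduces the \(u(x)\) above:

```python
import sympy as sp

x, C, yy, yp = sp.symbols('x C yy yp')
F = yp**2 / yy**4                          # the integrand F(y, y') = y'^2 / y^4

y = 1 / (C*x + 1)                          # one-parameter family with y(0) = 1
on_family = {yy: y, yp: sp.diff(y, x)}

# Euler-Lagrange equation F_y - d/dx F_{y'} = 0 along the family
EL = sp.diff(F, yy).subs(on_family) - sp.diff(sp.diff(F, yp).subs(on_family), x)
print(sp.simplify(EL))                     # 0 — every member of the family is an extremal

u = sp.diff(y, C)                          # a Jacobi-equation solution, no solving needed
print(sp.simplify(u + x/(C*x + 1)**2))     # 0 — i.e. u = -x/(Cx+1)^2
```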
Finally, setting \(C=1\) by virtue of the boundary condition \(y(1)=\frac{1}{2}\), the extremal is \(y=\frac{1}{x+1}\) and
$$u(x)=-\frac{x}{\left(x+1\right)^{2}}$$
Now we move on to derive the Jacobi equation through the coefficients of the second variation:
$$P(x)=\frac{\partial^{2}F}{\partial y'^{2}}=\frac{2}{y^{4}}=2(x+1)^{4}$$

$$Q(x)=\frac{\partial^{2}F}{\partial y^{2}}-\frac{d}{dx}\left(\frac{\partial^{2}F}{\partial y\partial y'}\right)=20\frac{y'^{2}}{y^{6}}+8\frac{d}{dx}\left(\frac{y'}{y^{5}}\right)=20(x+1)^{2}-24(x+1)^{2}=-4(x+1)^{2}$$

Inserting the above results into the equation \(Qu-\frac{d}{dx}\left(Pu'\right)=0\):
$$-4(x+1)^{2}u-\frac{d}{dx}\left(2(x+1)^{4}u'\right)=0$$
Instead of solving the equation which can be technically demanding, we shall verify if the expression for \(u(x)\) found above is a solution. We do not need a general solution in this case. All non-trivial solutions of a homogeneous equation of the second order satisfying the condition \(u(x_0)=0\) differ from each other only by a constant multiplier and thus have the same zeros.
$$u'=\frac{x-1}{(x+1)^{3}},\qquad \frac{d}{dx}\left(2(x+1)^{4}u'\right)=\frac{d}{dx}\Bigl(2(x-1)(x+1)\Bigr)=4x=-4(x+1)^{2}u$$
So we indeed have a solution.
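The verification itself is mechanical in sympy (a sketch; \(Q=-4(x+1)^{2}\) is the value of the second-variation coefficient along the extremal \(y=1/(x+1)\) computed above):

```python
import sympy as sp

x = sp.symbols('x')
u = -x / (x + 1)**2                        # candidate: u = dy/dC at C = 1
P = 2 * (x + 1)**4                         # F_{y'y'} along the extremal y = 1/(x+1)
Q = -4 * (x + 1)**2                        # F_{yy} - d/dx F_{yy'} along the extremal

jacobi = Q*u - sp.diff(P * sp.diff(u, x), x)
print(sp.simplify(jacobi))                 # 0 — u solves the Jacobi equation
```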
Since the integrand does not depend explicitly on \(x\), we can simplify the Jacobi equation by exchanging the roles of the variables:
$$S[x]=\int_{1/2}^{1}\left(-\frac{1}{x'y^{4}}\right)dy,\qquad x'=\frac{dx}{dy}$$
Since the new integrand \(G\) does not depend on \(x\), the Euler-Lagrange equation has the first integral \(G_{x'}=\mathrm{const}\):
$$\frac{1}{x'^{2}y^{4}}=\mathrm{const}$$
with the general solution \(x=-\frac{C_{1}}{y}+C_{2}\).
Again we leave one arbitrary constant to form a family, fixing the boundary point \(x=0\), \(y=1\):
$$x(y,C)=C\left(\frac{1}{y}-1\right)$$
$$u(y)=\frac{\partial x(y,C)}{\partial C}=\frac{1}{y}-1$$
$$P(y)=\frac{\partial^{2}G}{\partial x'^{2}}=-\frac{2}{x'^{3}y^{4}}=-\frac{2}{-\frac{1}{y^{6}}y^{4}}=2y^{2}$$
Since the transformed integrand does not depend explicitly on the dependent variable, \(Q\) vanishes and the Jacobi equation has the first integral:
$$Pu'=2y^{2}u'=\mathrm{const},\qquad u=-\frac{c}{2y}+D$$
With \(u(1)=0\) this gives \(u=\frac{c}{2}\left(1-\frac{1}{y}\right)\).
Thus, up to sign and a constant multiplier, we get the same expression for \(u\).
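This, too, can be checked mechanically (a sympy sketch under the same assumptions): with \(Q=0\) the Jacobi equation reduces to \(Pu'=\mathrm{const}\), and \(u(y)=1/y-1\) satisfies it:

```python
import sympy as sp

y = sp.symbols('y', positive=True)
u = 1/y - 1                                # u = dx/dC, vanishing at y = 1
P = 2 * y**2                               # G_{x'x'} along the extremal x = 1/y - 1

print(sp.simplify(P * sp.diff(u, y)))      # -2 — P*u' is indeed constant
print(sp.simplify(sp.diff(P * sp.diff(u, y), y)))  # 0 — the Jacobi equation holds
```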
