
\newcommand{\definedas}{\stackrel{\triangle}{=}}

\newcommand{\changevareg}[1]{\begin{quote}{\tt #1} \end{quote}}


\subsection{Introduction}
The mathematics behind a change of the independent variable(s) in differential
equations is quite straightforward: it is essentially an application of the
chain rule. If the dependent variable of the differential equation is $F$,
the independent variables are $x_{i}$ and the new independent variables are
$u_{i}$ (where ${\scriptstyle i=1\ldots n}$), then the first derivatives are:
\[
    \frac{\partial F}{\partial x_{i}} = \frac{\partial F}{\partial u_{j}}
                                        \frac{\partial u_{j}}{\partial x_{i}}
\]
Einstein's summation convention is assumed. The problem is to
calculate the $\partial u_{j}/\partial x_{i}$ terms when the change of variables
is given by
\[
    x_{i} = f_{i}(u_{1},\ldots,u_{n})
\]
A first thought might be to solve the equations above for the $u_{j}$,
differentiate the results with respect to the $x_{i}$, and then substitute the
new variables for the old ones in the calculated derivatives. This is not
always a preferable way to proceed, mainly because the functions $f_{i}$ may
not be easily invertible. A better approach makes use of the Jacobian.
Consider the equations above, which relate the old variables to the new ones,
and differentiate them:
\begin{eqnarray*}
  \frac{\partial x_{j}}{\partial x_{i}} & = &
        \frac{\partial f_{j}}{\partial x_{i}}   \\
  \delta_{ij} & = &
        \frac{\partial f_{j}}{\partial u_{k}}
        \frac{\partial u_{k}}{\partial x_{i}}
\end{eqnarray*}
The derivative $\partial f_{j}/\partial u_{k}$ is nothing but the $(j,k)$-th
entry of the Jacobian matrix.

So, in matrix language,
\[ {\bf 1 = J \cdot D} \]
where we define the Jacobian
\[ {\bf J}_{ij} \definedas  \frac{\partial f_{i}}{\partial u_{j}} \]
and the matrix of the derivatives we want to obtain as
\[ {\bf D}_{ij} \definedas  \frac{\partial u_{i}}{\partial x_{j}}. \]
If the Jacobian has a non-vanishing determinant, then it is invertible and
from the matrix equation above we can write
\[ {\bf  D = J^{-1}} \]
so finally we have what we want:
\[
   \frac{\partial u_{i}}{\partial x_{j}} = \left[{\bf J^{-1}}\right]_{ij}
\]
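This inverse-Jacobian construction can be checked independently of REDUCE. The following Python sketch (an illustration only, not part of the package) builds ${\bf J}$ for the polar-coordinate change $x = r\cos\theta$, $y = r\sin\theta$ and inverts it to obtain the $\partial u_{i}/\partial x_{j}$:

```python
import math

def jacobian_polar(r, theta):
    # J[i][j] = d f_i / d u_j  for  x = r*cos(theta), y = r*sin(theta)
    return [[math.cos(theta), -r * math.sin(theta)],
            [math.sin(theta),  r * math.cos(theta)]]

def inverse_2x2(m):
    # Inverse of a 2x2 matrix via the adjugate; the determinant
    # (here det J = r) must be non-vanishing
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

# D[i][j] = d u_i / d x_j is the inverse Jacobian, evaluated at a sample point
D = inverse_2x2(jacobian_polar(2.0, 0.5))
```

At the sample point $(r,\theta)=(2,\,0.5)$ the entries of ${\bf D}$ agree with the closed forms $\partial r/\partial x = \cos\theta$, $\partial r/\partial y = \sin\theta$, $\partial\theta/\partial x = -\sin\theta/r$ and $\partial\theta/\partial y = \cos\theta/r$.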

Higher derivatives are obtained by successive application of the chain
rule, using the definitions of the old variables in terms of the new ones.
It can easily be verified that the only derivatives that need to be
calculated are the first-order ones obtained above.

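For example, one more application of the chain rule gives the second derivatives:
\[
  \frac{\partial^{2} F}{\partial x_{i}\,\partial x_{j}} =
     \frac{\partial^{2} F}{\partial u_{k}\,\partial u_{l}}
     \frac{\partial u_{k}}{\partial x_{i}}
     \frac{\partial u_{l}}{\partial x_{j}} +
     \frac{\partial F}{\partial u_{k}}
     \frac{\partial^{2} u_{k}}{\partial x_{i}\,\partial x_{j}},
  \qquad
  \frac{\partial^{2} u_{k}}{\partial x_{i}\,\partial x_{j}} =
     \frac{\partial}{\partial u_{l}}
     \left(\frac{\partial u_{k}}{\partial x_{i}}\right)
     \frac{\partial u_{l}}{\partial x_{j}},
\]
so every term reduces to the first-order derivatives $\partial u_{k}/\partial x_{i}$, i.e.\ entries of ${\bf J^{-1}}$, and their derivatives with respect to the new variables.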
\subsection{How to Use CHANGEVR}

First load {\tt CHANGEVR} with the statement:
\changevareg{LOAD\_PACKAGE CHANGEVR\$}
\ttindextype{CHANGEVAR}{operator}
Now the REDUCE function {\tt CHANGEVAR} is ready to use. {\bf Note: the
package is named CHANGEVR, but the function is named CHANGEVAR.} The
function \f{CHANGEVAR} takes (at least) four arguments. Here we
list them:
\begin{itemize}
\item {\bf FIRST ARGUMENT} \\
     Is a list of the dependent variables of the differential equation.
     They must be enclosed in a pair of curly braces and separated by commas.
     If there is only one dependent variable, the curly braces may be omitted.
\item {\bf SECOND ARGUMENT}  \\
     Is a list of the {\bf new} independent variables. As for the first
     argument, these must be separated by commas and enclosed in curly
     braces, and the curly braces may be omitted if there is only one new
     variable.
\item {\bf THIRD ARGUMENT}  \\
     Is a list of equations separated by commas, where each equation
     is of the form
      \changevareg{{\em old variable} = {\em a function in new variables}}
     The left-hand side cannot be a non-kernel structure. In this argument
     the functions which give the old variables in terms of the new ones are
     introduced. It is possible to omit the curly braces which enclose
     the list entirely. {\bf Please note that only for this argument may the
     curly braces be omitted even if the list has \underline{more than one}
     item.}
\item {\bf LAST ARGUMENT}  \\
     Is a list of algebraic expressions which evaluate to differential
     equations, separated by commas and enclosed in curly braces.
     So, variables in which differential equations are already stored may be
     used freely. Again, the curly braces may be omitted if there is
     only {\bf one} differential equation.
\end{itemize}

If the last argument is a list, then the result of {\tt CHANGEVAR} is also a
list.

It is possible to display the entries of the inverse Jacobian explained
in the introduction. To do so, turn {\tt ON} the flag {\tt DISPJACOBIAN} with
the statement: \changevareg{ON DISPJACOBIAN;}\ttindextype{DISPJACOBIAN}{switch}

\subsection{AN EXAMPLE\ldots\ldots The 2-dim. Laplace Equation}
The 2-dimensional Laplace equation in Cartesian coordinates is:
\[
   \frac{\partial^{2} u}{\partial x^{2}} +
   \frac{\partial^{2} u}{\partial y^{2}} = 0
\]
Now assume we want to obtain the polar coordinate form of the Laplace
equation. The change of variables is:
\[
   x = r \cos \theta, \qquad  y = r \sin \theta
\]
The solution using {\tt CHANGEVAR} (after the package is properly loaded)
is as follows:
\changevareg{CHANGEVAR(\{u\},\{r,theta\},\{x=r*cos theta,y=r*sin theta\}, \\
    \hspace*{2cm}     \{df(u(x,y),x,2)+df(u(x,y),y,2)\} )}
Here we could omit the curly braces in the first and last arguments (because
those lists have only one member) and the curly braces in the third argument
(because they are always optional), but the curly braces in the second
argument cannot be omitted. So one could equivalently write
\changevareg{CHANGEVAR(u,\{r,theta\},x=r*cos theta,y=r*sin theta,        \\
    \hspace*{2cm}     df(u(x,y),x,2)+df(u(x,y),y,2) )}
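For comparison, the polar form of the Laplace equation that this call should reproduce (once $\cos^{2}\theta + \sin^{2}\theta$ is simplified to $1$) is the well-known expression
\[
   \frac{\partial^{2} u}{\partial r^{2}} +
   \frac{1}{r}\,\frac{\partial u}{\partial r} +
   \frac{1}{r^{2}}\,\frac{\partial^{2} u}{\partial \theta^{2}} = 0 .
\]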

If you have tried the above example, you will notice that the denominator
contains a $\cos^{2} \theta + \sin^{2} \theta$, which is actually equal to $1$.
This has nothing to do with the {\tt CHANGEVAR} facility introduced
here; such pattern matching problems have to be overcome by the
conventional methods REDUCE provides (a {\tt LET} statement, for example,
will fix it).

Secondly, you will notice that your {\tt u(x,y)} operator has changed to
{\tt u(r,theta)} in the result. There is nothing magical about this: it is
just what we do with pencil and paper. {\tt u(r,theta)} represents the
transformed dependent variable.

\subsection{ANOTHER EXAMPLE\ldots\ldots An Euler Equation}
Consider a differential equation of Euler type, for instance:
\[
   x^{3}y''' - 3 x^{2}y'' + 6 x y' - 6 y = 0
\]
where the prime denotes differentiation with respect to $x$. As is well known,
equations of Euler type are solved by the change of variable:
\[
   x = e^{u}
\]
So our {\tt CHANGEVAR} call reads as follows:
\changevareg{CHANGEVAR(y, u, x=e**u, x**3*df(y(x),x,3)-   \\
    \hspace*{2cm}   3*x**2*df(y(x),x,2)+6*x*df(y(x),x)-6*y(x))}
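As a check on the output, carrying out this substitution by hand with the operator identities $x\,y' = dy/du$, $x^{2}y'' = d^{2}y/du^{2} - dy/du$ and $x^{3}y''' = d^{3}y/du^{3} - 3\,d^{2}y/du^{2} + 2\,dy/du$ turns the equation into the constant-coefficient form
\[
   \frac{d^{3}y}{du^{3}} - 6\,\frac{d^{2}y}{du^{2}} + 11\,\frac{dy}{du} - 6 y = 0 ,
\]
whose characteristic polynomial factors as $(\lambda-1)(\lambda-2)(\lambda-3)$, giving the familiar solutions $y = c_{1}x + c_{2}x^{2} + c_{3}x^{3}$ in the original variable.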