Math 555: Differential Equations I




Lecture Companion

§2.8: Picard's Iterative Method




Suppose we wish to solve an initial value problem (IVP) of the form

$$ \begin{cases} \frac{dy}{dt} = f(t,y), \\[0.5 ex] y(0) = 0. \end{cases} \tag{1} $$
The first step is to apply the Fundamental Existence and Uniqueness Theorem (FEUT) to verify that a solution indeed exists.

Recall that for a nonlinear equation, the FEUT states that a unique solution $y = y(t)$ exists on some interval $-\varepsilon < t < \varepsilon$, $\varepsilon > 0$, provided the functions $f$ and $\partial f/\partial y$ are continuous on a rectangle containing the initial point $(0,0)$. If this criterion is satisfied, then it is possible to find the solution curve $y = y(t)$ passing through $(0,0)$, though perhaps only implicitly.

Now that we know it will not be a waste of our energy to try to find a solution, we can use Picard's iterative method to find it, as follows.

The FEUT allows us to assume that $y$ is a function of $t$ on an interval containing $t=0$. Thus the right-hand side of equation $(1)$ may be regarded as $f(t,y(t))$, a function of the single variable $t$. Integrating both sides from $0$ to $t$ and using the initial condition $y(0) = 0$, we see that any solution $\varphi$ of the IVP $(1)$ must also solve the integral equation

$$ y(t) = \int_0^t f\big(s,y(s)\big)\, ds. \tag{2} $$
Indeed, the Fundamental Theorem of Calculus tells us that differentiating equation $(2)$ yields $y'(t) = f(t,y(t))$, and setting $t = 0$ in $(2)$ recovers the initial condition $y(0) = 0$, so any solution of the integral equation also solves the IVP.

Notice that the constant function $\varphi_0(t) = 0$ satisfies the initial condition of the IVP $(1)$. In general, however, this simple guess will not be a solution of the differential equation $(1)$.

The function $\varphi_0$ may be plugged into the right-hand side of the integral equation $(2)$ and integrated to obtain a new function,

$$ \varphi_1(t) = \int_0^t f\big(s,\varphi_0(s)\big)\, ds. \tag{3} $$
This function $\varphi_1$ again obeys the initial condition of the IVP $(1)$. And again, it is probably not a solution to the differential equation. But the act of integrating generally makes things nicer, and $\varphi_1$ is "closer" to being a solution of the differential equation than $\varphi_0$ was.

Picard's method is to iterate this process and obtain a sequence of functions $\varphi_n$ which approach the unique solution $\varphi$ of the IVP as $n$ tends to $\infty$.

$$ \varphi_n(t) = \int_0^t f\big(s,\varphi_{n-1}(s)\big)\, ds \tag{4} $$
$$ \varphi(t) = \lim_{n \to \infty} \varphi_n(t) \tag{5} $$
The precise details of why this procedure works to give the desired unique solution are left for a future course.

We look at some examples below.




Example 1. First, we write a Sage function to perform $n$ iterations of the Picard method on an IVP with initial value $(0,0)$.

This function will be used in the rest of the examples on this page, so the ${\tt SageCell}$ below must be run before the other cells on the page.
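A minimal sketch of such a function is given below. The name ${\tt picard}$ and the signature ${\tt picard(f, n)}$, where $f$ is a symbolic expression in $t$ and $y$, are assumptions chosen to match the way the function is used in the examples that follow.

```
# A Picard-iteration helper for an IVP  y' = f(t, y),  y(0) = 0.
# The name `picard` and its signature are assumptions.
t, y, s = var('t y s')

def picard(f, n):
    """Return the list [phi_0, phi_1, ..., phi_n] of Picard iterates,
    where f is a symbolic expression in t and y."""
    iterates = [0*t]                       # phi_0(t) = 0
    for _ in range(n):
        prev = iterates[-1]
        # phi_{k+1}(t) = integral_0^t f(s, phi_k(s)) ds
        integrand = f.subs(y == prev).subs(t == s)
        iterates.append(integrate(integrand, s, 0, t))
    return iterates
```

Each pass through the loop applies equation $(4)$: substitute the previous iterate for $y$, rename the variable of integration to $s$, and integrate from $0$ to $t$.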



We test this function with the initial value problem $$ \begin{cases} y' = t-y, \\ y(0) = 0. \end{cases} $$
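Assuming the ${\tt picard}$ sketch above, the test cell might look like the following; it prints the iterates $\varphi_0, \ldots, \varphi_5$.

```
# Assumed usage: five Picard iterates for the IVP  y' = t - y,  y(0) = 0.
phis = picard(t - y, 5)
for k, phi in enumerate(phis):
    print("phi_{}(t) = {}".format(k, phi))
```

For this IVP the iterates turn out to be partial sums of the Taylor series of the exact solution $y = e^{-t} + t - 1$.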



Now we plot each approximation on the same set of axes.
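A sketch of the plotting cell is below; it draws the five iterates $\varphi_1, \ldots, \varphi_5$, with the final (best) iterate in red to match the description that follows. The plotting window and the colors are assumptions.

```
# Plot phi_1, ..., phi_5 on one set of axes.
# Window and colors are assumptions; the last iterate is drawn in red.
colors = ['blue', 'green', 'orange', 'purple', 'red']
p = Graphics()
for phi, col in zip(phis[1:], colors):
    p += plot(phi, (t, -1.5, 1.5), color=col)
p.show()
```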



None of the curves in this graph are solutions to the initial value problem. The red curve is the "best approximation" among the 5 curves plotted.




Example 2. Consider the initial value problem

$$ \begin{cases} \frac{dy}{dt} = 2y, \\[0.5 ex] y(0) = 1. \end{cases} \tag{6} $$
Notice that the initial data for this problem is not $(0,0)$, so our first piece of business is to "shift the problem to the origin" via a change of coordinates. For this problem, $t_0 = 0$, but $y_0 \neq 0$, so we need only shift the $y$-values. We make the change of variables

$$ u = y-1. $$
Then $u(0) = 0$, $du/dt = dy/dt$, and $y = u + 1$. The DE $(6)$ becomes

$$ \begin{cases} \frac{du}{dt} =2(u + 1), \\[0.5 ex] u(0) = 0. \end{cases} \tag{7} $$
We may use the Picard iterative method on the IVP $(7)$.
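A sketch of this computation, reusing the symbol $y$ for the shifted variable $u$ (the choice of five iterations is an assumption):

```
# Picard iterates for  du/dt = 2(u + 1),  u(0) = 0,
# reusing the symbol y for the shifted variable u.
phis = picard(2*(y + 1), 5)
phis
```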



These $\varphi_i$ are approximate solutions of the differential equation $(7)$ for $u$. The approximate solutions to the original differential equation for $y$ are $\psi_i(t) = \varphi_i(t) + 1$, since $y = u + 1$.
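Assuming the $\varphi_i$ computed above, the shift back is a one-line list comprehension:

```
# Undo the shift: psi_i(t) = phi_i(t) + 1 approximates y(t) for the original IVP (6).
psis = [phi + 1 for phi in phis]
psis
```

Since the exact solution of $(7)$ is $u = e^{2t} - 1$, these iterates are partial sums of the series for $e^{2t} - 1$, so the $\psi_i$ are partial sums of the series for $y = e^{2t}$.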



We plot the slope field and the approximate solutions. Remember, none of the $\psi_i$ are actually solution curves to the differential equation. The graphs approximate the solution curve near the initial point.
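One way to produce this picture in Sage is sketched below; the plotting window and the colors are assumptions.

```
# Slope field for dy/dt = 2y together with the approximations psi_i.
# Window and colors are assumptions.
p = plot_slope_field(2*y, (t, -1, 1), (y, 0, 4))
for psi, col in zip(psis, rainbow(len(psis))):
    p += plot(psi, (t, -1, 1), color=col)
p.show(ymin=0, ymax=4)
```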





Example 3. Consider the initial value problem

$$ \begin{cases} \frac{dy}{dt} = t^2 + y^2, \\[0.5 ex] y(1) = 2. \end{cases} \tag{8} $$
Again, we must shift the initial point to the origin in order to use the ${\tt picard}$ function. Let $s = t - 1$ and $u = y - 2$. The initial value problem becomes

$$ \begin{cases} \frac{du}{ds} = (s + 1)^2 + (u + 2)^2, \\[0.5 ex] u(0) = 0. \end{cases} \tag{9} $$
We may now use our ${\tt picard}$ function to compute the approximate solutions as we did in the previous examples.
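A sketch of this computation, with the shift undone at the end (the choice of four iterations is an assumption; the expressions grow quickly):

```
# Picard iterates for the shifted IVP  du/ds = (s + 1)^2 + (u + 2)^2,  u(0) = 0,
# reusing t, y for the shifted variables s, u.
phis = picard((t + 1)**2 + (y + 2)**2, 4)

# Undo the shift (s = t - 1, u = y - 2): psi_i(t) = phi_i(t - 1) + 2.
psis = [phi.subs(t == t - 1) + 2 for phi in phis]
```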



We print the functions $\psi_i$, if for no other reason than as evidence of why we would prefer not to do this one by hand.
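A sketch of the printing cell:

```
# Print the psi_i; the expressions grow rapidly with each iteration.
for k, psi in enumerate(psis):
    print("psi_{}(t) = {}".format(k, psi.expand()))
    print()
```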



Finally, we look at the slope field with the plots of the functions $\psi_i$.
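A sketch of the final plot; the window around the initial point $(1,2)$ and the colors are assumptions.

```
# Slope field for dy/dt = t^2 + y^2 near the initial point (1, 2), with the psi_i.
# Window and colors are assumptions.
p = plot_slope_field(t**2 + y**2, (t, 0.5, 1.5), (y, 0, 4))
for psi, col in zip(psis, rainbow(len(psis))):
    p += plot(psi, (t, 0.5, 1.5), color=col)
p.show(ymin=0, ymax=4)
```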








Your use of Wichita State University content and this material is subject to our Creative Commons License.