Chemistry LibreTexts

The stationary phase approximation

Consider the simple integral: 

\[ I = \lim_{\lambda\rightarrow\infty}\int_{-\infty}^{\infty}dx\;e^{-\lambda f(x)} \]

Assume \(f (x) \) has a global minimum at \(x = x_0 \), such that \(f' (x_0) = 0 \). If this minimum is well separated from other minima of \(f (x) \), and the value of \(f (x) \) at the global minimum is significantly lower than its value at the other minima, then the dominant contributions to the above integral as \(\lambda \rightarrow \infty \) come from the integration region around \(x_0 \). Thus, we may expand \(f (x) \) about this point: 

\[ f(x) = f(x_0) + f'(x_0)(x-x_0) + {1 \over 2}f''(x_0)(x-x_0)^2 + \cdots \]

 

Since \(f' (x_0) = 0 \), this becomes: 

\[ f(x) \approx f(x_0) + {1 \over 2}f''(x_0)(x-x_0)^2 \]

 

Inserting the expansion into the expression for \(I\) gives 

 

\[ I = \lim_{\lambda\rightarrow\infty}e^{-\lambda f(x_0)}\int_{-\infty}^{\infty}dx\;e^{-{\lambda \over 2}f''(x_0)(x-x_0)^2} = \lim_{\lambda\rightarrow\infty}\left[{2\pi \over \lambda f''(x_0)}\right]^{1/2}e^{-\lambda f(x_0)} \]
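As a quick numerical sanity check (a sketch, not part of the original derivation), the leading-order formula can be compared against direct numerical integration for a hypothetical test function \(f(x) = x^2 + x^4\), whose global minimum is at \(x_0 = 0\) with \(f(x_0)=0\) and \(f''(x_0)=2\):

```python
import numpy as np

# Hypothetical test function with a single global minimum at x0 = 0,
# where f(x0) = 0 and f''(x0) = 2.
def f(x):
    return x**2 + x**4

lam = 50.0

# direct numerical integration of exp(-lam*f(x)) on a fine grid
x = np.linspace(-3.0, 3.0, 200001)
dx = x[1] - x[0]
I_numeric = np.sum(np.exp(-lam * f(x))) * dx

# leading-order saddle-point estimate: sqrt(2*pi/(lam*f''(x0))) * exp(-lam*f(x0))
I_saddle = np.sqrt(2.0 * np.pi / (lam * 2.0))

rel_err = abs(I_numeric - I_saddle) / I_numeric
print(I_numeric, I_saddle, rel_err)
```

At \(\lambda = 50\) the two results agree to within a couple of percent, and the discrepancy shrinks as \(\lambda\) grows, consistent with the higher-order corrections considered next.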

 

 

 

Corrections can be obtained by further expansion of higher order terms. For example, consider the expansion of \(f (x) \) up to fourth order: 

\[ f(x) \approx f(x_0) + {1 \over 2}f''(x_0)(x-x_0)^2 + {1 \over 6}f'''(x_0)(x-x_0)^3+ {1 \over 24}f^{(iv)}(x_0)(x-x_0)^4 \]

 

Substituting this expansion into the integrand and further expanding the exponential gives, since the cubic term integrates to zero by symmetry, the lowest order nonvanishing correction: 

\[ I = \lim _{\lambda \rightarrow \infty } e^{-\lambda f(x_0) } \int _{-\infty}^{\infty} dx\; e^{-{\lambda \over 2} f'' (x_0) (x - x_0)^2} \left [ 1 - {\lambda \over 24} f^{(iv)} (x_0) (x - x_0 )^4 \right ] \]
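The Gaussian average of the quartic term can be evaluated explicitly using \(\langle (x-x_0)^4\rangle = 3/[\lambda f''(x_0)]^2\); filling in this step (not written out in the original) gives the corrected asymptotic formula

\[ I \approx \left[{2\pi \over \lambda f''(x_0)}\right]^{1/2} e^{-\lambda f(x_0)}\left[1 - {f^{(iv)}(x_0) \over 8\lambda \left[f''(x_0)\right]^2}\right] \]

so the relative correction to the leading-order result vanishes as \(1/\lambda\).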

This approximation is known as the stationary phase or saddle point approximation. The former name may seem a little out of place, since there is no phase in the problem as formulated; it anticipates the application to the path integral, where \(\lambda \) is an imaginary rather than a real quantity.

The application to the path integral follows via a similar argument. Consider the path integral expression for the density matrix: 

\[ \rho(x,x';\beta) = \int_{x(0)=x}^{x(\beta\hbar)=x'}{\cal D}[x]e^{-S_{\rm E}[x]/\hbar} \]

We showed that the classical path satisfying 

 

\begin{displaymath}
m\ddot{x}_{\rm cl} = \left.{\partial U \over \partial x}\right\vert _{x=x_{\rm cl}},
\qquad x_{\rm cl}(0) = x,\qquad x_{\rm cl}(\beta\hbar)=x'
\end{displaymath}

 

 

is a stationary point of the Euclidean action $S_{\rm E}[x]$, i.e., $\delta S_{\rm E}[x_{\rm cl}]=0$. Thus, we can develop a stationary phase or saddle point approximation for the density matrix by introducing an expansion about the classical path according to 

 

 

\begin{displaymath}
x(\tau) = x_{\rm cl}(\tau) + y(\tau) = x_{\rm cl}(\tau) + \sum_{n}c_n\phi_n(\tau)
\end{displaymath}

 

 

where the correction $y(\tau)$, satisfying $y(0)=y(\beta\hbar)=0$, has been expanded in a complete set of functions $\{\phi_n(\tau)\}$ that satisfy $\phi_n(0)=\phi_n(\beta\hbar)=0$ and are orthonormal on the interval $[0,\beta\hbar]$: 

 

 

\begin{displaymath}
\int_0^{\beta\hbar}\;d\tau\;\phi_n(\tau)\phi_m(\tau) = \delta_{mn}
\end{displaymath}
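For illustration (an assumed choice, since the original does not fix a particular basis), the sine functions \(\phi_n(\tau) = \sqrt{2/\beta\hbar}\,\sin(n\pi\tau/\beta\hbar)\) satisfy both the endpoint conditions and the orthonormality condition, as a quick numerical check confirms:

```python
import numpy as np

beta_hbar = 1.0  # hypothetical value of the interval length beta*hbar

def phi(n, tau):
    # sine basis: vanishes at tau = 0 and tau = beta*hbar
    return np.sqrt(2.0 / beta_hbar) * np.sin(n * np.pi * tau / beta_hbar)

tau = np.linspace(0.0, beta_hbar, 100001)
dt = tau[1] - tau[0]

def overlap(n, m):
    # trapezoid-rule approximation to the overlap integral of phi_n and phi_m
    y = phi(n, tau) * phi(m, tau)
    return (np.sum(y) - 0.5 * (y[0] + y[-1])) * dt

print(overlap(1, 1), overlap(1, 2), overlap(3, 3))
```

The diagonal overlaps come out as 1 and the off-diagonal ones as 0, to within quadrature error.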

 

 

Setting all the expansion coefficients to 0 recovers the classical path. Thus, we may expand the action $S[x]$ (the ``E'' subscript will henceforth be dropped from this discussion) with respect to the expansion coefficients: 

 

 

\begin{displaymath}
S[x] = S[x_{\rm cl}] + \sum_j \left.{\partial S \over \partial c_j}\right\vert _{\{c\}=0}c_j
+ {1 \over 2}\sum_{j,k}\left.{\partial^2 S \over \partial c_j\partial c_k}\right\vert _{\{c\}=0}c_jc_k
+ \cdots
\end{displaymath}

 

 

Since 

 

 

\begin{displaymath}
S[x] = \int_0^{\beta\hbar}d\tau\left[{1 \over 2}m\dot{x}^2 + U(x(\tau))\right]
\end{displaymath}
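To make the stationarity of the action at the classical path concrete, here is a small numerical sketch (using an assumed simple discretization, not anything specified in the original): for a free particle (\(U = 0\)), the straight-line path between the endpoints has a smaller discretized Euclidean action than an endpoint-preserving deformation of it:

```python
import numpy as np

m, beta_hbar = 1.0, 1.0  # hypothetical parameter values
N = 2000
tau = np.linspace(0.0, beta_hbar, N + 1)
dt = tau[1] - tau[0]

def euclidean_action(path, U=lambda q: 0.0 * q):
    # discretized S_E: sum over time slices of (m/2) v^2 + U, times dt
    v = np.diff(path) / dt
    kinetic = 0.5 * m * np.sum(v**2) * dt
    potential = np.sum(U(0.5 * (path[1:] + path[:-1]))) * dt
    return kinetic + potential

# free particle from x = 0 to x' = 1: the classical path is a straight line
x_cl = tau / beta_hbar
# deformation y(tau) that vanishes at both endpoints
x_pert = x_cl + 0.1 * np.sin(np.pi * tau / beta_hbar)

print(euclidean_action(x_cl), euclidean_action(x_pert))
```

The straight-line path gives \(S_E = m/2\beta\hbar = 0.5\) here, and any such deformation raises it.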

 

 

the expansion can be worked out straightforwardly by substitution and subsequent differentiation: 

 

\begin{eqnarray*}
S[x] &=& \int_0^{\beta\hbar}d\tau\left[{1 \over 2}m\left(\dot{x}_{\rm cl} +
\sum_n c_n\dot{\phi}_n\right)^2 + U\left(x_{\rm cl} + \sum_n c_n\phi_n\right)\right] \\
{\partial S \over \partial c_j} &=& \int_0^{\beta\hbar}d\tau\left[m\left(\dot{x}_{\rm cl} + \sum_n c_n\dot{\phi}_n\right)\dot{\phi}_j
+ U'\left(x_{\rm cl} + \sum_n c_n\phi_n\right)\phi_j\right] \\
\left.{\partial S \over \partial c_j}\right\vert _{\{c\}=0} &=& \int_0^{\beta\hbar}d\tau\left[m\dot{x}_{\rm cl}\dot{\phi}_j +
U'(x_{\rm cl})\phi_j\right] \\
&=& \left. m\dot{x}_{\rm cl}\phi_j\right\vert _{0}^{\beta\hbar} +
\int_0^{\beta\hbar}d\tau \left[-m\ddot{x}_{\rm cl} + U'(x_{\rm cl})\right]\phi_j \\
&=& 0 \\
{\partial^2 S \over \partial c_j\partial c_k} &=& \int_0^{\beta\hbar}d\tau\left[m\dot{\phi}_j\dot{\phi}_k +
U''\left(x_{\rm cl} + \sum_n c_n\phi_n\right)\phi_j\phi_k\right] \\
\left.{\partial^2 S \over \partial c_j\partial c_k}\right\vert _{\{c\}=0} &=& \int_0^{\beta\hbar}d\tau\left[m\dot{\phi}_j\dot{\phi}_k + U''(x_{\rm cl})\phi_j\phi_k\right] \\
&=& \int_0^{\beta\hbar}d\tau\left[-m\phi_j\ddot{\phi}_k + U''(x_{\rm cl}(\tau))\phi_j\phi_k\right] \\
&=& \int_0^{\beta\hbar}d\tau\; \phi_j(\tau)\left[-m{d^2 \over d\tau^2} +
U''(x_{\rm cl}(\tau))\right]\phi_k(\tau)
\end{eqnarray*}

 

 

where the fourth and eighth lines are obtained from an integration by parts. Let us write the integral in the last line in the suggestive form: 

 

 

\begin{displaymath}
\left.{\partial^2 S \over \partial c_j\partial c_k}\right\vert _{\{c\}=0}
= \langle\phi_j\vert -m{d^2 \over d\tau^2} + U''(x_{\rm cl}(\tau))\vert\phi_k\rangle
= \Delta_{jk}
\end{displaymath}

 

 

which emphasizes the fact that we have matrix elements of the operator $-md^2/d\tau^2 + U''(x_{\rm cl}(\tau))$ with respect to the basis functions. Thus, the expansion for $S$ can be written as 

 

 

\begin{displaymath}
S[x] = S[x_{\rm cl}] + {1 \over 2}\sum_{j,k}c_j\Delta_{jk}c_k + \cdots
\end{displaymath}

 

 

and the density matrix becomes 

 

 

\begin{displaymath}
\rho(x,x';\beta) = {\cal N}\int \prod_j {dc_j \over \sqrt{2\pi\hbar}}\;
e^{-S_{\rm cl}(x,x';\beta)/\hbar}\;
e^{-{1 \over 2}\sum_{j,k}c_j\Delta_{jk}c_k/\hbar}
\end{displaymath}

 

 

where $S_{\rm cl}(x,x';\beta)=S[x_{\rm cl}]$ and ${\cal N}$ is an overall normalization constant. The integral over the coefficients is a generalized Gaussian integral, which brings down a factor of $1/\sqrt{{\rm det}\,\Delta}$: 

 

\begin{eqnarray*}
\rho(x,x';\beta) &=& {\cal N}e^{-S_{\rm cl}(x,x';\beta)/\hbar}\,{1 \over \sqrt{{\rm det}\,\Delta}} \\
&=& {\cal N}e^{-S_{\rm cl}(x,x';\beta)/\hbar}\,
{1 \over \sqrt{{\rm det}
\left(-m{d^2 \over d\tau^2} + U''(x_{\rm cl}(\tau))\right)}}
\end{eqnarray*}

 

 

where the last line is the abstract representation of the determinant, which is known as the Van Vleck-Pauli-Morette determinant.
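The determinant factor can be checked directly in low dimensions. The following sketch (with a hypothetical 2&times;2 positive-definite matrix standing in for $\Delta$, and $\hbar$ set to 1) compares a brute-force two-dimensional quadrature of the Gaussian with the closed form $(2\pi)^{n/2}/\sqrt{{\rm det}\,\Delta}$:

```python
import numpy as np

# hypothetical positive-definite 2x2 matrix standing in for Delta_{jk}
Delta = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# brute-force quadrature of exp(-c^T Delta c / 2) over a large box
c = np.linspace(-8.0, 8.0, 801)
dc = c[1] - c[0]
C1, C2 = np.meshgrid(c, c, indexing="ij")
quadratic = (Delta[0, 0] * C1**2 + 2.0 * Delta[0, 1] * C1 * C2
             + Delta[1, 1] * C2**2)
I_quad = np.sum(np.exp(-0.5 * quadratic)) * dc * dc

# closed form: (2*pi)^(n/2) / sqrt(det Delta), here with n = 2
I_exact = 2.0 * np.pi / np.sqrt(np.linalg.det(Delta))

print(I_quad, I_exact)
```

The two values agree to quadrature accuracy, illustrating why the coefficient integrals collapse to an inverse square-root determinant.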

If we choose the basis functions $\phi_n(\tau)$ to be eigenfunctions of the operator appearing in the above expression, so that they satisfy 

 

\begin{displaymath}
\left[-m{d^2 \over d\tau^2} + U''(x_{\rm cl}(\tau))\right]\phi_n(\tau) = \lambda_n
\phi_n(\tau)
\end{displaymath}

 

 

then 

 

 

\begin{displaymath}
\Delta_{jk} = \lambda_j \delta_{jk} = \lambda_j(x,x';\beta)\delta_{jk}
\end{displaymath}

 

 

and the determinant can be expressed as a product of the eigenvalues. Thus, 

 

 

\begin{displaymath}
\rho(x,x';\beta) = {\cal N}e^{-S_{\rm cl}(x,x';\beta)}\prod_j {1 \over \sqrt{
\lambda_j(x,x';\beta)}}
\end{displaymath}

 

 

The product must exclude any zero eigenvalues.
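As a concrete illustration (not part of the original derivation), take a harmonic potential, $U''(x) = m\omega^2$, with the sine basis on $[0,\beta\hbar]$. The fluctuation operator then has eigenvalues $\lambda_n = m[(n\pi/\beta\hbar)^2 + \omega^2]$, and the ratio of its determinant to the free-particle one converges to the known closed form $\sinh(\omega\beta\hbar)/(\omega\beta\hbar)$:

```python
import math

omega, beta_hbar = 2.0, 1.0  # hypothetical parameter values

# eigenvalue ratio lambda_n(omega)/lambda_n(0) = 1 + (omega*beta_hbar/(n*pi))^2;
# the infinite product is truncated at a large n_max
ratio = 1.0
for n in range(1, 200001):
    ratio *= 1.0 + (omega * beta_hbar / (n * math.pi)) ** 2

exact = math.sinh(omega * beta_hbar) / (omega * beta_hbar)
print(ratio, exact)
```

Taking ratios of determinants in this way also sidesteps the fact that each infinite product diverges on its own; the divergence is absorbed into the normalization ${\cal N}$.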

Incidentally, by performing a Wick rotation back to real time according to $\beta = it/\hbar$, the saddle point or stationary phase approximation to the real-time propagator can be derived. The derivation is somewhat tedious and will not be given in detail here, but the result is 

 

\begin{displaymath}
U(x,x';t) = e^{{i \over \hbar}S_{\rm cl}(x,x';t)}\,
{1 \over \sqrt{{\rm det}
\left(-m{d^2 \over dt^2} - U''(x_{\rm cl}(t))\right)}}\,e^{-i\pi\nu/2}
\end{displaymath}

 

 

where $x_{\rm cl}(t)$ satisfies 

 

\begin{displaymath}
m\ddot{x}_{\rm cl} = -\left.{\partial U \over \partial x}\right\vert _{x=x_{\rm cl}},
\qquad x_{\rm cl}(0) = x,\qquad x_{\rm cl}(t)=x'
\end{displaymath}

 

 

and $\nu$ is an integer that increases by 1 each time the determinant vanishes along the classical path. $\nu$ is called the Maslov index. It is important to note that because the classical paths satisfy an endpoint problem, rather than an initial value problem, there can be more than one solution. In this case, one must sum the result over classical paths: 

 

 

\begin{displaymath}
U(x,x';t) = \sum_{{\rm classical\ paths}}
e^{{i \over \hbar}S_{\rm cl}(x,x';t)}\,
{1 \over \sqrt{{\rm det}
\left(-m{d^2 \over dt^2} - U''(x_{\rm cl}(t))\right)}}\,e^{-i\pi\nu/2}
\end{displaymath}

 

 

with a similar sum for the density matrix.

Contributors

Mark Tuckerman (New York University)