# 1.2: Matrix Mechanics


Most of our work will make use of the matrix mechanics formulation of quantum mechanics. The wavefunction is written as $$|\Psi\rangle$$ and referred to as a ket vector. Its Hermitian conjugate $$\langle\Psi|=(|\Psi\rangle)^{\dagger}$$ is a bra vector, where $$\langle a \Psi|=a^{*}\langle\Psi|$$. The product of a bra and a ket, $$\langle\alpha \mid \beta\rangle$$, is therefore an inner product (a scalar), whereas the product of a ket and a bra, $$|\beta\rangle\langle\alpha|$$, is an outer product (a matrix). This use of bra and ket vectors is known as Dirac notation.

In the matrix representation, $$|\Psi\rangle$$ is represented as a column vector for the expansion coefficients $$c_{i}$$ in a particular basis set.

$|\Psi\rangle=\left(\begin{array}{c} c_{1} \\ c_{2} \\ c_{3} \\ \vdots \end{array}\right) \label{14}$

The bra vector $$\langle\Psi|$$ refers to a row vector of the conjugate expansion coefficients $$c_{i}^{*}$$. Since wavefunctions are normalized, $$\langle\Psi \mid \Psi\rangle=1$$. Dirac notation has the advantage of brevity, often shortening the wavefunction to a simple abbreviated notation for the relevant quantum numbers in the problem. For instance, we can write eq. (1.1.7) as

$|\Psi\rangle=\sum_{i} c_{i}|i\rangle \label{15}$

where the sum is over all eigenstates and $$|i\rangle=\psi_{i}$$ denotes the $$i^{\text{th}}$$ eigenstate. Implicit in this equation is that the expansion coefficient for the $$i^{\text{th}}$$ eigenstate is $$c_{i}=\langle i \mid \Psi\rangle$$. With this brevity comes the tendency to hide some of the variables important to the description of the wavefunction. One has to be aware of this, and although we will use Dirac notation for most of our work, where detail is required, Schrödinger notation will be used.

The outer product $$|i\rangle\langle i|$$ is known as a projection operator because it can be used to project the wavefunction of the system onto the $$i^{\mathrm{th}}$$ eigenstate of the system as $$|i\rangle\langle i \mid \Psi\rangle=c_{i}|i\rangle$$. Furthermore, if we sum projection operators over the complete basis set, we obtain an identity operator

$\sum_{i}|i\rangle\langle i|=1 \label{16}$

which is a statement of the completeness of a basis set. The orthogonality of eigenfunctions (eq. (1.1.8)) is summarized as $$\langle i \mid j\rangle=\delta_{i j}$$.
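The projector algebra above is easy to verify numerically. Below is a minimal NumPy sketch for a two-state basis; the basis kets and expansion coefficients are illustrative choices, not taken from the text:

```python
import numpy as np

# Two-state basis kets as column vectors (illustrative basis)
ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])

# Normalized superposition |Psi> = c0|0> + c1|1>, with |c0|^2 + |c1|^2 = 1
c = np.array([0.6, 0.8j])
psi = c[0] * ket0 + c[1] * ket1

# Projection operator |0><0| picks out the |0> component: |0><0|Psi> = c0|0>
P0 = ket0 @ ket0.conj().T
proj = P0 @ psi

# Summing projectors over the complete basis gives the identity operator
identity = ket0 @ ket0.conj().T + ket1 @ ket1.conj().T
```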

The operator $$\hat{A}$$ is a square matrix that maps from one state to another

$\hat{A}\left|\Psi_{0}\right\rangle=\left|\Psi_{A}\right\rangle \label{17}$

and from eq. (1.1.6) the TISE is

$\hat{H}|\Psi\rangle=E|\Psi\rangle \label{18}$

where $$E$$ is the diagonal matrix of eigenvalues, which are obtained as the roots of the characteristic equation

$\operatorname{det}(H-E \mathbf{I})=0 \label{19}$
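As a numerical check of eqs. (\ref{18})–(\ref{19}), one can diagonalize a small Hermitian Hamiltonian; the 2×2 matrix below is an arbitrary illustration (units suppressed):

```python
import numpy as np

# An illustrative 2x2 Hermitian Hamiltonian
H = np.array([[1.0, 0.5],
              [0.5, 2.0]])

# Eigenvalues E and eigenvectors (columns of V) solve H|psi> = E|psi>
E, V = np.linalg.eigh(H)

# Each eigenvalue is a root of the characteristic equation det(H - E*I) = 0
residuals = [np.linalg.det(H - e * np.eye(2)) for e in E]
```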

The expectation value, a restatement of eq. (1.1.10), is written

$\langle A\rangle=\langle\Psi|\hat{A}| \Psi\rangle \label{20}$

or from eq. (\ref{15})

$\langle A\rangle=\sum_{i} \sum_{j} c_{i}^{*} c_{j} A_{i j} \label{21}$

where $$A_{i j}=\langle i|A| j\rangle$$ are the matrix elements of the operator $$\hat{A}$$. As we will see later, the matrix of expansion coefficients $$\rho_{i j}=c_{i}^{*} c_{j}$$ is known as the density matrix. From eq. (\ref{18}), we see that the expectation value of the Hamiltonian is the energy of the system,

$E=\langle\Psi|H| \Psi\rangle \label{22}$
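The two expressions for the expectation value, eqs. (\ref{20}) and (\ref{21}), can be compared directly. In this sketch the coefficient vector and the Hermitian matrix $$A$$ are illustrative choices:

```python
import numpy as np

# Illustrative normalized vector of expansion coefficients c_i
c = np.array([0.6, 0.8j])

# An illustrative Hermitian operator in the same basis
A = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, 3.0]])

# <A> = <Psi|A|Psi>  (eq. 20)
expval_direct = c.conj() @ A @ c

# Same result via the density matrix rho_ij = c_i* c_j  (eq. 21)
rho = np.outer(c.conj(), c)
expval_rho = np.sum(rho * A)  # sum_ij c_i* c_j A_ij
```

Both routes give the same value, and because $$A$$ is Hermitian the result is real.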

Hermitian operators play a special role in quantum mechanics. The Hermitian adjoint of an operator $$\hat{A}$$ is written $$\hat{A}^{\dagger}$$ and is defined as the conjugate transpose of $$\hat{A}$$: $$\hat{A}^{\dagger}=\left(\hat{A}^{T}\right)^{*}$$. From this we see $$\langle\hat{A} \psi \mid \phi\rangle=\left\langle\psi \mid \hat{A}^{\dagger} \phi\right\rangle$$. A Hermitian operator is one that is self-adjoint, i.e., $$\hat{A}^{\dagger}=\hat{A}$$. For a Hermitian operator, a unique unitary transformation exists that will diagonalize it.

Each basis set provides a different route to representing the same physical system, and a similarity transformation S transforms a matrix from one orthonormal basis to another. A transformation from the state $$|\Psi\rangle$$ to the state $$|\Theta\rangle$$ can be expressed as

$|\Theta\rangle=S|\Psi\rangle$

where the elements of the matrix are $$S_{i j}=\left\langle\theta_{i} \mid \psi_{j}\right\rangle$$. Then the reverse transformation is

$|\Psi\rangle=S^{\dagger}|\Theta\rangle$

Therefore $$S^{\dagger} S=1$$ and the transformation is said to be unitary. A unitary transformation refers to a similarity transformation in Hilbert space that preserves the scalar product, i.e., the length of the vector. The transformation of an operator from one basis to another is obtained from $$S^{\dagger} A S$$ and diagonalizing refers to finding the unitary transformation that puts the matrix A in diagonal form.
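These statements can be sketched numerically with an illustrative 2×2 Hermitian matrix: the eigenvector matrix $$S$$ returned by `numpy.linalg.eigh` is unitary, and $$S^{\dagger} A S$$ is diagonal.

```python
import numpy as np

# An illustrative complex Hermitian matrix
A = np.array([[2.0, 1.0j],
              [-1.0j, 2.0]])

# Columns of S are the eigenvectors; for Hermitian A, S is unitary
E, S = np.linalg.eigh(A)

# Unitarity: S^dagger S = 1
unitarity = S.conj().T @ S

# The similarity transformation S^dagger A S puts A in diagonal form
A_diag = S.conj().T @ A @ S
```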

## Properties of operators

1. The inverse of $$\hat{A}$$ (written $$\hat{A}^{-1}$$) is defined by

$\hat{A}^{-1} \hat{A}=\hat{A} \hat{A}^{-1}=1$

2. The transpose of $$\hat{A}$$ (written $$A^{T}$$) is

$\left(A^{T}\right)_{n q}=A_{q n}$

If $$A^{T}=-A$$ then the matrix is antisymmetric.

3. The trace of $$\hat{A}$$ is defined as

$\operatorname{Tr}(\hat{A})=\sum_{q} A_{q q}$

The trace of a matrix is invariant to a similarity operation.

4. The Hermitian adjoint of $$\hat{A}$$ (written $$\hat{A}^{\dagger}$$) is

$\begin{array}{l} \hat{A}^{\dagger}=\left(\hat{A}^{T}\right)^{*} \\ \left(\hat{A}^{\dagger}\right)_{n q}=\left(\hat{A}_{q n}\right)^{*} \end{array}$

5. $$\hat{A}$$ is Hermitian if $$\hat{A}^{\dagger}=\hat{A}$$

$\left(\hat{A}^{T}\right)^{*}=\hat{A}$

If $$\hat{A}$$ is Hermitian, then $$\hat{A}^{n}$$ is Hermitian and $$e^{\hat{A}}$$ is Hermitian. For a Hermitian operator, $$\langle\psi \mid \hat{A} \varphi\rangle=\langle\hat{A} \psi \mid \varphi\rangle$$. Expectation values of Hermitian operators are real, so all physical observables are associated with Hermitian operators.

6. $$\hat{A}$$ is a unitary operator if its adjoint is also its inverse:

$\begin{array}{l} \hat{A}^{\dagger}=\hat{A}^{-1} \\ \left(\hat{A}^{T}\right)^{*}=\hat{A}^{-1} \\ \hat{A} \hat{A}^{\dagger}=1 \quad \Rightarrow \quad\left(\hat{A} \hat{A}^{\dagger}\right)_{n q}=\delta_{n q} \end{array}$

7. If $$\hat{A}^{\dagger}=-\hat{A}$$, then $$\hat{A}$$ is said to be anti-Hermitian. Anti-Hermitian operators have purely imaginary expectation values. Any operator can be decomposed into its Hermitian and anti-Hermitian parts as

$\begin{array}{l} \hat{A}=\hat{A}_{H}+\hat{A}_{A H} \\ \hat{A}_{H}=\frac{1}{2}\left(\hat{A}+\hat{A}^{\dagger}\right) \\ \hat{A}_{A H}=\frac{1}{2}\left(\hat{A}-\hat{A}^{\dagger}\right) \end{array}$
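This decomposition can be sketched for an arbitrary complex matrix; the random matrix below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
# An arbitrary complex matrix (illustrative)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Hermitian part: A_H = (A + A^dagger)/2
A_H = 0.5 * (A + A.conj().T)

# Anti-Hermitian part: A_AH = (A - A^dagger)/2
A_AH = 0.5 * (A - A.conj().T)
```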

## Properties of commutators

From the definition of a commutator:

$[\hat{A}, \hat{B}]=\hat{A} \hat{B}-\hat{B} \hat{A}$

we find that it is antisymmetric under exchange:

$[\hat{A}, \hat{B}]=-[\hat{B}, \hat{A}]$

and distributive:

$[\hat{A}, \hat{B}+\hat{C}]=[\hat{A}, \hat{B}]+[\hat{A}, \hat{C}]$

These properties lead to a number of useful identities (the first two hold when $$\hat{B}$$ or $$\hat{A}$$, respectively, commutes with $$[\hat{A}, \hat{B}]$$):

$\left[\hat{A}, \hat{B}^{n}\right]=n \hat{B}^{n-1}[\hat{A}, \hat{B}]$

$\left[\hat{A}^{n}, \hat{B}\right]=n \hat{A}^{n-1}[\hat{A}, \hat{B}]$

$[\hat{A}, \hat{B} \hat{C}]=[\hat{A}, \hat{B}] \hat{C}+\hat{B}[\hat{A}, \hat{C}]$

$[\hat{A},[\hat{B}, \hat{C}]]=[[\hat{A}, \hat{B}], \hat{C}]+[\hat{B},[\hat{A}, \hat{C}]]$

$[\hat{A},[\hat{B}, \hat{C}]]+[\hat{B},[\hat{C}, \hat{A}]]+[\hat{C},[\hat{A}, \hat{B}]]=0$
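Since these identities are statements about matrix products, they can be spot-checked with random matrices. The sketch below verifies the product rule and the Jacobi identity for an arbitrary (illustrative) choice:

```python
import numpy as np

def comm(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

rng = np.random.default_rng(2)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

# Product rule: [A, BC] = [A, B]C + B[A, C]
lhs_prod = comm(A, B @ C)
rhs_prod = comm(A, B) @ C + B @ comm(A, C)

# Jacobi identity: [A,[B,C]] + [B,[C,A]] + [C,[A,B]] = 0
jacobi = comm(A, comm(B, C)) + comm(B, comm(C, A)) + comm(C, comm(A, B))
```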

The Hermitian conjugate of a commutator is

$[\hat{A}, \hat{B}]^{\dagger}=\left[\hat{B}^{\dagger}, \hat{A}^{\dagger}\right]$

Also, the commutator of two Hermitian operators is anti-Hermitian. The anti-commutator is defined as

$[\hat{A}, \hat{B}]_{+}=\hat{A} \hat{B}+\hat{B} \hat{A}$

and is symmetric under exchange. For two Hermitian operators, their product can be written in terms of the commutator and anti-commutator as

$\hat{A} \hat{B}=\frac{1}{2}[\hat{A}, \hat{B}]+\frac{1}{2}[\hat{A}, \hat{B}]_{+}$

For Hermitian operators, the anti-commutator term is the Hermitian ("real") part of the product, whereas the commutator term is the anti-Hermitian ("imaginary") part.
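This decomposition of the product can be sketched numerically for two illustrative Hermitian matrices: the product equals half the commutator plus half the anti-commutator, with the anti-commutator part Hermitian and the commutator part anti-Hermitian.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_hermitian(n):
    """Build an illustrative random Hermitian matrix as (X + X^dagger)/2."""
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return 0.5 * (X + X.conj().T)

A = random_hermitian(3)
B = random_hermitian(3)

commutator = A @ B - B @ A        # [A, B]
anticommutator = A @ B + B @ A    # [A, B]_+

# AB = (1/2)[A, B] + (1/2)[A, B]_+
product = 0.5 * commutator + 0.5 * anticommutator
```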

1.2: Matrix Mechanics is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Andrei Tokmakoff via source content that was edited to conform to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.