# 1.3: Matrix


A **matrix** is a rectangular array of quantities or expressions in rows (m) and columns (n) that is treated as a single entity and manipulated according to particular rules.^{[1]} The dimension of a matrix is denoted by m × n. In inorganic chemistry, molecular symmetry can be modeled mathematically using group theory. The internal coordinate system of a molecule may be used to generate a set of matrices, known as a representation, that corresponds to particular symmetry operations.^{[2]} Matrix modeling thus allows symmetry operations performed on the molecule to be represented in an identical fashion mathematically.

## Matrix Operations

### Addition

The sum of two matrices, **A** and **B**, is computed by adding each element of one matrix to the corresponding element of the other. Addition and subtraction may only be performed on matrices of identical dimension.

\((A+B)_{ij}=A_{ij}+B_{ij}\) where *i* refers to a particular row and *j* to a particular column.

Example:

\[\left(\begin{array}{lll}{a_{11}} & {a_{12}} & {a_{13}} \\{a_{21}} & {a_{22}} & {a_{23}}\end{array}\right)+\left(\begin{array}{lll}{b_{11}} & {b_{12}} & {b_{13}} \\{b_{21}} & {b_{22}} & {b_{23}}\end{array}\right)=\left(\begin{array}{lll}{a_{11}+b_{11}} & {a_{12}+b_{12}} & {a_{13}+b_{13}} \\{a_{21}+b_{21}} & {a_{22}+b_{22}} & {a_{23}+b_{23}}\end{array}\right) \nonumber\]
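The element-wise rule above can be sketched in a few lines of Python (the function name `mat_add` is illustrative, not from the source):

```python
def mat_add(A, B):
    """Element-wise sum of two matrices: (A + B)_ij = A_ij + B_ij."""
    # Addition is only defined for matrices of identical dimension.
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimensions must match"
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]
```

Subtraction follows the same pattern with `-` in place of `+`.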

### Scalar Multiplication

Multiplication of a matrix by a scalar, *c*, multiplies every element within the matrix by the scalar.

\[(cA)_{ij}=c \cdot A_{ij} \nonumber\]

Example:

\[c \cdot\left(\begin{array}{ll}{a_{11}} & {a_{12}} \\{a_{21}} & {a_{22}}\end{array}\right)=\left(\begin{array}{ll}{c \cdot a_{11}} & {c \cdot a_{12}} \\{c \cdot a_{21}} & {c \cdot a_{22}}\end{array}\right) \nonumber\]
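A minimal Python sketch of scalar multiplication (`scalar_mul` is an illustrative name):

```python
def scalar_mul(c, A):
    """Multiply every element of A by the scalar c: (cA)_ij = c * A_ij."""
    return [[c * a for a in row] for row in A]
```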

### Matrix Multiplication

Matrix multiplication entails computing the dot product of each row of one matrix, **A**, with each column of another matrix, **B**. The product is only defined if the number of columns of **A** equals the number of rows of **B**; if **A** is m × c and **B** is c × n, their product is the m × n matrix **C**. Matrix multiplication has some notable mathematical properties. First, it is associative; in other words, (**A** × **B**) × **C** = **A** × (**B** × **C**). However, matrix multiplication is not commutative; in general, **A** × **B** ≠ **B** × **A**.

\[\mathrm{C}_{\mathrm{m}\times\mathrm{n}}=\mathrm{A}_{\mathrm{m}\times\mathrm{c}}\cdot \mathrm{B}_{\mathrm{c}\times \mathrm{n}}, \quad C_{i, j}=\sum_{k=1}^{c} A_{i, k} B_{k, j}\nonumber\]

Example:

\[\left(\begin{array}{lll}{a_{11}}&{a_{12}}&{a_{13}}\\{a_{21}}&{a_{22}}&{a_{23}}\end{array}\right)\times\left(\begin{array}{l}{b_{11}}\\{b_{21}}\\{b_{31}}\end{array}\right)=\left(\begin{array}{l}{a_{11}b_{11}+a_{12}b_{21}+a_{13}b_{31}}\\{a_{21}b_{11}+a_{22}b_{21}+a_{23} b_{31}}\end{array}\right)\nonumber\]
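The row-by-column rule can be sketched directly from the summation formula (the name `mat_mul` is illustrative):

```python
def mat_mul(A, B):
    """Matrix product: C_ij = sum over k of A_ik * B_kj."""
    # Defined only when columns(A) == rows(B).
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]
```

Applied to the 2 × 3 and 3 × 1 example above, the result is the 2 × 1 column of dot products.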

### Row Operations

There are three kinds of elementary row operations that are used to transform a matrix:

Type | Definition | Operation |
---|---|---|
Row Switching | The swapping of one row with another row | \(\mathrm{R}_{i} \leftrightarrow \mathrm{R}_{j}\) |
Row Addition | The addition of a multiple of one row to another row | \(\mathrm{R}_{i}+k \mathrm{R}_{j} \rightarrow \mathrm{R}_{i}\) |
Row Multiplication | Multiplication of a row by a scalar, c, with c ≠ 0 | \(c \mathrm{R}_{i}\rightarrow \mathrm{R}_{i}\) |
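The three elementary row operations can be sketched as in-place list manipulations (function names are illustrative):

```python
def row_switch(M, i, j):
    # R_i <-> R_j
    M[i], M[j] = M[j], M[i]

def row_add(M, i, j, k):
    # R_i + k*R_j -> R_i
    M[i] = [a + k * b for a, b in zip(M[i], M[j])]

def row_mul(M, i, c):
    # c*R_i -> R_i, with c != 0 so the operation is reversible
    assert c != 0
    M[i] = [c * a for a in M[i]]
```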

## Square Matrices

Square matrices are matrices where the number of rows and number of columns are equal, resulting in an n × n matrix.

### Identity Matrix

The identity matrix, **I**_{n}, is a diagonal matrix which has all elements along the main diagonal equal to 1 and all other elements equal to 0. Multiplying a matrix by the identity matrix leaves it unchanged. Moreover, for a square matrix **A**, multiplication with the identity matrix is commutative; in other words, **A** × **I** = **I** × **A**.

Example:

\[A \cdot I_{3}=\left(\begin{array}{lll}{a} & {b} & {c} \\{d} & {e} & {f}\end{array}\right) \cdot\left(\begin{array}{lll}{1} & {0} & {0} \\{0} & {1} & {0} \\{0} & {0} & {1}\end{array}\right)=\left(\begin{array}{lll}{a} & {b} & {c} \\{d} & {e} & {f}\end{array}\right)\nonumber\]
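A short Python sketch of constructing **I**_{n} and checking that right-multiplication by it leaves a matrix unchanged (names are illustrative):

```python
def identity(n):
    """I_n: 1s on the main diagonal, 0s elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]
# Multiplying the 2x3 matrix A on the right by I_3 returns A itself.
assert mat_mul(A, identity(3)) == A
```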

### Trace

Only applicable to square matrices, the trace, or *character* (χ), of a matrix is the sum of its entries along the main diagonal.
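As a one-line sketch (the name `trace` is illustrative):

```python
def trace(A):
    """Sum of the main-diagonal entries; defined only for square matrices."""
    assert len(A) == len(A[0]), "trace requires a square matrix"
    return sum(A[i][i] for i in range(len(A)))
```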

### Determinant

The determinant of a matrix, denoted det(**A**), is a real number computed from a square matrix. A non-zero determinant implies that the matrix is invertible, which in turn implies that the system of linear equations represented by the matrix has exactly one solution.

For a 2 × 2 matrix, the determinant is computed as follows:

\[\operatorname{det}(\mathbf{A})=\left|\begin{array}{ll}{a} & {b} \\{c} & {d}\end{array}\right|=a d-b c\nonumber\]

For a 3 × 3 matrix, the determinant is computed as follows:

\[\operatorname{det}(\mathbf{A})=\left|\begin{array}{ccc}{a} & {b} & {c} \\{d} & {e} & {f} \\{g} & {h} & {i}\end{array}\right|=a\left|\begin{array}{cc}{e} & {f} \\{h} & {i}\end{array}\right|-b\left|\begin{array}{cc}{d} & {f} \\{g} & {i}\end{array}\right|+c\left|\begin{array}{cc}{d} & {e} \\{g} & {h}

\end{array}\right|\nonumber\]

Higher-order determinants may be calculated by cofactor (Laplace) expansion, extending the pattern shown for the 3 × 3 case.
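A recursive cofactor-expansion sketch along the first row, covering the 2 × 2 and 3 × 3 formulas above and any larger square matrix (the name `det` is illustrative):

```python
def det(A):
    """Determinant by Laplace (cofactor) expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    if n == 2:
        # ad - bc base case
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, with alternating signs (-1)^j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```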

## References

- "Matrix [Def. 3]. In Oxford Dictionaries (American English) (US)"].
- ↑ Pfenning, Brian W. (2015).
*Principles of Inorganic Chemistry*. Hoboken: John Wiley & Sons, Inc.. pp. 195. ISBN 9781118859100.