These are classifications based on how a matrix behaves when multiplied by itself.
A square matrix A is idempotent if A² = A.
Multiplying it by itself gives the same matrix back. The identity matrix (I) and the zero matrix (0) are simple examples.
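A quick NumPy check, using a projection onto the x-axis as a non-trivial idempotent example (the specific matrix is just an illustration):

```python
import numpy as np

# Projection onto the x-axis in R^2: applying it twice changes nothing.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

print(np.array_equal(A @ A, A))  # True: A is idempotent
```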
A square matrix A is nilpotent if Aᵏ = 0 (the zero matrix) for some positive integer k.
The smallest such `k` is called the index of nilpotency.
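As a sketch, any strictly upper-triangular matrix is nilpotent; the 3 × 3 example below has index 3:

```python
import numpy as np

# Strictly upper-triangular: zeros on and below the main diagonal.
N = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

print(np.array_equal(N @ N @ N, np.zeros((3, 3))))  # True: N^3 = 0
print(np.array_equal(N @ N, np.zeros((3, 3))))      # False: N^2 != 0, so the index is 3
```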
A square matrix A is involutory if A² = I (the identity matrix).
An involutory matrix is its own inverse (A = A⁻¹). The identity matrix (I) is a simple example. A reflection matrix is also involutory.
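The reflection example can be verified directly; here is a reflection across the x-axis:

```python
import numpy as np

# Reflecting across the x-axis twice returns every point to where it started.
R = np.array([[1, 0],
              [0, -1]])

print(np.array_equal(R @ R, np.eye(2, dtype=int)))  # True: R^2 = I, so R = R^-1
```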
The transpose of a matrix A, denoted Aᵀ, is the matrix obtained by interchanging its rows and columns.
If A is `m × n`, then Aᵀ is `n × m`. The element (Aᵀ)ᵢⱼ = Aⱼᵢ.
The conjugate of a matrix A, denoted Ā, is the matrix obtained by replacing each element with its complex conjugate. (e.g., `a + bi` becomes `a - bi`).
If a matrix contains only real numbers, then Ā = A.
The transpose-conjugate (or Hermitian conjugate) of A is the transpose of the conjugate: A* = (Ā)ᵀ.
For complex matrices, the transpose-conjugate plays the same role that the plain transpose plays for real matrices.
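All three operations are one-liners in NumPy (the particular matrix is arbitrary):

```python
import numpy as np

A = np.array([[1 + 2j, 3],
              [4j, 5 - 1j]])

print(A.T)         # transpose: rows and columns interchanged
print(A.conj())    # conjugate: each a + bi becomes a - bi
print(A.conj().T)  # transpose-conjugate A* (the two steps commute)
```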
These matrices describe different types of "symmetry" in square matrices.
A square matrix A is symmetric if it is equal to its transpose: Aᵀ = A.
This means `aᵢⱼ = aⱼᵢ` for all i, j. The matrix is a "mirror image" across its main diagonal.
A square matrix A is skew-symmetric if it is equal to the negative of its transpose: Aᵀ = -A.
This means `aᵢⱼ = -aⱼᵢ`. This implies that all main diagonal elements must be zero (since `aᵢᵢ = -aᵢᵢ` means `2aᵢᵢ = 0`).
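Both symmetry conditions are easy to test numerically; the matrices below are illustrative:

```python
import numpy as np

S = np.array([[1, 2, 3],
              [2, 4, 5],
              [3, 5, 6]])    # mirror image across the main diagonal
K = np.array([[0, 2, -1],
              [-2, 0, 3],
              [1, -3, 0]])   # a_ij = -a_ji, forcing a zero diagonal

print(np.array_equal(S.T, S))    # True: symmetric
print(np.array_equal(K.T, -K))   # True: skew-symmetric
```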
A square matrix A is Hermitian if it is equal to its transpose-conjugate: A* = A.
This means `aᵢⱼ = āⱼᵢ`. This implies that all main diagonal elements must be real numbers (since `aᵢᵢ = āᵢᵢ`).
A square matrix A is skew-Hermitian if it is equal to the negative of its transpose-conjugate: A* = -A.
This means `aᵢⱼ = -āⱼᵢ`. This implies that all main diagonal elements must be purely imaginary or zero (since `aᵢᵢ = -āᵢᵢ`).
First, the cofactor Cᵢⱼ of an element aᵢⱼ is the determinant of the submatrix obtained by deleting row `i` and column `j`, multiplied by `(-1)ⁱ⁺ʲ`.
The adjoint (also called the adjugate) of a square matrix A, denoted adj(A), is the transpose of its cofactor matrix.
Steps to find adj(A):
1. Compute the cofactor Cᵢⱼ of every element aᵢⱼ.
2. Arrange these into the cofactor matrix C = [Cᵢⱼ].
3. Transpose it: adj(A) = Cᵀ.
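The cofactor construction can be sketched directly in code. The helper `adjoint` below is a hypothetical name, and the cofactor route is only practical for small matrices:

```python
import numpy as np

def adjoint(A):
    """adj(A): cofactor matrix of A, transposed (fine for small matrices)."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # Delete row i and column j, take the determinant, apply the sign.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # the adjoint is the transpose of the cofactor matrix

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(adjoint(A))  # approximately [[4, -2], [-3, 1]]
```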
A square matrix A is invertible (or non-singular) if there exists a matrix A⁻¹ such that AA⁻¹ = A⁻¹A = I.
A matrix A is invertible if and only if its determinant is non-zero: det(A) ≠ 0.
The inverse of an invertible matrix A is given by the formula:
A⁻¹ = (1 / det(A)) * adj(A)
This formula ties together the inverse, determinant, and adjoint in a single identity.
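A small numerical check of the formula, using a 2 × 2 matrix whose adjoint is worked out by hand (for a 2 × 2 matrix, adj([[a, b], [c, d]]) = [[d, -b], [-c, a]]):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])       # det(A) = 2*3 - 1*5 = 1, so A is invertible

det_A = np.linalg.det(A)
adj_A = np.array([[3.0, -1.0],
                  [-5.0, 2.0]])  # adjoint computed by hand for this 2x2

A_inv = adj_A / det_A            # A^-1 = (1 / det(A)) * adj(A)
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```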
This theorem provides a direct relationship between the determinant of a matrix and the determinant of its adjoint. It is derived from the property A · adj(A) = det(A) · I.
Jacobi's Theorem states:
If A is an invertible square matrix of order n, then the determinant of its adjoint is:
det(adj(A)) = (det(A))ⁿ⁻¹
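The theorem can be spot-checked numerically. Rather than re-deriving cofactors, the sketch below obtains the adjoint from the inverse formula adj(A) = det(A) · A⁻¹ (the chosen matrix is arbitrary):

```python
import numpy as np

n = 3
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

det_A = np.linalg.det(A)
adj_A = det_A * np.linalg.inv(A)  # rearranged from A^-1 = adj(A) / det(A)

# Jacobi's Theorem: det(adj(A)) = det(A)^(n-1)
print(np.isclose(np.linalg.det(adj_A), det_A ** (n - 1)))  # True
```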