Think of the determinant of a square matrix as a scalar value that tells you about the scaling factor of the linear transformation represented by that matrix.
Specifically, it tells you how much the area (in 2D) or volume (in 3D) changes when you apply the matrix transformation. A determinant of 0 means the transformation squashes space into a lower dimension (like flattening a 3D object onto a plane or a 2D shape onto a line).
The determinant is a scalar value that can only be computed for a square matrix \(\mathbf{A}\) (\(n \times n\)). It is denoted as \(\det(\mathbf{A})\) or \(|\mathbf{A}|\).
Its calculation encodes properties related to the matrix's invertibility and the geometric scaling effect of the corresponding linear transformation.
For a 2x2 Matrix: If \(\mathbf{A} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\), then:
$$ \det(\mathbf{A}) = |\mathbf{A}| = ad - bc $$
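As a quick sanity check, the \(ad - bc\) formula can be computed by hand and compared against a library routine (here NumPy; the matrix entries are arbitrary example values):

```python
import numpy as np

def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]] via the ad - bc formula."""
    return a * d - b * c

A = np.array([[3.0, 8.0],
              [4.0, 6.0]])
manual = det2(3.0, 8.0, 4.0, 6.0)  # 3*6 - 8*4 = -14
print(manual)                      # -14.0
print(np.linalg.det(A))            # same value, up to floating-point rounding
```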
For a 3x3 Matrix: Using cofactor expansion (e.g., along the first row):
$$ |\mathbf{A}| = a \begin{vmatrix} e & f \\ h & i \end{vmatrix} - b \begin{vmatrix} d & f \\ g & i \end{vmatrix} + c \begin{vmatrix} d & e \\ g & h \end{vmatrix} $$
$$ |\mathbf{A}| = a(ei - fh) - b(di - fg) + c(dh - eg) $$
(Where \(\begin{vmatrix} \dots \end{vmatrix}\) indicates the determinant of the 2x2 submatrix).
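The 3x3 cofactor expansion above translates directly into code. A minimal sketch, with an arbitrary example matrix checked against NumPy:

```python
import numpy as np

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row:
    a(ei - fh) - b(di - fg) + c(dh - eg)."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(det3(M))             # -3.0
print(np.linalg.det(M))    # same value, up to floating-point rounding
```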
For Larger Matrices: Cofactor expansion can be used recursively, but it becomes computationally very expensive (\(\mathcal{O}(n!)\)). Methods based on row reduction (Gaussian elimination) to reach triangular form are more practical (\(\mathcal{O}(n^3)\)):
Swapping two rows multiplies the determinant by -1.
Multiplying a row by a scalar \(k\) multiplies the determinant by \(k\).
Adding a multiple of one row to another does not change the determinant.
The determinant of a triangular matrix (upper or lower) is the product of its diagonal entries.
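The rules above lead directly to an \(\mathcal{O}(n^3)\) algorithm: reduce the matrix to upper triangular form using only row swaps (each flips the sign) and row additions (which leave the determinant unchanged), then multiply the diagonal entries. A minimal sketch with partial pivoting:

```python
import numpy as np

def det_by_elimination(A):
    """Determinant via Gaussian elimination to triangular form, O(n^3)."""
    U = np.array(A, dtype=float)   # work on a copy
    n = U.shape[0]
    sign = 1.0
    for k in range(n):
        # Partial pivoting: move the largest entry into the pivot position.
        p = k + np.argmax(np.abs(U[k:, k]))
        if U[p, k] == 0.0:
            return 0.0             # no nonzero pivot: the matrix is singular
        if p != k:
            U[[k, p]] = U[[p, k]]  # a row swap multiplies the determinant by -1
            sign = -sign
        # Adding multiples of the pivot row leaves the determinant unchanged.
        for r in range(k + 1, n):
            U[r, k:] -= (U[r, k] / U[k, k]) * U[k, k:]
    # Triangular matrix: determinant = product of the diagonal entries.
    return sign * np.prod(np.diag(U))

A = [[ 2.0,  1.0, 1.0],
     [ 4.0, -6.0, 0.0],
     [-2.0,  7.0, 2.0]]
print(det_by_elimination(A))   # -16.0, matching np.linalg.det
```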
\(|\det(\mathbf{A})|\) represents the factor by which area (in 2D), volume (in 3D), or hypervolume (in higher dimensions) is scaled when applying the linear transformation defined by \(\mathbf{A}\).
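This scaling interpretation can be checked numerically: map the unit square's corners through a 2x2 matrix and compare the area of the resulting parallelogram (computed with the shoelace formula) to \(|\det(\mathbf{A})|\). The particular matrix is an arbitrary example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # det = 2*3 - 1*0 = 6

# Corners of the unit square (area 1), in counter-clockwise order.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
image = square @ A.T         # apply the transformation to each corner

def shoelace_area(pts):
    """Polygon area via the shoelace formula."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

print(shoelace_area(image))          # 6.0: the unit area scaled by |det(A)|
print(abs(np.linalg.det(A)))         # 6.0
```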
\(\det(\mathbf{A}) \neq 0\) if and only if \(\mathbf{A}\) is invertible.
\(\det(\mathbf{I}) = 1\) (The Identity matrix doesn't scale volume or change orientation).
\(\det(\mathbf{A}^T) = \det(\mathbf{A})\) (Transpose has the same determinant).
\(\det(\mathbf{AB}) = \det(\mathbf{A}) \det(\mathbf{B})\) (Determinant of a product is the product of determinants). This reflects composing scaling factors of transformations. Requires \(\mathbf{A}, \mathbf{B}\) to be square matrices of the same size.
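These identities are easy to verify numerically for random matrices (up to floating-point tolerance):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
det = np.linalg.det

assert np.isclose(det(np.eye(4)), 1.0)           # det(I) = 1
assert np.isclose(det(A.T), det(A))              # det(A^T) = det(A)
assert np.isclose(det(A @ B), det(A) * det(B))   # det(AB) = det(A) det(B)
# Invertibility: det(A) != 0, so A^{-1} exists and det(A^{-1}) = 1/det(A).
assert np.isclose(det(np.linalg.inv(A)), 1.0 / det(A))
print("all determinant identities hold")
```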