Geometric Interpretation: The "tip-to-tail" rule or parallelogram rule. Place the tail of vector \(\mathbf{v}\) at the tip of vector \(\mathbf{u}\). The sum \(\mathbf{u} + \mathbf{v}\) is the vector from the origin (tail of \(\mathbf{u}\)) to the tip of \(\mathbf{v}\). It forms the diagonal of the parallelogram spanned by \(\mathbf{u}\) and \(\mathbf{v}\).
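The component-wise addition rule can be checked numerically; here is a minimal sketch using NumPy (the vector values are illustrative):

```python
import numpy as np

# Vector addition is component-wise: (u + v)_i = u_i + v_i.
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

s = u + v  # tip-to-tail: [1 + 3, 2 + (-1)] = [4, 1]
```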
Definition: Multiplying a vector by a scalar \(c\) scales the vector's magnitude and, when \(c < 0\), reverses its direction.
Algebraic Rule: If \(c\) is a scalar and \(\mathbf{v} = [v_1, v_2, ..., v_n]^T\), then:
$$ c\mathbf{v} = \begin{bmatrix} c v_1 \\ c v_2 \\ \vdots \\ c v_n \end{bmatrix} $$
Geometric Interpretation:
Multiplying by \(c > 1\) stretches the vector.
Multiplying by \(0 < c < 1\) shrinks the vector.
Multiplying by \(c = -1\) reverses the vector's direction.
Multiplying by \(c < 0\) stretches/shrinks and reverses direction.
The resulting vector \(c\mathbf{v}\) lies on the same line through the origin as \(\mathbf{v}\).
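The four cases above can be sketched in NumPy (scalar and vector values are illustrative):

```python
import numpy as np

v = np.array([2.0, 1.0])

stretched = 3.0 * v    # c > 1: longer, same direction
shrunk    = 0.5 * v    # 0 < c < 1: shorter, same direction
reversed_ = -1.0 * v   # c = -1: same length, opposite direction
halved_flipped = -0.5 * v  # c < 0: shrinks and reverses
```

Every result is a multiple of `v`, so all of them lie on the same line through the origin.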
Definition: A vector formed by adding scalar multiples of other vectors. Given vectors \(\mathbf{v}_1, \mathbf{v}_2, ..., \mathbf{v}_k\) and scalars \(c_1, c_2, ..., c_k\), a linear combination is:
$$ \mathbf{w} = c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \dots + c_k\mathbf{v}_k = \sum_{i=1}^{k} c_i \mathbf{v}_i $$
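A linear combination is just scalar multiplications followed by additions; a small sketch with NumPy (vectors and scalars chosen for illustration):

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([1.0, 1.0])

c = [2.0, -1.0, 0.5]  # scalars c_1, c_2, c_3

# w = c_1*v1 + c_2*v2 + c_3*v3
w = sum(ci * vi for ci, vi in zip(c, [v1, v2, v3]))
# 2*[1,0] - 1*[0,1] + 0.5*[1,1] = [2.5, -0.5]
```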
Definition: An operation between two vectors of the same dimension (\(n\)) that results in a single scalar value.
Algebraic Rule: If \(\mathbf{u} = [u_1, ..., u_n]^T\) and \(\mathbf{v} = [v_1, ..., v_n]^T\), their dot product (denoted \(\mathbf{u} \cdot \mathbf{v}\) or \(\mathbf{u}^T\mathbf{v}\)) is:
$$ \mathbf{u} \cdot \mathbf{v} = \mathbf{u}^T\mathbf{v} = u_1 v_1 + u_2 v_2 + \dots + u_n v_n = \sum_{i=1}^n u_i v_i $$
(Note: \(\mathbf{u}^T\mathbf{v}\) emphasizes viewing it as matrix multiplication of a \(1 \times n\) row vector by an \(n \times 1\) column vector).
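The algebraic rule and the matrix-multiplication view give the same scalar; a quick check in NumPy (values are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Three equivalent ways to compute sum_i u_i v_i:
d1 = np.dot(u, v)   # dot-product function
d2 = u @ v          # matrix-multiplication view: u^T v
d3 = np.sum(u * v)  # explicit component-wise products, then sum
# all equal 1*4 + 2*5 + 3*6 = 32
```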
Geometric Interpretation: Relates to the angle \(\theta\) between the two vectors and their magnitudes (L₂ norms) \(||\mathbf{u}||_2, ||\mathbf{v}||_2\):
$$ \mathbf{u} \cdot \mathbf{v} = ||\mathbf{u}||_2 \, ||\mathbf{v}||_2 \cos(\theta) $$
Calculating Angle Between Vectors: \(\cos(\theta) = \frac{\mathbf{u} \cdot \mathbf{v}}{||\mathbf{u}||_2 \, ||\mathbf{v}||_2}\).
Checking Orthogonality: Two non-zero vectors \(\mathbf{u}\) and \(\mathbf{v}\) are orthogonal (perpendicular) if and only if their dot product is zero (\(\mathbf{u} \cdot \mathbf{v} = 0\), since \(\cos(90^\circ) = 0\)).
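The angle formula and the orthogonality test can be sketched together in NumPy (`angle_between` is a hypothetical helper, not a library function; the clip guards against floating-point values slightly outside \([-1, 1]\)):

```python
import numpy as np

def angle_between(u, v):
    """Angle in radians via cos(theta) = (u . v) / (||u|| ||v||)."""
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

u = np.array([1.0, 0.0])
v = np.array([0.0, 2.0])

theta = angle_between(u, v)                 # pi/2 for these vectors
orthogonal = np.isclose(np.dot(u, v), 0.0)  # dot product zero => orthogonal
```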
Projection: Calculating the projection of one vector onto another involves the dot product. \(\text{proj}_{\mathbf{v}} \mathbf{u} = \frac{\mathbf{u} \cdot \mathbf{v}}{||\mathbf{v}||_2^2} \mathbf{v}\).
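The projection formula above translates directly into code; a sketch in NumPy (`project` is a hypothetical helper name):

```python
import numpy as np

def project(u, v):
    """Projection of u onto v: ((u . v) / ||v||^2) * v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])

p = project(u, v)  # [3, 0]: the component of u along v
# the residual u - p is orthogonal to v
```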
Similarity Measure: In ML/NLP, the cosine similarity (\(\cos(\theta)\)) measures the similarity in direction between vectors (e.g., word embeddings).
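Cosine similarity ignores vector length and compares only direction; a minimal sketch with made-up "embedding" vectors:

```python
import numpy as np

def cosine_similarity(u, v):
    """cos(theta) in [-1, 1]: 1 = same direction, 0 = orthogonal, -1 = opposite."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy vectors standing in for word embeddings (illustrative only):
a = np.array([1.0, 2.0, 0.0])
b = np.array([2.0, 4.0, 0.0])    # same direction as a, different length
c = np.array([-1.0, -2.0, 0.0])  # opposite direction

sim_ab = cosine_similarity(a, b)  # 1.0: length difference is ignored
sim_ac = cosine_similarity(a, c)  # -1.0: opposite directions
```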
Weighted Sums: If \(\mathbf{w}\) is a vector of weights and \(\mathbf{x}\) is a feature vector, \(\mathbf{w} \cdot \mathbf{x} = \mathbf{w}^T \mathbf{x}\) calculates the weighted sum, a core operation in linear models and neural networks.
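The weighted-sum view is what a single linear-model neuron computes; a sketch with made-up weights and features (the bias term `b` is an assumption, added because linear models typically include one):

```python
import numpy as np

# Hypothetical linear model: prediction = w^T x + b
w = np.array([0.5, -0.2, 0.1])  # weight vector (illustrative values)
x = np.array([2.0, 1.0, 3.0])   # feature vector
b = 0.05                        # bias term

prediction = w @ x + b  # 0.5*2 - 0.2*1 + 0.1*3 + 0.05
```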
Vector Addition (\(\mathbf{u}+\mathbf{v}\)): Component-wise addition (Tip-to-tail geometry). Requires same dimension.
Scalar Multiplication (\(c\mathbf{v}\)): Multiply each component by a scalar (scales vector length/direction).
Dot Product (\(\mathbf{u} \cdot \mathbf{v}\) or \(\mathbf{u}^T\mathbf{v}\)): Sum of component-wise products (\(\sum u_i v_i\)). Results in a scalar. Geometric meaning: \(||\mathbf{u}||_2 \, ||\mathbf{v}||_2 \cos(\theta)\). Used for length, angle, orthogonality checks, projections, weighted sums. Requires same dimension.