
Orthogonality


Orthogonality in Linear Algebra

Orthogonality is a fundamental concept in linear algebra that deals with the relationships and angles between vectors in vector spaces. It extends the familiar notion of perpendicularity from Euclidean geometry into more abstract settings. In linear algebra, two vectors are considered orthogonal if their dot product (also known as the inner product) is zero. This relationship can be generalized to various dimensions and different inner product spaces.

Dot Product and Orthogonality

For vectors \(\mathbf{u}\) and \(\mathbf{v}\) in \(\mathbb{R}^n\), the dot product is defined as:

\[
\mathbf{u} \cdot \mathbf{v} = u_1v_1 + u_2v_2 + \cdots + u_nv_n = \sum_{i=1}^n u_i v_i
\]

Vectors \(\mathbf{u}\) and \(\mathbf{v}\) are orthogonal if:

\[
\mathbf{u} \cdot \mathbf{v} = 0
\]

This condition implies that the angle \(\theta\) between \(\mathbf{u}\) and \(\mathbf{v}\) is \(90^\circ\) or \(\frac{\pi}{2}\) radians, since the dot product can also be represented as:

\[
\mathbf{u} \cdot \mathbf{v} = \|\mathbf{u}\|\|\mathbf{v}\|\cos(\theta)
\]

For nonzero vectors \(\mathbf{u}\) and \(\mathbf{v}\), the dot product is zero exactly when \(\cos(\theta) = 0\), i.e. when \(\theta = 90^\circ\), confirming orthogonality.
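The orthogonality test above translates directly into code. The following is a minimal sketch in pure Python; `dot` and `is_orthogonal` are illustrative helper names, not from any particular library, and a small tolerance stands in for exact zero when the entries are floating-point.

```python
def dot(u, v):
    """Dot product of two equal-length vectors: sum of u_i * v_i."""
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal(u, v, tol=1e-12):
    """Two vectors are orthogonal when their dot product is zero."""
    return abs(dot(u, v)) < tol

u = [1, 2, 0]
v = [-2, 1, 5]
print(dot(u, v))            # 1*(-2) + 2*1 + 0*5 = 0
print(is_orthogonal(u, v))  # True
```

Note that `is_orthogonal([1, 0], [1, 1])` returns `False`, since the dot product is 1.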

Orthogonal Sets and Orthogonal Bases

An orthogonal set in a vector space is a collection of vectors where each pair of distinct vectors is orthogonal. Formally, a set of vectors \(\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_m\}\) is orthogonal if:

\[
\mathbf{u}_i \cdot \mathbf{u}_j = 0 \quad \text{for} \quad i \neq j
\]

Orthogonal sets are particularly significant when they form a basis, known as an orthogonal basis, for a vector space. If each vector in the orthogonal basis also has unit length (norm equal to one), it forms an orthonormal basis.
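The pairwise condition, and the step from an orthogonal basis to an orthonormal one, can be sketched as follows. This is a pure-Python illustration under the definitions above; `is_orthogonal_set` and `normalize` are hypothetical helper names.

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal_set(vectors, tol=1e-12):
    """Every pair of distinct vectors must have dot product zero."""
    return all(
        abs(dot(vectors[i], vectors[j])) < tol
        for i in range(len(vectors))
        for j in range(i + 1, len(vectors))
    )

def normalize(v):
    """Scale a vector to unit length; applying this to every vector
    of an orthogonal basis yields an orthonormal basis."""
    norm = math.sqrt(dot(v, v))
    return [vi / norm for vi in v]

basis = [[1, 1, 0], [1, -1, 0], [0, 0, 2]]
print(is_orthogonal_set(basis))              # True
orthonormal = [normalize(v) for v in basis]
```

The set above is orthogonal but not orthonormal: the first two vectors have length \(\sqrt{2}\) and the third has length 2, so each must be rescaled.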

Projections and Orthogonal Complements

Orthogonality is crucial in finding projections of vectors onto subspaces. Given a vector \(\mathbf{v}\) and a subspace \(W\), the projection of \(\mathbf{v}\) onto \(W\), denoted as \(\text{proj}_W \mathbf{v}\), is the vector in \(W\) that is closest to \(\mathbf{v}\). If \(\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_k\}\) is an orthogonal basis for \(W\), then:

\[
\text{proj}_W \mathbf{v} = \sum_{i=1}^k \frac{\mathbf{v} \cdot \mathbf{u}_i}{\mathbf{u}_i \cdot \mathbf{u}_i} \mathbf{u}_i
\]
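The projection formula is a short loop in code: for each basis vector, compute the coefficient \((\mathbf{v} \cdot \mathbf{u}_i)/(\mathbf{u}_i \cdot \mathbf{u}_i)\) and accumulate the scaled basis vector. A minimal sketch, assuming the basis passed in is orthogonal as the formula requires:

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def project(v, orthogonal_basis):
    """Project v onto the subspace spanned by an orthogonal basis,
    accumulating ((v . u_i) / (u_i . u_i)) * u_i over the basis."""
    proj = [0.0] * len(v)
    for u in orthogonal_basis:
        coeff = dot(v, u) / dot(u, u)
        proj = [p + coeff * ui for p, ui in zip(proj, u)]
    return proj

# Project v = (3, 4, 5) onto the xy-plane, spanned by e1 and e2.
v = [3, 4, 5]
W_basis = [[1, 0, 0], [0, 1, 0]]
print(project(v, W_basis))   # [3.0, 4.0, 0.0]
```

The result \((3, 4, 0)\) is indeed the point of the xy-plane closest to \((3, 4, 5)\), as the definition of the projection promises.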

The concept of orthogonal complements is also pivotal. The orthogonal complement of a subspace \(W\) in a vector space \(V\) is the set of all vectors in \(V\) that are orthogonal to every vector in \(W\). This orthogonal complement is denoted as \(W^\perp\). Formally,

\[
W^\perp = \{\mathbf{v} \in V \mid \mathbf{v} \cdot \mathbf{w} = 0 \ \forall \ \mathbf{w} \in W\}
\]
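A concrete case makes \(W^\perp\) tangible: when \(W\) is a plane in \(\mathbb{R}^3\) spanned by two vectors, its orthogonal complement is the line spanned by their cross product, since the cross product is orthogonal to both spanning vectors. A small sketch in pure Python (the `cross` helper is illustrative, specific to \(\mathbb{R}^3\)):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    """Cross product in R^3; the result is orthogonal to both inputs."""
    return [
        u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0],
    ]

# W = span{w1, w2} is a plane in R^3; W-perp is the line
# spanned by the cross product w1 x w2.
w1 = [1, 0, 0]
w2 = [0, 1, 0]
n = cross(w1, w2)
print(n)                       # [0, 0, 1]
print(dot(n, w1), dot(n, w2))  # 0 0
```

Here the xy-plane's orthogonal complement is the z-axis, matching the definition: every multiple of \((0, 0, 1)\) has zero dot product with every vector in the plane.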

Understanding orthogonality facilitates various applications in mathematics, physics, computer science, and engineering, particularly in solving systems of linear equations, performing eigenvalue decompositions, and implementing algorithms in machine learning and data analysis.