Triangle Inequality/Vectors in Euclidean Space



Theorem

Let $\mathbf x, \mathbf y$ be vectors in the real Euclidean space $\R^n$.

Let $\norm {\, \cdot \,}$ denote vector length.

Then:

$\norm {\mathbf x + \mathbf y} \le \norm {\mathbf x} + \norm {\mathbf y}$

Equality holds if and only if $\mathbf x$ is a non-negative scalar multiple of $\mathbf y$:

$\norm {\mathbf x + \mathbf y} = \norm {\mathbf x} + \norm {\mathbf y} \iff \exists \lambda \in \R, \lambda \ge 0: \mathbf x = \lambda \mathbf y$
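As a concrete illustration in $\R^2$ (the numbers here are chosen purely for exposition):

```latex
% Strict inequality: x = (3, 0), y = (0, 4)
\norm {\mathbf x + \mathbf y} = \norm {\tuple {3, 4} } = 5 < 7 = 3 + 4 = \norm {\mathbf x} + \norm {\mathbf y}
% Equality: y = (3, 0) and x = 2 y = (6, 0), a non-negative scalar multiple
\norm {\mathbf x + \mathbf y} = \norm {\tuple {9, 0} } = 9 = 6 + 3 = \norm {\mathbf x} + \norm {\mathbf y}
```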


Proof

Let $\mathbf x, \mathbf y \in \R^n$.

We have:

\(\ds \norm {\mathbf x + \mathbf y}^2\) \(=\) \(\ds \paren {\mathbf x + \mathbf y} \cdot \paren {\mathbf x + \mathbf y}\) Dot Product of Vector with Itself
\(\ds \) \(=\) \(\ds \mathbf x \cdot \mathbf x + \mathbf x \cdot \mathbf y + \mathbf y \cdot \mathbf x + \mathbf y \cdot \mathbf y\) Dot Product Distributes over Addition
\(\ds \) \(=\) \(\ds \mathbf x \cdot \mathbf x + 2 \paren {\mathbf x \cdot \mathbf y} + \mathbf y \cdot \mathbf y\) Dot Product Operator is Commutative
\(\ds \) \(=\) \(\ds \norm {\mathbf x}^2 + 2 \paren {\mathbf x \cdot \mathbf y} + \norm {\mathbf y}^2\) Dot Product of Vector with Itself


From the Cauchy-Bunyakovsky-Schwarz Inequality:

\(\ds \size {\mathbf x \cdot \mathbf y}\) \(\le\) \(\ds \norm {\mathbf x} \norm {\mathbf y}\)
\(\ds \leadsto \ \ \) \(\ds \mathbf x \cdot \mathbf y\) \(\le\) \(\ds \norm {\mathbf x} \norm {\mathbf y}\) Negative of Absolute Value
\(\ds \leadsto \ \ \) \(\ds \norm {\mathbf x}^2 + 2 \paren {\mathbf x \cdot \mathbf y} + \norm {\mathbf y}^2\) \(\le\) \(\ds \norm {\mathbf x}^2 + 2 \paren {\norm {\mathbf x} \norm {\mathbf y} } + \norm {\mathbf y}^2\) multiplying both sides by $2$ and adding $\norm {\mathbf x}^2 + \norm {\mathbf y}^2$ to both sides
\(\ds \) \(=\) \(\ds \paren {\norm {\mathbf x} + \norm {\mathbf y} }^2\)
\(\ds \leadsto \ \ \) \(\ds \norm {\mathbf x + \mathbf y}^2\) \(\le\) \(\ds \paren {\norm {\mathbf x} + \norm {\mathbf y} }^2\)
\(\ds \leadsto \ \ \) \(\ds \norm {\mathbf x + \mathbf y}\) \(\le\) \(\ds \norm {\mathbf x} + \norm {\mathbf y}\) taking the square root of both sides, both being non-negative

$\blacksquare$
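The inequality can also be checked numerically. The following Python sketch (illustrative only, not part of the proof) tests random vectors in $\R^n$ and one equality case:

```python
import math
import random

def norm(v):
    # Euclidean length: square root of the sum of squared components
    return math.sqrt(sum(x * x for x in v))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 5)
    x = [random.uniform(-10, 10) for _ in range(n)]
    y = [random.uniform(-10, 10) for _ in range(n)]
    s = [a + b for a, b in zip(x, y)]
    # triangle inequality, with a small tolerance for floating-point error
    assert norm(s) <= norm(x) + norm(y) + 1e-12

# equality case: x = lambda * y with lambda >= 0
y = [1.0, 2.0, 2.0]
lam = 2.5
x = [lam * c for c in y]
s = [a + b for a, b in zip(x, y)]
assert math.isclose(norm(s), norm(x) + norm(y))
```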


To prove the equality condition, first suppose that $\mathbf x$ is a non-negative scalar multiple of $\mathbf y$:

$\exists \lambda \in \R, \lambda \ge 0: \mathbf x = \lambda \mathbf y$


Sufficient Condition

\(\ds \norm {\mathbf x + \mathbf y}\) \(=\) \(\ds \norm {\lambda \mathbf y + \mathbf y}\) by hypothesis
\(\ds \) \(=\) \(\ds \norm {\paren {\lambda + 1} \mathbf y}\)
\(\ds \) \(=\) \(\ds \paren {\lambda + 1} \norm {\mathbf y}\) as $\lambda + 1 > 0$
\(\ds \) \(=\) \(\ds \lambda \norm {\mathbf y} + 1 \norm {\mathbf y}\)
\(\ds \) \(=\) \(\ds \norm {\lambda \mathbf y} + \norm {1 \mathbf y}\) as $\lambda \ge 0$
\(\ds \) \(=\) \(\ds \norm {\mathbf x} + \norm {\mathbf y}\)

$\Box$


Necessary Condition


This theorem requires a proof.
You can help $\mathsf{Pr} \infty \mathsf{fWiki}$ by crafting such a proof.
To discuss this page in more detail, feel free to use the talk page.
When this work has been completed, you may remove this instance of {{ProofWanted}} from the code.
If you would welcome a second opinion as to whether your work is correct, add a call to {{Proofread}} the page.
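The following is an outline of how this direction may be argued, using the equality case of the Cauchy-Bunyakovsky-Schwarz Inequality and assuming $\mathbf y \ne \mathbf 0$ (the macros $\norm \cdot$, $\paren \cdot$ and $\size \cdot$ follow this page's conventions):

```latex
% Outline (assumes y != 0): equality forces x = lambda y with lambda >= 0
\begin{align*}
\norm {\mathbf x + \mathbf y} &= \norm {\mathbf x} + \norm {\mathbf y} \\
\implies \norm {\mathbf x}^2 + 2 \paren {\mathbf x \cdot \mathbf y} + \norm {\mathbf y}^2
  &= \norm {\mathbf x}^2 + 2 \norm {\mathbf x} \norm {\mathbf y} + \norm {\mathbf y}^2
  && \text{squaring both sides} \\
\implies \mathbf x \cdot \mathbf y &= \norm {\mathbf x} \norm {\mathbf y}
  && \text{cancelling like terms} \\
\implies \mathbf x &= \lambda \mathbf y \text{ for some } \lambda \in \R
  && \text{equality case of Cauchy-Bunyakovsky-Schwarz, as } \mathbf y \ne \mathbf 0 \\
\implies \lambda \norm {\mathbf y}^2 &= \size \lambda \, \norm {\mathbf y}^2
  && \text{substituting } \mathbf x = \lambda \mathbf y \text{ into } \mathbf x \cdot \mathbf y = \norm {\mathbf x} \norm {\mathbf y} \\
\implies \lambda &= \size \lambda \ge 0
\end{align*}
```

When $\mathbf y = \mathbf 0$, the equality $\norm {\mathbf x + \mathbf y} = \norm {\mathbf x} + \norm {\mathbf y}$ holds trivially for every $\mathbf x$, while $\mathbf x = \lambda \mathbf 0$ forces $\mathbf x = \mathbf 0$; in that degenerate case the condition should be read with the roles of $\mathbf x$ and $\mathbf y$ interchanged.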



Sources

  • 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.) ... (previous) ... (next): triangle inequality (for vectors)
  • 2021: Richard Earl and James Nicholson: The Concise Oxford Dictionary of Mathematics (6th ed.) ... (previous) ... (next): triangle inequality