AHL 1.15: EIGENVALUES & EIGENVECTORS
📌 Key terms & quick notes
| Term | Definition / Notes |
|---|---|
| Eigenvalue (λ) | Scalar λ for which a non-zero vector v exists with A v = λ v. Solve via characteristic polynomial det(A − λ I) = 0. |
| Eigenvector (v) | Non-zero vector direction preserved by A (only scaled). Eigenspace = nullspace of (A − λ I). |
| Characteristic polynomial | p(λ) = det(A − λI). For 2×2: p(λ) = λ² − (tr A)λ + det A. |
| Trace / Determinant | tr A = sum of diagonal entries = λ₁ + λ₂. det A = product λ₁ × λ₂. |
| Algebraic / Geometric multiplicity | Algebraic = root multiplicity in p(λ). Geometric = dimension of eigenspace (≤ algebraic). |
| Diagonalizable | A = P D P⁻¹ with linearly independent eigenvectors as columns of P. Distinct eigenvalues ⇒ diagonalizable (2×2). |
• Use matrix → eigen routines to compute eigenvalues and eigenvectors quickly.
• For powers Aⁿ, diagonalize (A = P D P⁻¹) so that Aⁿ = P Dⁿ P⁻¹; Dⁿ just raises each diagonal entry to the nth power.
• Verify results by computing A v − λ v; numerical residual should be ≈ 0. Use rational/exact mode when available.
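A minimal compute-and-verify sketch of these points, assuming NumPy stands in for the GDC's eigen routine (the matrix is the two-town example used later):

```python
import numpy as np

A = np.array([[0.90, 0.20],
              [0.10, 0.80]])

eigenvalues, eigenvectors = np.linalg.eig(A)      # columns of `eigenvectors` are the v_i

for lam, v in zip(eigenvalues, eigenvectors.T):
    residual = A @ v - lam * v                    # should be ~0 for a valid eigenpair
    print(f"λ = {lam:.4f}, residual norm = {np.linalg.norm(residual):.2e}")
```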
📌 Geometric intuition
• Direction preservation: eigenvectors are directions that A maps onto the same line (scaled by λ). They are the “axes” along which the linear transformation purely stretches/compresses (or flips).
• Magnitude effect: |λ| < 1 ⇒ decay along that mode; |λ| > 1 ⇒ growth. λ negative ⇒ flip sign (mirror) plus scaling. Complex λ give rotation+scaling.
• Modes: decomposing an initial state into eigen-components isolates independent behaviours: each component evolves as λᵗ (discrete) or e^(λt) (continuous).
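A short sketch of this modal decomposition, assuming NumPy and a hypothetical initial state x₀ = (500, 300):

```python
import numpy as np

# Same two-town matrix as in the worked example below.
A = np.array([[0.90, 0.20],
              [0.10, 0.80]])
lams, V = np.linalg.eig(A)        # eigenvalues and eigenvector columns

x0 = np.array([500.0, 300.0])     # hypothetical initial populations (illustrative only)
c = np.linalg.solve(V, x0)        # coefficients of x0 in the eigenbasis: x0 = V c

t = 10
x_t = V @ (c * lams**t)           # each mode evolves independently as c_i * λ_i^t
assert np.allclose(x_t, np.linalg.matrix_power(A, t) @ x0)
```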
📌 Step-by-step computations & meaning
- Form A − λI and compute det(A − λI): detects λ that make matrix singular → non-trivial solutions for v.
- Solve p(λ)=0: yields the scaling factors (modes) — algebraic roots that indicate possible behaviours.
- Solve (A − λI) v = 0: finds eigendirections v (geometric meaning: preserved lines).
- Diagonalize: A = P D P−1 means change to eigenbasis where A acts as scaling (D). This simplifies repeated application and interpretation.
- Use the decomposition to compute Aⁿ: Aⁿ = P Dⁿ P⁻¹ with Dⁿ = diag(λᵢⁿ), so long-term behaviour is dominated by the eigenvalue(s) of largest |λ|.
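A sketch of the Aⁿ computation via diagonalisation, assuming NumPy; the check against direct matrix powers is the kind of verification worth quoting:

```python
import numpy as np

A = np.array([[0.90, 0.20],
              [0.10, 0.80]])
lams, P = np.linalg.eig(A)
n = 20

A_n_diag   = P @ np.diag(lams**n) @ np.linalg.inv(P)   # A^n = P D^n P^(-1)
A_n_direct = np.linalg.matrix_power(A, n)
print(np.allclose(A_n_diag, A_n_direct))                # True: both routes agree
# The λ = 1 mode persists while 0.70**n → 0, so A^n approaches a rank-1 "steady-state" matrix.
```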
📌 Holistic example — two-town transition
Model: xₜ = [Aₜ, Bₜ]ᵀ with xₜ₊₁ = A xₜ, where
A = [[0.90, 0.20], [0.10, 0.80]]
- Characteristic: p(λ) = λ² − (tr A)λ + det A. Here tr A = 1.70; det A = 0.70 → p(λ) = λ² − 1.70λ + 0.70.
- Solve → λ₁ = 1, λ₂ = 0.70. Interpretation: λ = 1 gives the steady state (persistent mode); λ = 0.70 decays (transient mode).
- Eigenvector for λ = 1 (solve (A − I)v = 0) gives the long-term population fractions; different initial conditions affect only the transient coefficients.
Conclusion: eigen-analysis separates steady distribution (end behaviour) from decaying components (how fast equilibrium is reached).
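The hand algebra behind these numbers, written out in full (a worked sketch for the matrix above):

```latex
\begin{align*}
\det(A - \lambda I)
  &= (0.9 - \lambda)(0.8 - \lambda) - (0.2)(0.1) \\
  &= \lambda^2 - 1.7\lambda + 0.7
   = (\lambda - 1)(\lambda - 0.7)
  \;\Rightarrow\; \lambda_1 = 1,\ \lambda_2 = 0.7. \\[4pt]
\lambda = 1:\quad
  (A - I)\mathbf{v} = \mathbf{0}
  \;\Rightarrow\;
  \begin{pmatrix} -0.1 & 0.2 \\ 0.1 & -0.2 \end{pmatrix}
  \begin{pmatrix} a \\ b \end{pmatrix} = \mathbf{0}
  \;\Rightarrow\; a = 2b
  \;\Rightarrow\; \mathbf{v}_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}.
\end{align*}
% Normalising to fractions: the long-run split is 2/3 of the population in town A and 1/3 in town B.
```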
Use a simple transition dataset (migration, market share) so you can compute eigenpairs and run a sensitivity analysis (±δ changes to the matrix entries). Explain the model assumptions and validate GDC results with hand calculations. Discuss limitations and the impact of measurement error on eigenvalues/eigenvectors.
📌 Stability & continuous analogues
• Discrete: xₜ₊₁ = A xₜ is stable if all |λ| < 1; otherwise behaviour is dominated by the eigenvalue of largest |λ|.
• Continuous: x'(t) = B x(t) is solved via e^(Bt); the real parts of the eigenvalues of B govern stability (all Re(λ) < 0 ⇒ stable).
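A minimal sketch of the continuous case, assuming SciPy's matrix exponential and a hypothetical system matrix B:

```python
import numpy as np
from scipy.linalg import expm     # matrix exponential e^(Bt)

# Hypothetical system matrix (illustrative only); its eigenvalues are -1 and -0.5.
B = np.array([[-1.0,  2.0],
              [ 0.0, -0.5]])
print(np.linalg.eigvals(B))       # all real parts negative ⇒ stable

x0 = np.array([3.0, 1.0])
for t in (0.0, 1.0, 5.0, 20.0):
    print(t, expm(B * t) @ x0)    # x(t) = e^(Bt) x0 decays towards the origin
```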
📌 Numerical cautions & exam checks
- Close eigenvalues: ill-conditioning — small changes in A can alter the eigenvectors strongly.
- Repeated eigenvalue: check the geometric multiplicity; if it is less than the algebraic multiplicity, A is not diagonalizable — show the dimension of the nullspace of (A − λI).
- Floating point: prefer rational/exact results when possible; state rounding explicitly where needed.
- Verification: always compute A v − λ v (or residual) to validate eigenpairs numerically.
Show determinant computation (det(A − λI)), factor the characteristic polynomial, and solve (A − λI)v = 0. If eigenvalues repeat, compute nullspace dimension and explain diagonalizability. When using the GDC, include a short algebraic justification to secure method marks.
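A sketch of the repeated-eigenvalue check, assuming NumPy and using a shear matrix as an illustrative (non-syllabus-specific) example:

```python
import numpy as np

# Shear matrix (illustrative): repeated eigenvalue λ = 1 with algebraic multiplicity 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

rank = np.linalg.matrix_rank(A - lam * np.eye(2))
geometric_multiplicity = 2 - rank          # dimension of the nullspace of (A - λI)
print(geometric_multiplicity)              # 1 < 2 (algebraic), so A is not diagonalizable
```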
How do idealizations (linearity) shape what we consider knowledge? Discuss assumptions, model domains and ethical consequences of applying simplified linear models to policy decisions.
Markov chains, population transitions, Google’s PageRank, vibration modes in engineering — all use eigen-analysis to separate persistent modes from transients and to identify dominant behaviour.
📌 Glossary & symbols
| Symbol | Meaning |
|---|---|
| λ | Eigenvalue (scalar) |
| v | Eigenvector (column) |
| P | Matrix of eigenvectors (columns) |
| D | Diagonal matrix of eigenvalues |
| Aⁿ | Matrix power — use diagonalisation for efficient computation |
Practice forming det(A − λI), solving the polynomial, and computing eigenspaces. If using the GDC, display the algebraic steps too (polynomial factorisation, nullspace dimension). When eigenvalues repeat, explicitly show the geometric multiplicity check.
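For the exact-arithmetic route, a sketch assuming SymPy (any CAS, or the GDC's exact mode, works similarly), applied to the two-town matrix:

```python
from sympy import Matrix, Rational, symbols, factor

lam = symbols('lambda')
A = Matrix([[Rational(9, 10), Rational(2, 10)],
            [Rational(1, 10), Rational(8, 10)]])

print(factor(A.charpoly(lam).as_expr()))   # exact characteristic polynomial; roots 1 and 7/10
print(A.eigenvects())                      # [(eigenvalue, algebraic multiplicity, eigenspace basis)]
```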