Eigenvalue Calculator

Find the eigenvalues and eigenvectors of 2×2 and 3×3 matrices. The calculator solves det(A − λI) = 0, verifies Av = λv, checks diagonalizability, and covers applications in PCA, PageRank, and quantum mechanics.

  • Eigenvalue λ₁
  • Eigenvalue λ₂
  • Characteristic polynomial
  • Eigenvector for λ₁
  • Eigenvector for λ₂
  • λ₁ and λ₂
  • Discriminant
  • Trace = sum of eigenvalues
  • Determinant = product of eigenvalues

Eigenvalues & Spectrum

  • λ₁ and λ₂
  • Spectral radius (largest |λ|)

Diagonalization & Applications

  • Diagonalization note (is A diagonalizable?)
  • Applications (PCA, PageRank, quantum mechanics)

How to Use This Calculator

  1. Enter a 2×2 matrix to get its eigenvalues and eigenvectors instantly.
  2. Use the 3×3 tab to get the characteristic polynomial of a larger matrix.
  3. Use the Verify tab to confirm a specific eigenpair, i.e. that Av = λv.
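The eigenpair check in step 3 can be sketched in Python (numpy assumed available; the helper name `is_eigenpair` is illustrative, not part of the calculator):

```python
import numpy as np

# Minimal eigenpair check: (lam, v) is a valid eigenpair of A when
# A @ v equals lam * v for a nonzero vector v.
def is_eigenpair(A, lam, v, tol=1e-9):
    v = np.asarray(v, dtype=float)
    if np.allclose(v, 0.0):
        return False  # eigenvectors must be nonzero by definition
    return np.allclose(np.asarray(A, dtype=float) @ v, lam * v, atol=tol)

A = [[4, 1], [2, 3]]
print(is_eigenpair(A, 5, [1, 1]))   # True:  A @ [1, 1] = [5, 5] = 5 * [1, 1]
print(is_eigenpair(A, 3, [1, 1]))   # False: [5, 5] != 3 * [1, 1]
```

The nonzero check matters: v = 0 trivially satisfies Av = λv for every λ, so it is excluded by definition.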

Formula

Characteristic equation: det(A − λI) = 0

2×2: λ = (tr ± √(tr² − 4·det)) / 2

Example

A = [[4, 1], [2, 3]]: tr = 7, det = 10, disc = 9. λ₁ = 5, λ₂ = 2. Eigenvector for λ₁ = 5: [1, 1].
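As a sketch, the closed form and the worked example above can be reproduced in Python (the function name is illustrative):

```python
import math

# 2x2 closed form: lambda = (tr ± sqrt(tr² − 4·det)) / 2.
def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic quadratic."""
    tr = a + d                 # trace = lambda1 + lambda2
    det = a * d - b * c        # determinant = lambda1 * lambda2
    disc = tr * tr - 4 * det   # discriminant decides real vs. complex
    root = math.sqrt(disc) if disc >= 0 else complex(0, math.sqrt(-disc))
    return (tr + root) / 2, (tr - root) / 2

print(eigenvalues_2x2(4, 1, 2, 3))    # (5.0, 2.0), matching the example
print(eigenvalues_2x2(0, -1, 1, 0))   # (1j, -1j): the 90-degree rotation
```

When the discriminant is negative, the square root is taken of its absolute value and attached as an imaginary part, yielding the conjugate pair a ± bi discussed in the FAQ.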

Frequently Asked Questions

  • What is an eigenvalue? An eigenvalue of a square matrix A is a scalar λ such that Av = λv for some nonzero vector v, called the corresponding eigenvector. The equation Av = λv means: when A acts on v, the result is simply v scaled by λ — no rotation, only stretching or flipping. If λ = 2, the eigenvector doubles in length; if λ = −1, it reverses direction. Example: A = [[4, 1], [2, 3]], v = [1, 1]. Av = [5, 5] = 5·[1, 1], so λ = 5 with eigenvector [1, 1]. A common pitfall is confusing eigenvalues with diagonal entries — a diagonal matrix does have its diagonal as eigenvalues, but that is a special case, not the general rule.
  • How do you find eigenvalues? Solve the characteristic equation det(A − λI) = 0. Setting up A − λI shifts the diagonal entries by −λ, then the determinant is set to zero to find λ. For a 2×2 matrix this gives a quadratic: λ² − tr(A)·λ + det(A) = 0. Example: A = [[4, 1], [2, 3]]. tr = 7, det = 12 − 2 = 10. Characteristic equation: λ² − 7λ + 10 = 0. Factor: (λ − 5)(λ − 2) = 0, so λ₁ = 5 and λ₂ = 2. Once eigenvalues are known, substitute back into (A − λI)v = 0 to find each eigenvector. Common pitfall: forgetting to subtract λI — using det(A) − λ instead of det(A − λI) gives wrong results.
  • How do the trace and determinant relate to eigenvalues? For a 2×2 matrix, the trace tr(A) = a11 + a22 equals the sum of the eigenvalues (λ₁ + λ₂), and the determinant det(A) equals their product (λ₁ × λ₂). This follows from Vieta's formulas applied to the characteristic polynomial λ² − tr·λ + det = 0. Example: A = [[4, 1], [2, 3]]. tr = 7 = 5 + 2 ✓. det = 10 = 5 × 2 ✓. These two checks let you verify eigenvalue calculations quickly without expanding the characteristic polynomial again. For n×n matrices, tr(A) = sum of all eigenvalues and det(A) = product of all eigenvalues — a useful sanity check even for 3×3 systems.
  • When are eigenvalues complex? For a 2×2 real matrix, eigenvalues are complex when the discriminant of the characteristic polynomial is negative: tr(A)² − 4·det(A) < 0. Complex eigenvalues always appear in conjugate pairs a ± bi. Example: A = [[0, −1], [1, 0]]. tr = 0, det = 1. Discriminant = 0 − 4 = −4 < 0. Eigenvalues: λ = (0 ± √(−4)) / 2 = ±i. Geometrically, complex eigenvalues correspond to rotation in 2D — the matrix rotates vectors rather than just scaling them. The real part a gives the growth or decay rate, and the imaginary part b gives the angular frequency. A common mistake is concluding that no real solution exists in an absolute sense — the matrix still acts on real vectors, but its behavior is rotational rather than purely stretching.
  • Where are eigenvalues used in practice? Eigenvalues and eigenvectors appear throughout science and engineering. In Google's PageRank algorithm, the ranking of web pages is the dominant eigenvector of the link matrix. In Principal Component Analysis (PCA), the eigenvectors of the covariance matrix define the directions of maximum variance — essential for data compression. In quantum mechanics, the energy levels of a system are the eigenvalues of the Hamiltonian operator. In structural engineering, the natural vibration frequencies of a bridge or building are eigenvalues of the mass-stiffness matrix. In Markov chains, the steady-state distribution is the eigenvector corresponding to eigenvalue 1. Common pitfall: eigenvalues are only defined for square matrices — for non-square systems, singular value decomposition (SVD) provides an analogous decomposition.
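The trace/determinant identities and the complex-eigenvalue case from the answers above can be checked numerically; a minimal sketch with numpy (assumed available):

```python
import numpy as np

# Vieta checks: tr(A) = sum of eigenvalues, det(A) = their product,
# and the 90-degree rotation matrix has the conjugate pair ±i.
A = np.array([[4.0, 1.0], [2.0, 3.0]])
lams = np.linalg.eigvals(A)               # eigenvalues 5 and 2, order may vary
print(np.isclose(lams.sum(), np.trace(A)))         # True: tr = λ1 + λ2
print(np.isclose(lams.prod(), np.linalg.det(A)))   # True: det = λ1·λ2

R = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation by 90 degrees
print(np.linalg.eigvals(R))               # conjugate pair ±1j
```

Because `eigvals` returns eigenvalues in no guaranteed order, comparisons should go through sums, products, or sorted values rather than positional indexing.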

