Vectors & Matrices

The fundamental building blocks of linear algebra and modern computation.

Vectors

A vector is an ordered list of numbers representing magnitude and direction. In ℝⁿ, vectors can represent points, displacements, or abstract data.

Vector: v = (v₁, v₂, …, vₙ)
Magnitude: ‖v‖ = √(v₁² + v₂² + … + vₙ²)
Dot product: u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
cos θ = (u·v)/(‖u‖·‖v‖)

The magnitude formula generalizes the distance formula from coordinate geometry. The dot product connects to trigonometric functions via the angle between vectors.
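The formulas above can be implemented directly in pure Python; `dot`, `magnitude`, and `angle_between` are illustrative names chosen for this sketch, not part of any standard library:

```python
import math

def dot(u, v):
    # u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
    return sum(a * b for a, b in zip(u, v))

def magnitude(v):
    # ‖v‖ = √(v·v), the generalized distance formula
    return math.sqrt(dot(v, v))

def angle_between(u, v):
    # cos θ = (u·v)/(‖u‖·‖v‖), so θ = arccos of that ratio
    return math.acos(dot(u, v) / (magnitude(u) * magnitude(v)))

print(dot((3, 4), (4, 3)))      # 24
print(magnitude((3, 4)))        # 5.0
print(math.degrees(angle_between((1, 0), (0, 1))))  # 90.0
```

Note that the standard basis vectors (1, 0) and (0, 1) have dot product 0, which corresponds to a 90° angle: orthogonal vectors always have zero dot product.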

Matrices

A matrix is a rectangular array of numbers. An m×n matrix has m rows and n columns.

Identity matrix: I (1s on diagonal, 0s elsewhere)
Transpose: (Aᵀ)ᵢⱼ = Aⱼᵢ
Symmetric: A = Aᵀ

Special matrices: diagonal, upper/lower triangular, symmetric, orthogonal (QᵀQ = I). The identity matrix I is the multiplicative identity for matrices: AI = IA = A for any compatible A, just as 1 is for ordinary numbers.
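These definitions are short enough to code directly; the following pure-Python sketch represents a matrix as a list of row lists (function names are illustrative):

```python
def transpose(A):
    # (Aᵀ)ᵢⱼ = Aⱼᵢ: rows become columns
    return [list(row) for row in zip(*A)]

def identity(n):
    # n×n identity: 1s on the diagonal, 0s elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def is_symmetric(A):
    # A is symmetric exactly when A = Aᵀ
    return A == transpose(A)

print(transpose([[1, 2], [3, 4]]))   # [[1, 3], [2, 4]]
print(identity(2))                   # [[1, 0], [0, 1]]
print(is_symmetric([[1, 7], [7, 2]]))  # True
```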

Matrix Operations

  1. Addition: (A + B)ᵢⱼ = Aᵢⱼ + Bᵢⱼ (same dimensions required)
  2. Scalar multiplication: (cA)ᵢⱼ = c·Aᵢⱼ
  3. Matrix multiplication: (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ (inner dimensions must match)
  4. Inverse: AA⁻¹ = A⁻¹A = I (only for square, non-singular matrices)

Matrix multiplication is not commutative: AB ≠ BA in general. However, it is associative: (AB)C = A(BC). This algebraic structure is fundamental to linear transformations.
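The non-commutativity of matrix multiplication is easy to demonstrate concretely. A minimal sketch of the Σₖ Aᵢₖ·Bₖⱼ formula (the `matmul` name here is illustrative, not a library function):

```python
def matmul(A, B):
    # (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ; columns of A must match rows of B
    n, m, p = len(A), len(B), len(B[0])
    assert len(A[0]) == m, "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]   # permutation matrix: swaps columns/rows
print(matmul(A, B))    # [[2, 1], [4, 3]]
print(matmul(B, A))    # [[3, 4], [1, 2]]  -- AB != BA
```

Multiplying by B on the right swaps A's columns, while multiplying on the left swaps its rows, so the two products differ.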

Solving Linear Systems

Linear systems Ax = b can be solved via:

  • Gaussian elimination: Row reduce to echelon form
  • Matrix inverse: x = A⁻¹b (when A is invertible)
  • Cramer's rule: xᵢ = det(Aᵢ)/det(A), where Aᵢ is A with column i replaced by b

These generalize the familiar two-variable elimination and substitution methods to any number of variables. In least-squares regression, the normal equations XᵀXβ = Xᵀy yield the solution β = (XᵀX)⁻¹Xᵀy.
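Gaussian elimination can be sketched in a few lines of pure Python. This version adds partial pivoting (choosing the largest available pivot in each column) for numerical stability; the `solve` name is illustrative, and the code assumes A is square and non-singular:

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting, then back-substitution.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Pick the row with the largest entry in this column as the pivot
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this column from all rows below the pivot
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3
print(solve([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

In practice one would reach for a tested routine (e.g. numpy.linalg.solve) rather than forming A⁻¹ explicitly, since computing the inverse is both slower and less accurate.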

Determinants

2×2: for A = [a b; c d], det(A) = ad − bc
3×3: cofactor expansion along any row/column
Properties: det(AB) = det(A)·det(B), det(Aᵀ) = det(A)

The determinant encodes whether a matrix is invertible (det ≠ 0), the scaling factor of the associated transformation, and the signed volume of the parallelepiped formed by column vectors.
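Cofactor expansion translates directly into a short recursive function; this sketch (the `det` name is illustrative) expands along the first row, which is fine for small matrices, though real libraries use LU factorization instead because the recursion costs O(n!):

```python
def det(A):
    # Recursive cofactor expansion along the first row.
    n = len(A)
    if n == 1:
        return A[0][0]
    if n == 2:
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]  # ad - bc
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j; the sign alternates as (-1)^j
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

A = [[1, 2], [3, 4]]
B = [[2, 0], [1, 1]]
print(det(A))  # -2
print(det(B))  # 2
# Consistent with det(AB) = det(A)·det(B): AB = [[4, 2], [10, 4]], det = -4
```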
