Vectors & Matrices

The fundamental building blocks of linear algebra and modern computation.

Vectors

A vector is an ordered list of numbers representing magnitude and direction. In ℝⁿ, vectors can represent points, displacements, or abstract data.

Vector: v = (v₁, v₂, …, vₙ)
Magnitude: ‖v‖ = √(v₁² + v₂² + … + vₙ²)
Dot product: u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
cos θ = (u·v)/(‖u‖·‖v‖)

The magnitude formula generalizes the distance formula from coordinate geometry to n dimensions. The dot product determines the angle between two vectors via the cosine formula above: perpendicular vectors have u·v = 0.
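The formulas above can be sketched in a few lines of pure Python (the function names `dot`, `norm`, and `angle` are illustrative choices, not from the text):

```python
import math

def dot(u, v):
    """Dot product: u.v = u1*v1 + u2*v2 + ... + un*vn."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """Magnitude: sqrt(v1^2 + v2^2 + ... + vn^2)."""
    return math.sqrt(dot(v, v))

def angle(u, v):
    """Angle between u and v, from cos(theta) = (u.v)/(|u||v|)."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = (3, 4), (4, 3)
print(dot(u, v))   # 24
print(norm(u))     # 5.0
```

For example, `angle((1, 0), (0, 1))` returns π/2, confirming that perpendicular vectors have a zero dot product.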

Matrices

A matrix is a rectangular array of numbers. An m×n matrix has m rows and n columns.

Identity matrix: I (1s on diagonal, 0s elsewhere)
Transpose: (Aᵀ)ᵢⱼ = Aⱼᵢ
Symmetric: A = Aᵀ

Special matrices: diagonal, upper/lower triangular, symmetric, orthogonal (QᵀQ = I). The identity matrix I is the multiplicative identity for matrices, just as 1 is for numbers: AI = IA = A.
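These definitions translate directly to code. A minimal sketch using lists of lists (the helper names are illustrative):

```python
def transpose(A):
    """(A^T)_ij = A_ji: swap rows and columns."""
    return [list(row) for row in zip(*A)]

def identity(n):
    """n x n matrix with 1s on the diagonal, 0s elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def is_symmetric(A):
    """A is symmetric when A equals its own transpose."""
    return A == transpose(A)

A = [[1, 2], [2, 5]]
print(is_symmetric(A))          # True
print(transpose([[1, 2, 3]]))   # [[1], [2], [3]]
```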

Matrix Operations

  1. Addition: (A + B)ᵢⱼ = Aᵢⱼ + Bᵢⱼ (same dimensions required)
  2. Subtraction: (A − B)ᵢⱼ = Aᵢⱼ − Bᵢⱼ (same dimensions required)
  3. Scalar multiplication: (cA)ᵢⱼ = c·Aᵢⱼ
  4. Matrix multiplication: (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ (inner dimensions must match)
  5. Inverse: AA⁻¹ = A⁻¹A = I (only for square, non-singular matrices)

Matrix multiplication is not commutative: AB ≠ BA in general. However, it is associative: (AB)C = A(BC). This algebraic structure is fundamental to linear transformations.
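The multiplication rule and the failure of commutativity are easy to verify concretely (a minimal sketch; `matmul` is an illustrative name):

```python
def matmul(A, B):
    """(AB)_ij = sum over k of A_ik * B_kj; inner dimensions must match."""
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]  -- AB != BA
```

Multiplying by B on the right swaps A's columns; on the left it swaps A's rows, which is why the two products differ.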

Solving Linear Systems

Linear systems Ax = b can be solved via:

  • Gaussian elimination: Row reduce to echelon form
  • Matrix inverse: x = A⁻¹b (when A is invertible)
  • Cramer's rule: xᵢ = det(Aᵢ)/det(A)

These generalize the methods for solving systems of equations to any number of variables. In regression analysis, the normal equations XᵀXβ = Xᵀy have the least-squares solution β̂ = (XᵀX)⁻¹Xᵀy.
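Gaussian elimination, the first method above, can be sketched in pure Python (a teaching sketch with partial pivoting, not a production solver; `solve` is an illustrative name):

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    A is an n x n list of lists; b is a length-n list."""
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: move the largest entry in this column up.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[piv][col]) < 1e-12:
            raise ValueError("matrix is singular or nearly singular")
        M[col], M[piv] = M[piv], M[col]
        # Eliminate entries below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution on the resulting echelon form.
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j]
                              for j in range(i + 1, n))) / M[i][i]
    return x

print(solve([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

The example system is 2x + y = 5, x + 3y = 10, whose solution x = 1, y = 3 is easy to check by substitution.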

Determinants

2×2: det(A) = ad − bc, for A with rows (a, b) and (c, d)
3×3: cofactor expansion along any row/column
Properties: det(AB) = det(A)·det(B), det(Aᵀ) = det(A)

The determinant encodes whether a matrix is invertible (det ≠ 0), the scaling factor of the associated transformation, and the signed volume of the parallelepiped formed by column vectors.
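Cofactor expansion along the first row translates into a short recursive function (a sketch that is exponential in n, fine for small matrices; `det` is an illustrative name):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        # Cofactor sign alternates: +, -, +, ...
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                     # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))    # 24
```

The second example shows that the determinant of a diagonal matrix is the product of its diagonal entries, matching the volume-scaling interpretation above.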
