In This Lesson: Vectors · Matrices · Matrix Operations · Solving Linear Systems · Determinants

Vectors

A vector is an ordered list of numbers representing magnitude and direction. In ℝⁿ, vectors can represent points, displacements, or abstract data.
Vector: v = (v₁, v₂, …, vₙ)
Magnitude: ‖v‖ = √(v₁² + v₂² + … + vₙ²)
Dot product: u·v = u₁v₁ + u₂v₂ + … + uₙvₙ
Angle between vectors: cos θ = (u·v)/(‖u‖·‖v‖)
The magnitude formula generalizes the distance formula from coordinate geometry. The dot product determines the angle θ between two vectors through the cosine formula.
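The formulas above can be sketched in plain Python (a minimal illustration, not a production linear algebra library):

```python
import math

def dot(u, v):
    # Dot product: sum of pairwise products of components.
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    # Magnitude: square root of the dot product of v with itself.
    return math.sqrt(dot(v, v))

u = (3.0, 4.0)
v = (4.0, 3.0)

print(norm(u))    # 5.0 (the 3-4-5 right triangle)
print(dot(u, v))  # 3*4 + 4*3 = 24.0
theta = math.acos(dot(u, v) / (norm(u) * norm(v)))
print(math.degrees(theta))  # cos θ = 24/25, so θ ≈ 16.26 degrees
```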
Matrices

A matrix is a rectangular array of numbers. An m×n matrix has m rows and n columns.
Identity matrix: I (1s on diagonal, 0s elsewhere)
Transpose: (Aᵀ)ᵢⱼ = Aⱼᵢ
Symmetric: A = Aᵀ
Special matrices: diagonal, upper/lower triangular, symmetric, orthogonal (QᵀQ = I). The identity matrix I acts like 1 in matrix multiplication, just as 1 is the multiplicative identity for real numbers.
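The transpose, symmetry, and identity definitions can be checked directly with matrices stored as lists of rows (a small sketch for illustration):

```python
def transpose(A):
    # (Aᵀ)ᵢⱼ = Aⱼᵢ: rows become columns and vice versa.
    return [list(col) for col in zip(*A)]

def is_symmetric(A):
    # A matrix is symmetric exactly when it equals its transpose.
    return A == transpose(A)

def identity(n):
    # n×n identity: 1s on the diagonal, 0s elsewhere.
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2],
     [3, 4]]
S = [[1, 7],
     [7, 5]]

print(transpose(A))    # [[1, 3], [2, 4]]
print(is_symmetric(A)) # False
print(is_symmetric(S)) # True
print(identity(2))     # [[1, 0], [0, 1]]
```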
Matrix Operations

Addition: (A + B)ᵢⱼ = Aᵢⱼ + Bᵢⱼ (same dimensions required)
Scalar multiplication: (cA)ᵢⱼ = c·Aᵢⱼ
Matrix multiplication: (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ (inner dimensions must match)
Inverse: AA⁻¹ = A⁻¹A = I (only for square, non-singular matrices)
Matrix multiplication is not commutative: AB ≠ BA in general. However, it is associative: (AB)C = A(BC). This algebraic structure is fundamental to linear transformations.
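A direct implementation of the multiplication rule makes the non-commutativity concrete (a sketch; real code would use a numerical library):

```python
def matmul(A, B):
    # (AB)ᵢⱼ = Σₖ Aᵢₖ·Bₖⱼ; A's column count must equal B's row count.
    n, k, m = len(A), len(B), len(B[0])
    assert len(A[0]) == k, "inner dimensions must match"
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]  # permutation matrix

print(matmul(A, B))  # [[2, 1], [4, 3]]  (swaps the columns of A)
print(matmul(B, A))  # [[3, 4], [1, 2]]  (swaps the rows of A)
```

Multiplying by B on the right permutes columns while multiplying on the left permutes rows, so AB ≠ BA here.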
Solving Linear Systems

Linear systems Ax = b can be solved via:
Gaussian elimination: row reduce the augmented matrix to echelon form
Matrix inverse: x = A⁻¹b (when A is invertible)
Cramer's rule: xᵢ = det(Aᵢ)/det(A), where Aᵢ replaces column i of A with b

These generalize the familiar methods for solving small systems of equations to any number of variables. In regression analysis, the least-squares coefficients solve the normal equations: β = (XᵀX)⁻¹Xᵀy.
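Gaussian elimination can be sketched in a few lines. This version adds partial pivoting for numerical stability, an implementation detail beyond the bare echelon-form description above:

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting on the augmented matrix [A | b].
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Pivot: move the largest-magnitude entry in this column to the pivot row.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```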
Determinants
2×2: for A = [a b; c d], det(A) = ad − bc
3×3: cofactor expansion along any row/column
Properties: det(AB) = det(A)·det(B), det(Aᵀ) = det(A)
The determinant encodes whether a matrix is invertible (det ≠ 0), the volume-scaling factor of the associated linear transformation, and the signed volume of the parallelepiped spanned by the column vectors.
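Cofactor expansion translates directly into a recursive function, which also lets us verify the multiplicative property det(AB) = det(A)·det(B) on a small example (an illustrative sketch; cofactor expansion is far too slow for large matrices):

```python
def det(A):
    # Determinant by cofactor (Laplace) expansion along the first row.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, with alternating signs.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

A = [[1, 2], [3, 4]]
B = [[2, 0], [1, 3]]
AB = [[4, 6], [10, 12]]  # the product A·B, computed by hand

print(det(A))   # 1*4 - 2*3 = -2
print(det(B))   # 2*3 - 0*1 = 6
print(det(AB))  # 4*12 - 6*10 = -12 = det(A)·det(B)
```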