# The Little Book of Linear Algebra - Summary

This resource provides a comprehensive yet beginner-friendly introduction to the core concepts of linear algebra, structured into chapters that cover fundamental topics from vectors through to applications in practice.

---

## Chapter 1: Vectors

- **Scalars and Vectors:** Scalars are real numbers; vectors are ordered collections of scalars in \(\mathbb{R}^n\).
- **Vector Addition and Scalar Multiplication:** Both are defined component-wise; vectors can be combined into linear combinations.
- **Dot Product, Norms, and Angles:** Dot product: \(\mathbf{u} \cdot \mathbf{v} = \sum_i u_i v_i\). Norm (length): \(|\mathbf{v}| = \sqrt{\mathbf{v} \cdot \mathbf{v}}\). The angle between two vectors is given by \(\cos\theta = \frac{\mathbf{u} \cdot \mathbf{v}}{|\mathbf{u}|\,|\mathbf{v}|}\).
- **Orthogonality:** Two vectors are orthogonal if their dot product is zero. Orthogonal and orthonormal sets are defined, along with the projection of one vector onto another and orthogonal decomposition.

---

## Chapter 2: Matrices

- **Definition and Notation:** Matrices are rectangular arrays of scalars; vectors are special cases.
- **Matrix Operations:** Addition, scalar multiplication, and, crucially, matrix multiplication.
- **Transpose and Inverse:** The transpose \(A^T\) swaps rows and columns. The inverse \(A^{-1}\), when it exists, satisfies \(AA^{-1} = A^{-1}A = I\).
- **Special Matrices:** Identity, diagonal, permutation, symmetric, skew-symmetric, and orthogonal matrices, with geometric interpretations such as rotations and reflections.

---

## Chapter 3: Systems of Linear Equations

- Systems are represented as \(A\mathbf{x} = \mathbf{b}\); a system has no solution, a unique solution, or infinitely many. The matrix form facilitates computational solving.
- **Gaussian Elimination:** Pivot operations reduce the system to row echelon form; back substitution then recovers the solutions.
- **Rank and Consistency:** The rank is the number of pivots and measures the number of independent equations. A system is consistent if and only if \(\text{rank}(A) = \text{rank}(A \mid \mathbf{b})\).
- **Homogeneous Systems:** The zero vector is always a solution; nontrivial solutions exist if and only if \(\text{rank}(A) < n\).

---

## Chapter 4: Vector Spaces

- Vector spaces are defined abstractly by axioms for addition and scalar multiplication; examples include \(\mathbb{R}^n\), polynomials, and function spaces.
- **Subspaces:** Subsets closed under addition and scalar multiplication; every subspace must contain the zero vector.
- **Span, Basis, Dimension:** The span of a set is the collection of all its linear combinations; a basis is a linearly independent set that spans the space; the dimension is the number of vectors in any basis.
- **Coordinates:** Relative to a chosen basis, every vector has a unique coordinate representation; change of basis is carried out via change-of-basis matrices.

---

## Chapter 5: Linear Transformations

- Linear transformations preserve addition and scalar multiplication and are represented by matrices once bases are fixed.
- The kernel (null space) and image (range) are the key subspaces associated with a transformation.
- A change of basis alters the matrix representation by a similarity transformation.

---

## Chapter 6: Determinants

- The determinant is a scalar associated with a square matrix; it measures volume scaling, invertibility, and orientation.
- Explicit formulas exist for \(2 \times 2\) and \(3 \times 3\) matrices, generalized by cofactor expansion.
- Properties include multiplicativity, behavior under row operations, and the relation to trace and invertibility.
- Applications include volume calculation, invertibility testing, and solution formulas such as Cramer's Rule.

---

## Chapter 7: Inner Product Spaces

- Inner products generalize the dot product, defining length and angle in abstract vector spaces.
- The norm derived from the inner product measures vector length.
- Orthogonal projections minimize the distance to a subspace.
- The Gram–Schmidt process constructs orthonormal bases.
- Orthonormal bases simplify computations and serve as ideal coordinate systems.
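To make the summaries above concrete, the sketches below work through a few of the chapters in Python with numpy; the language, library, and every function and variable name are illustrative choices made here, not the book's. First, the Chapter 1 formulas: dot product, norm, angle, and the orthogonal decomposition of \(\mathbf{u}\) against \(\mathbf{v}\).

```python
import numpy as np

# Illustrative example vectors, not from the book.
u = np.array([3.0, 4.0, 0.0])
v = np.array([1.0, 0.0, 2.0])

dot = u @ v                                 # u . v = sum_i u_i v_i
norm_v = np.sqrt(v @ v)                     # |v| = sqrt(v . v); same as np.linalg.norm(v)
cos_theta = dot / (np.linalg.norm(u) * norm_v)

proj = (dot / (v @ v)) * v                  # projection of u onto v
residual = u - proj                         # orthogonal component: u = proj + residual
print(abs(residual @ v) < 1e-12)            # True: the residual is orthogonal to v
```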
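For Chapter 2's special matrices, a rotation matrix gives a quick check that orthogonal matrices satisfy \(A^{-1} = A^T\) (a minimal sketch; the 45-degree angle is an arbitrary choice):

```python
import numpy as np

theta = np.pi / 4                                  # rotate by 45 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(R.T @ R, np.eye(2)))             # True: columns are orthonormal
print(np.allclose(np.linalg.inv(R), R.T))          # True: for orthogonal R, R^{-1} = R^T
```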
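Chapter 3's algorithm can be sketched directly; this hypothetical `gaussian_solve` assumes a square, invertible \(A\) and adds partial pivoting for numerical stability:

```python
import numpy as np

def gaussian_solve(A, b):
    """Sketch of Gaussian elimination for square, invertible A:
    forward elimination with partial pivoting to reach row echelon
    form, then back substitution."""
    A = A.astype(float)
    b = b.astype(float)
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))      # pick the largest pivot below row k
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]                # multiplier for row i
            A[i, k:] -= m * A[k, k:]             # zero out entry (i, k)
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):               # back substitution, bottom up
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(gaussian_solve(A, b))                      # [0.8 1.4], matches np.linalg.solve(A, b)
```

For the general consistency test, comparing `np.linalg.matrix_rank(A)` with the rank of the augmented matrix mirrors the \(\text{rank}(A) = \text{rank}(A \mid \mathbf{b})\) criterion.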
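For Chapter 4, the dimension of a span can be read off as a matrix rank; a small sketch with three dependent vectors in \(\mathbb{R}^3\):

```python
import numpy as np

# Three illustrative vectors in R^3, stored as the columns of V.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

print(np.linalg.matrix_rank(V))   # 2: the third column is the sum of the first two,
                                  # so the span is a 2-dimensional subspace of R^3
```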
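Chapter 5's change-of-basis rule, \(A' = P^{-1} A P\), in action: the reflection across the line \(y = x\), re-expressed in a basis aligned with the mirror line (the basis here is a choice made for illustration):

```python
import numpy as np

# Reflection across the line y = x, written in the standard basis.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Columns of P form the new basis: one vector on the mirror line, one across it.
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])

A_new = np.linalg.inv(P) @ A @ P     # similarity transformation P^{-1} A P
print(A_new)                         # [[1. 0.] [0. -1.]]: diagonal in the new basis
```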
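Chapter 6's cofactor expansion and Cramer's Rule, sketched recursively (fine for small matrices; `det_cofactor` and `cramer` are hypothetical names, and real code would call `np.linalg.det` instead):

```python
import numpy as np

def det_cofactor(A):
    """Determinant by cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

def cramer(A, b):
    """Cramer's Rule: x_j = det(A_j) / det(A), where A_j is A
    with column j replaced by b; requires det(A) != 0."""
    d = det_cofactor(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Aj = A.copy()
        Aj[:, j] = b
        x[j] = det_cofactor(Aj) / d
    return x

A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([5.0, 6.0])
print(det_cofactor(A))      # -2.0: nonzero, so A is invertible
print(cramer(A, b))         # [-4.   4.5], matches np.linalg.solve(A, b)
```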
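Finally, a classical Gram–Schmidt sketch for Chapter 7, orthonormalizing the columns of a matrix under the standard dot product (the columns are assumed linearly independent; in practice `np.linalg.qr` does this more stably):

```python
import numpy as np

def gram_schmidt(V):
    """Classical Gram-Schmidt on the columns of V: subtract the
    projections onto earlier orthonormal vectors, then normalize."""
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        q = V[:, j].astype(float)
        for i in range(j):
            q -= (Q[:, i] @ V[:, j]) * Q[:, i]   # remove component along q_i
        Q[:, j] = q / np.linalg.norm(q)          # normalize to unit length
    return Q

V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(V)
print(np.round(Q.T @ Q, 10))    # identity: the columns of Q are orthonormal
```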
---

## Chapter 8: Eigenvalues and