Elementary Linear Algebra is a fundamental branch of mathematics that focuses on the study of vectors, vector spaces, linear transformations, and systems of linear equations. It provides essential tools and concepts such as matrices, determinants, eigenvalues, and eigenvectors, which are widely used to solve problems across various disciplines including engineering, physics, computer science, and economics. Understanding the definitions, functions, and properties of these linear algebraic structures enables efficient modeling and analysis of complex real-world scenarios. Mastering elementary linear algebra is important for success in numerous competitive examinations like the **GRE**, **GATE**, **IIT-JEE**, and other university entrance tests, where proficiency is often assessed through conceptual questions and solved examples that show practical applications of linear algebraic principles.


## What is Elementary Linear Algebra?

Elementary Linear Algebra is a foundational area of mathematics that deals with the study of linear equations, linear functions, and their representations through matrices and vector spaces. It provides the basic framework for understanding and solving systems of linear equations, which are crucial in various fields like engineering, physics, computer science, economics, and more.

Here is a table of important terms and concepts.

| Term | Description |
| --- | --- |
| Vector | A mathematical object that represents both magnitude (size) and direction. It can be visualized as an arrow in space. |
| Matrix | A rectangular array of numbers or symbols arranged in rows and columns. |
| Linear Equation | An equation that expresses a linear relationship between variables. It can be written in the form Ax + By + Cz = D, where A, B, C, and D are constants. |
| System of Linear Equations | A collection of two or more linear equations with the same variables. |
| Determinant | A scalar value associated with a square matrix. It provides information about the invertibility of the matrix. |
| Inverse Matrix | The multiplicative inverse of a square matrix. If A is a square matrix, its inverse, denoted by A⁻¹, satisfies the condition AA⁻¹ = A⁻¹A = I, where I is the identity matrix. |
| Eigenvalues and Eigenvectors | For a square matrix A, eigenvalues are scalar values λ that satisfy the equation Ax = λx, where x is the corresponding eigenvector. Eigenvalues and eigenvectors are essential in many applications, such as diagonalization and principal component analysis. |
| Row Operations | Elementary operations performed on the rows of a matrix to simplify its structure: interchanging rows, multiplying a row by a nonzero scalar, and adding a multiple of one row to another row. |
| Gaussian Elimination | A method for solving systems of linear equations by transforming the augmented matrix into row echelon form or reduced row echelon form using row operations. |


## Functions of Elementary Linear Algebra

Elementary Linear Algebra encompasses several important functions that are fundamental to understanding and solving linear systems, manipulating matrices, and analyzing vector spaces. These functions are crucial for various applications in mathematics, physics, engineering, computer science, and economics. Here are some key functions of Elementary Linear Algebra:

**1. Solving Systems of Linear Equations**

**Function**: Solving a set of linear equations is one of the primary functions of linear algebra. Given a system in the form:

**Ax = b**

where A is the coefficient matrix, x is the vector of unknowns, and b is the vector of constants, linear algebra provides methods such as Gaussian elimination, matrix inversion, and LU decomposition to find the vector x.
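As a concrete sketch, the elimination idea can be written in a few lines of plain Python (an illustrative toy solver with example values, not production code):

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Swap in the row with the largest pivot to improve stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution from the last row upward.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(solve([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```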

**2. Matrix Operations**

**Function**: Linear algebra defines and uses various matrix operations, including addition, multiplication, transposition, and inversion. These operations are used to manipulate data and solve problems in multiple dimensions.

- **Addition**: C = A + B
- **Multiplication**: C = AB
- **Transpose**: Aᵀ
- **Inverse**: A⁻¹, such that AA⁻¹ = I
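These operations are straightforward to express with nested lists in plain Python (a minimal sketch with arbitrary example matrices):

```python
def mat_add(A, B):
    # Entry-wise sum: C_ij = A_ij + B_ij
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    # (AB)_ij = sum_k A_ik * B_kj; zip(*B) iterates over the columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    # (A^T)_ij = A_ji
    return [list(col) for col in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))   # [[6, 8], [10, 12]]
print(mat_mul(A, B))   # [[19, 22], [43, 50]]
print(transpose(A))    # [[1, 3], [2, 4]]
```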

**3. Determinants and Their Properties**

**Function**: The determinant of a matrix provides essential information about the matrix, such as whether it is invertible, the volume scaling factor of a transformation, and the orientation of vectors in space.

**det(A)**

Properties of determinants include:

- det(AB) = det(A)det(B)
- det(Aᵀ) = det(A)
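The multiplicative property det(AB) = det(A)det(B) can be checked numerically for 2×2 matrices (a small illustrative script; the matrices are arbitrary examples):

```python
def det2(M):
    # Determinant of a 2x2 matrix [[a, b], [c, d]] is ad - bc.
    (a, b), (c, d) = M
    return a * d - b * c

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[2, 1], [1, 2]]   # det(A) = 3
B = [[0, 1], [3, 4]]   # det(B) = -3
# Multiplicative property: det(AB) = det(A) det(B)
print(det2(mat_mul(A, B)), det2(A) * det2(B))  # -9 -9
```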

**4. Eigenvalues and Eigenvectors**

**Function**: Eigenvalues and eigenvectors are used to analyze the properties of matrices, particularly in linear transformations. They play a crucial role in stability analysis, vibrations, quantum mechanics, and facial recognition algorithms. The eigenvalue equation is:

**Av = λv**

where λ is an eigenvalue and v is the corresponding eigenvector.
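Checking a candidate eigenpair only requires one matrix–vector product, as this short plain-Python sketch shows (illustrative example values):

```python
def mat_vec(A, v):
    # Matrix-vector product: (Av)_i = sum_j A_ij * v_j
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1], [1, 2]]
v = [1, 1]            # candidate eigenvector
print(mat_vec(A, v))  # [3, 3], i.e. 3 * v, so lambda = 3
```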

**5. Vector Spaces and Subspaces**

**Function**: Linear algebra defines vector spaces and subspaces, which are sets of vectors that can be scaled and added together. These concepts are fundamental in understanding dimensions, linear combinations, and bases of vector spaces.

- **Span** of vectors: Span{v₁, v₂, …, vₙ}
- **Basis**: A set of vectors that span the vector space and are linearly independent.
- **Dimension**: The number of vectors in a basis for the space.

**6. Linear Transformations**

**Function**: Linear transformations are functions that map vectors from one vector space to another while preserving vector addition and scalar multiplication. They are represented by matrices and are used in areas such as computer graphics, differential equations, and optimization. If T is a linear transformation, then:

**T(u+v)=T(u)+T(v)**

**T(cv)=cT(v)**
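Both defining properties can be verified numerically for any matrix map T(x) = Ax (the matrix and vectors below are arbitrary illustrative values):

```python
def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 0], [1, 3]]          # a linear map T(x) = Ax
u, v, c = [1, 2], [3, -1], 4

# Additivity: T(u + v) = T(u) + T(v)
lhs = mat_vec(A, [a + b for a, b in zip(u, v)])
rhs = [a + b for a, b in zip(mat_vec(A, u), mat_vec(A, v))]
print(lhs == rhs)  # True

# Homogeneity: T(cv) = cT(v)
print(mat_vec(A, [c * x for x in v]) == [c * x for x in mat_vec(A, v)])  # True
```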

**7. Orthogonality and Projections**

**Function**: Orthogonality and projections are important in the study of vector spaces. Orthogonal vectors have a zero inner product, and projections are used to find the component of one vector along another.

- **Orthogonal vectors**: ⟨u, v⟩ = 0
- **Projection** of v onto u:

**Proj**ᵤ**v** = (⟨v,u⟩/⟨u,u⟩) u
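A short plain-Python sketch of the projection formula (the vectors are hypothetical example values):

```python
def dot(u, v):
    # Standard inner product <u, v>
    return sum(a * b for a, b in zip(u, v))

def project(v, u):
    # Proj_u v = (<v,u> / <u,u>) u
    scale = dot(v, u) / dot(u, u)
    return [scale * x for x in u]

print(project([3, 4], [1, 0]))  # [3.0, 0.0]: the component of v along the x-axis
print(dot([1, 0], [0, 5]))      # 0: the vectors are orthogonal
```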

**8. Norms and Distances**

**Function**: The norm of a vector measures its length or magnitude, and the distance between two vectors can be calculated using norms. These concepts are used in optimization, machine learning, and statistics.

- **Norm**: ∥v∥ = √⟨v, v⟩
- **Distance** between u and v:

**d(u, v) = ∥u − v∥**
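These two formulas translate directly into plain Python (a minimal sketch with example vectors):

```python
import math

def norm(v):
    # ||v|| = sqrt(<v, v>)
    return math.sqrt(sum(x * x for x in v))

def distance(u, v):
    # d(u, v) = ||u - v||
    return norm([a - b for a, b in zip(u, v)])

print(norm([3, 4]))              # 5.0 (the classic 3-4-5 triangle)
print(distance([1, 1], [4, 5]))  # 5.0
```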

## Formulas for Elementary Linear Algebra

Elementary Linear Algebra involves a variety of formulas that are essential for solving problems related to vectors, matrices, and linear transformations. Here are some of the key formulas:

**Vectors**

- **Addition:** **u** + **v** = (u₁ + v₁, u₂ + v₂, …, uₙ + vₙ)
- **Scalar Multiplication:** c**v** = (cv₁, cv₂, …, cvₙ)
- **Dot Product:** **u** · **v** = u₁v₁ + u₂v₂ + … + uₙvₙ
- **Norm (Magnitude):** ||**v**|| = √(v₁² + v₂² + … + vₙ²)
- **Angle between vectors:** cos θ = (**u** · **v**) / (||**u**|| ||**v**||)

**Matrices**

- **Addition:** A + B = [aᵢⱼ + bᵢⱼ]
- **Scalar Multiplication:** cA = [caᵢⱼ]
- **Matrix Multiplication:** AB = [∑ₖ aᵢₖbₖⱼ] (where the summation is over k)
- **Transpose:** Aᵀ = [aⱼᵢ]
- **Determinant:** det(A) (various methods exist, such as cofactor expansion or row reduction)
- **Inverse:** A⁻¹ (if it exists) satisfies AA⁻¹ = A⁻¹A = I (where I is the identity matrix)
- **Rank:** rank(A) (the maximum number of linearly independent rows or columns)

**Systems of Linear Equations**

- **Matrix Representation:** AX = B
- **Cramer’s Rule:** If det(A) ≠ 0, then xᵢ = det(Aᵢ) / det(A), where Aᵢ is the matrix formed by replacing the i-th column of A with B.
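Cramer's rule is easy to implement for the 2×2 case (an illustrative sketch; larger systems are usually solved by elimination instead, since Cramer's rule is expensive for big matrices):

```python
def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

def cramer_2x2(A, b):
    # x_i = det(A_i) / det(A), where A_i has column i replaced by b.
    d = det2(A)
    A1 = [[b[0], A[0][1]], [b[1], A[1][1]]]  # replace column 1 with b
    A2 = [[A[0][0], b[0]], [A[1][0], b[1]]]  # replace column 2 with b
    return [det2(A1) / d, det2(A2) / d]

# 2x + y = 5 and x + 3y = 10 give x = 1, y = 3.
print(cramer_2x2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```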

**Eigenvalues and Eigenvectors**

- **Characteristic equation:** det(A − λI) = 0
- **Eigenvalue equation:** Av = λv

**Linear Transformations**

- **Matrix Representation:** T(x) = Ax
- **Change of Basis:** [T]B’ = P⁻¹[T]BP, where P is the transition matrix from B to B’.

**Orthogonal Matrices**

- **Definition:** A matrix A is orthogonal if AᵀA = AAᵀ = I.
- **Properties:** ||Ax|| = ||x||, (Ax) · (Ay) = x · y

**Diagonalization**

**Diagonalizable matrix:** A matrix A is diagonalizable if it can be written as A = PDP⁻¹, where D is a diagonal matrix and P is an invertible matrix.

**Quadratic Forms**

- **Standard form:** Q(x) = xᵀAx, where A is a symmetric matrix.
- **Principal axis theorem:** A quadratic form can be diagonalized by an orthogonal change of variables.

**Inner Product Spaces**

- **Definition:** A vector space V with an inner product is called an inner product space.
- **Properties:** (u, v) = (v, u), (cu, v) = c(u, v), (u + v, w) = (u, w) + (v, w), (u, u) ≥ 0, and (u, u) = 0 if and only if u = 0.

**Gram-Schmidt Orthogonalization**

**Process:** Given a basis {v₁, v₂, …, vₙ}, construct an orthonormal basis {u₁, u₂, …, uₙ} using the Gram-Schmidt process.
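A compact plain-Python sketch of the process (classical Gram-Schmidt, assuming the input vectors are linearly independent; the input basis is an arbitrary example):

```python
import math

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            # Subtract the projection of v onto each existing orthonormal vector.
            coeff = sum(a * b for a, b in zip(v, u))
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        # Normalize the remaining component to unit length.
        length = math.sqrt(sum(x * x for x in w))
        basis.append([x / length for x in w])
    return basis

print(gram_schmidt([[3, 1], [2, 2]]))
```

The resulting vectors are mutually orthogonal and each has norm 1, which is easy to verify by taking dot products.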

**Least Squares Approximation**

- **Solution:** The least squares solution to Ax = b is the vector x̂ that minimizes ||Ax − b||².
- **Normal equations:** AᵀAx = Aᵀb

**Singular Value Decomposition (SVD)**

**Decomposition:** A = UΣVᵀ, where U and V are orthogonal matrices and Σ is a rectangular diagonal matrix whose entries are the singular values of A.

## Properties of Elementary Linear Algebra

Elementary linear algebra is a fundamental branch of mathematics with various properties and relationships between its objects (vectors, matrices, and linear equations). Here are some important properties of elementary linear algebra:

### Properties of Vectors

**Addition:**
- Commutative: **u** + **v** = **v** + **u**
- Associative: **u** + (**v** + **w**) = (**u** + **v**) + **w**
- Identity element: There exists a zero vector **0** such that **u** + **0** = **u** for all **u**.
- Inverse element: For every vector **u**, there exists an additive inverse −**u** such that **u** + (−**u**) = **0**.

**Scalar multiplication:**
- Distributive over vector addition: c(**u** + **v**) = c**u** + c**v**
- Distributive over scalar addition: (c + d)**u** = c**u** + d**u**
- Associative: (cd)**u** = c(d**u**)
- Identity element: 1**u** = **u**

### Properties of Matrices

**Addition:**
- Commutative: A + B = B + A
- Associative: A + (B + C) = (A + B) + C
- Identity element: There exists a zero matrix O such that A + O = A for all A.
- Inverse element: For every square matrix A with a nonzero determinant, there exists an inverse matrix A⁻¹ such that AA⁻¹ = A⁻¹A = I.

**Multiplication:**
- Associative: (AB)C = A(BC)
- Distributive over addition: A(B + C) = AB + AC and (A + B)C = AC + BC
- Identity element: There exists an identity matrix I such that AI = IA = A for all A.

**Transpose:**
- (A + B)ᵀ = Aᵀ + Bᵀ
- (cA)ᵀ = cAᵀ
- (AB)ᵀ = BᵀAᵀ

### Properties of Linear Equations

- **Equivalence:** Two systems of linear equations are equivalent if they have the same solution set.
- **Consistency:** A system of linear equations is consistent if it has at least one solution.
- **Inconsistency:** A system of linear equations is inconsistent if it has no solutions.
- **Uniqueness:** A consistent system has a unique solution exactly when it has no free variables, i.e., when its coefficient matrix has full column rank.


## Solved Examples of Elementary Linear Algebra

Here are five solved examples illustrating different concepts in Elementary Linear Algebra:

**Example 1: Solving a System of Linear Equations**

**Problem:** Solve the following system of linear equations:

x + 2y – z = 3

2x – y + 3z = -1

3x + y + z = 5

**Solution:** We can solve this system using Gaussian elimination.

**1. Augmented matrix:**

[1 2 -1 | 3]

[2 -1 3 | -1]

[3 1 1 | 5]

**2. Row operations:**

- R2 = R2 – 2R1
- R3 = R3 – 3R1
- R3 = R3 – R2

[1 2 -1 | 3]

[0 -5 5 | -7]

[0 0 -1 | 3]

**3. Back-substitution:**

- Row 3: -z = 3 => z = -3
- Row 2: -5y + 5z = -7 => -5y – 15 = -7 => y = -8/5
- Row 1: x + 2y – z = 3 => x – 16/5 + 3 = 3 => x = 16/5

**Solution:** x = 16/5, y = -8/5, z = -3, which satisfies all three original equations.

**Example 2: Finding Eigenvalues and Eigenvectors**

**Problem:** Find the eigenvalues and eigenvectors of the matrix:

A = [2 1] [1 2]

**Solution:**

**Characteristic equation:**det(A – λI) = 0

=> det([2 – λ 1] [1 2 – λ]) = 0

=> (2 – λ)² – 1 = 0

=> λ² – 4λ + 3 = 0

=> (λ – 1)(λ – 3) = 0

**Eigenvalues:** λ₁ = 1, λ₂ = 3

**Eigenvectors:**

- For λ₁ = 1: (A – λ₁I)x = 0 => [1 1] [1 1] x = 0 => x₁ + x₂ = 0 => x₂ = -x₁ => Eigenvector: **v₁** = [1 -1]ᵀ
- For λ₂ = 3: (A – λ₂I)x = 0 => [-1 1] [1 -1] x = 0 => -x₁ + x₂ = 0 => x₂ = x₁ => Eigenvector: **v₂** = [1 1]ᵀ

**Example 3: Diagonalization**

**Problem:** Diagonalize the matrix A from Example 2.

**Solution:**

Since we have found the eigenvalues and eigenvectors, we can directly form the diagonal matrix D and the invertible matrix P.

- D = [1 0] [0 3]
- P = [1 1] [-1 1]

**Diagonalization:** A = PDP⁻¹
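One can confirm the factorization by multiplying the three matrices back together (plain Python, using the standard 2×2 inverse formula):

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P = [[1, 1], [-1, 1]]
D = [[1, 0], [0, 3]]
# Inverse of P from the 2x2 formula: (1/det) * [[d, -b], [-c, a]]
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]  # = 2
P_inv = [[P[1][1] / det, -P[0][1] / det],
         [-P[1][0] / det, P[0][0] / det]]

print(mat_mul(mat_mul(P, D), P_inv))  # [[2.0, 1.0], [1.0, 2.0]], i.e. A
```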

**Example 4: Orthogonal Matrices**

**Problem:** Determine if the following matrix is orthogonal:

A = [1/√2 -1/√2] [1/√2 1/√2]

**Solution:**

A matrix A is orthogonal if AᵀA = I.

Aᵀ = [1/√2 1/√2] [-1/√2 1/√2]

AᵀA = [1 0] [0 1] = I

Since AᵀA = I, A is an orthogonal matrix.
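The same check can be carried out numerically (a small plain-Python script; entries are rounded to absorb floating-point error):

```python
import math

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

s = 1 / math.sqrt(2)
A = [[s, -s], [s, s]]           # the rotation matrix from the example
product = mat_mul(transpose(A), A)
# Entries should match the identity up to floating-point error.
print([[round(x, 10) for x in row] for row in product])  # [[1.0, 0.0], [0.0, 1.0]]
```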

**Example 5: Least Squares Approximation**

**Problem:** Find the least squares line of best fit for the points (1, 2), (2, 3), and (3, 4).

**Solution:**

**Form the matrix A and the vector b:**

A = [1 1] [1 2] [1 3], b = [2] [3] [4]

**Solve the normal equations:** AᵀAx = Aᵀb

=> [3 6] [6 14] x = [9] [20]

Solving this system, we get x₁ = 1 and x₂ = 1.

**Least squares line:** y = x₁ + x₂x = 1 + x
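The normal-equation computation for these three points can be reproduced in a few lines of plain Python (an illustrative check):

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# Data points (1, 2), (2, 3), (3, 4): fit y = x1 + x2 * t
A = [[1, 1], [1, 2], [1, 3]]
b = [[2], [3], [4]]
AtA = mat_mul(transpose(A), A)  # [[3, 6], [6, 14]]
Atb = mat_mul(transpose(A), b)  # [[9], [20]]

# Solve the 2x2 normal equations AtA x = Atb by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x1 = (Atb[0][0] * AtA[1][1] - AtA[0][1] * Atb[1][0]) / det
x2 = (AtA[0][0] * Atb[1][0] - Atb[0][0] * AtA[1][0]) / det
print(x1, x2)  # 1.0 1.0, i.e. the line y = 1 + x
```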

## FAQs

**How many types of linear algebra are there?** Linear algebra is commonly studied at three levels: elementary linear algebra, advanced (abstract) linear algebra, and computational (numerical) linear algebra.

**What is MAT 242?**

MAT 242 is a common university course code for an introductory linear algebra course; it covers matrices, systems of linear equations, determinants, vector spaces, linear transformations, and eigenvalues.

**Who is the father of linear algebra?** Carl Friedrich Gauss is often considered the father of linear algebra. His work on solving systems of linear equations and introducing the concept of matrices laid the foundation for this branch of mathematics.

**What are the 4 spaces in linear algebra?** The four fundamental spaces in linear algebra are:

Column space (Col A): The span of the columns of a matrix A.

Row space (Row A): The span of the rows of a matrix A.

Null space (Nul A): The set of all solutions to the homogeneous equation Ax = 0.

Left null space (Nul Aᵀ): The set of all solutions to the homogeneous equation Aᵀx = 0.



This was all about the “**Elementary Linear Algebra**”. For more such informative blogs, check out our Study Material Section, or you can learn more about us by visiting our Indian exams page.