Unit 1: Vector Algebra and Matrices

Vector Products

Scalar (Dot) Product

The scalar product of two vectors A and B is a scalar quantity defined as:

A · B = |A| |B| cos(θ)

where |A| and |B| are the magnitudes of the vectors and θ is the angle between them.
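This definition can be checked numerically. A minimal sketch in pure Python (vectors as plain lists; the helper names `dot` and `magnitude` are illustrative, not from any particular library):

```python
import math

def dot(a, b):
    """Scalar (dot) product of two vectors given as lists."""
    return sum(x * y for x, y in zip(a, b))

def magnitude(a):
    """|a| = sqrt(a . a)."""
    return math.sqrt(dot(a, a))

A = [3.0, 0.0, 0.0]
B = [1.0, 1.0, 0.0]

# Rearranging A . B = |A||B| cos(theta) gives the angle between A and B:
cos_theta = dot(A, B) / (magnitude(A) * magnitude(B))
print(math.degrees(math.acos(cos_theta)))  # ~45 degrees
```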

Vector (Cross) Product

The vector product of two vectors A and B is a vector quantity C defined as:

A × B = |A| |B| sin(θ) n̂

where n̂ is a unit vector perpendicular to the plane containing A and B, given by the Right-Hand Rule.
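In Cartesian components the cross product has the familiar determinant expansion, which can be sketched directly in Python (the helper name `cross` is illustrative):

```python
def cross(a, b):
    """Vector (cross) product of two 3-vectors, right-hand-rule orientation."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

# x-hat x y-hat = z-hat, consistent with the right-hand rule:
print(cross([1, 0, 0], [0, 1, 0]))  # [0, 0, 1]
```

Note that reversing the operands flips the sign, reflecting the anticommutativity A × B = −(B × A).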


Scalar and Vector Triple Products

Scalar Triple Product (Box Product)

It is defined as the dot product of one vector with the cross product of the other two: A · (B × C).

Vector Triple Product

It is defined as the cross product of one vector with the cross product of the other two: A × (B × C).
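Both triple products can be computed from the `dot` and `cross` helpers above (repeated here so the sketch is self-contained), and the result also verifies the "BAC-CAB" expansion A × (B × C) = B(A · C) − C(A · B):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

A, B, C = [1, 2, 3], [4, 5, 6], [7, 8, 10]

box = dot(A, cross(B, C))    # scalar triple product: a number
vtp = cross(A, cross(B, C))  # vector triple product: a vector

# BAC-CAB rule: A x (B x C) = B(A.C) - C(A.B)
bac_cab = [dot(A, C) * b - dot(A, B) * c for b, c in zip(B, C)]
print(box, vtp, bac_cab)  # -3 [-12, 9, -2] [-12, 9, -2]
```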

Common Mistake: Confusing the scalar and vector triple products. Remember: the scalar triple product A · (B × C) is a scalar, while the vector triple product A × (B × C) is a vector, expanded by the "BAC-CAB" rule: A × (B × C) = B(A · C) − C(A · B).

Properties and Applications of Vectors

The table below summarizes common physical applications of these products.

Concept                 Formula             Physical Application
Scalar Product          W = F · d           Work done by a force
Scalar Product          P = F · v           Instantaneous power
Scalar Product          ΦB = B · A          Magnetic flux through an area A
Vector Product          τ = r × F           Torque
Vector Product          L = r × p           Angular momentum
Vector Product          Fm = q(v × B)       Lorentz force on a charge in a B-field
Scalar Triple Product   V = |A · (B × C)|   Volume of a parallelepiped
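Two of these applications, worked numerically (illustrative values; `dot` and `cross` are the same helpers defined earlier):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

# Work W = F . d
F = [10.0, 0.0, 0.0]   # force in N
d = [2.0, 2.0, 0.0]    # displacement in m
W = dot(F, d)          # only the component of d along F contributes

# Torque tau = r x F
r = [0.0, 0.5, 0.0]    # lever arm in m
tau = cross(r, F)      # perpendicular to both r and F

print(W, tau)  # 20.0 [0.0, 0.0, -5.0]
```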

Scalar and Vector Fields

A field is a physical quantity that has a value for each point in space and time.


Matrices: Types and Properties

Different Types of Matrices

Symmetric and Antisymmetric Matrices

Hermitian and Anti-Hermitian Matrices

These are the complex-number equivalents of symmetric and antisymmetric matrices.

Key Property: Any square matrix A can be written as the sum of a symmetric (S) and an antisymmetric (K) matrix: A = S + K
where S = (1/2)(A + Aᵀ) and K = (1/2)(A - Aᵀ).
Similarly, any square matrix A can be written as the sum of a Hermitian (H) and an anti-Hermitian (N) matrix: A = H + N
where H = (1/2)(A + A†) and N = (1/2)(A - A†).
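The symmetric/antisymmetric decomposition is easy to verify numerically. A sketch in pure Python (matrices as nested lists; `decompose` is an illustrative helper name):

```python
def transpose(M):
    """Rows become columns: (M^T)[i][j] = M[j][i]."""
    return [list(row) for row in zip(*M)]

def decompose(A):
    """Split a square matrix into S = (A + A^T)/2 and K = (A - A^T)/2."""
    At = transpose(A)
    n = len(A)
    S = [[(A[i][j] + At[i][j]) / 2 for j in range(n)] for i in range(n)]
    K = [[(A[i][j] - At[i][j]) / 2 for j in range(n)] for i in range(n)]
    return S, K

A = [[1.0, 2.0],
     [4.0, 3.0]]
S, K = decompose(A)
print(S)  # [[1.0, 3.0], [3.0, 3.0]]  -- symmetric: S == S^T
print(K)  # [[0.0, -1.0], [1.0, 0.0]] -- antisymmetric: K == -K^T
```

By construction S + K reproduces A exactly; the Hermitian case is identical with the conjugate transpose A† in place of Aᵀ.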

Matrix Operations: Inverse and Transpose

Transpose of a Matrix (Aᵀ)

As defined earlier, (Aᵀ)ᵢⱼ = Aⱼᵢ.

Inverse of a Matrix (A⁻¹)

The inverse of a square matrix A is a matrix A⁻¹ such that A A⁻¹ = A⁻¹ A = I (the identity matrix).
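For a 2×2 matrix the inverse has a closed form, which makes the defining property A A⁻¹ = I easy to check (helper names `inv2` and `matmul` are illustrative):

```python
def inv2(A):
    """Inverse of a 2x2 matrix; requires det(A) != 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular, no inverse exists")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

def matmul(A, B):
    """Matrix product of two conformable matrices as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[4.0, 7.0],
     [2.0, 6.0]]
print(matmul(A, inv2(A)))  # ~[[1.0, 0.0], [0.0, 1.0]], the identity
```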


Solution of Simultaneous Linear Equations

A system of linear equations can be written in matrix form as AX = B.

    [ a₁₁ a₁₂ ... a₁ₙ ] [ x₁ ]   [ b₁ ]
    [ a₂₁ a₂₂ ... a₂ₙ ] [ x₂ ] = [ b₂ ]
    [ ... ... ... ... ] [ .. ]   [ .. ]
    [ aₘ₁ aₘ₂ ... aₘₙ ] [ xₙ ]   [ bₘ ]
        

Homogeneous Equations (AX = 0)

This is when B is a null vector (all bᵢ = 0).

Non-Homogeneous Equations (AX = B)

This is when B is not a null vector.

Methods of Solution:

  1. Matrix Inverse Method:
    • If A is square and non-singular (det(A) ≠ 0), a unique solution exists.
    • Multiply by A⁻¹: A⁻¹(AX) = A⁻¹B → (A⁻¹A)X = A⁻¹B → IX = A⁻¹B
    • Solution: X = A⁻¹B
  2. Gauss-Jordan Elimination (Row Reduction):
    • Form the augmented matrix [A | B].
    • Use elementary row operations (swapping rows, multiplying a row by a scalar, adding a multiple of one row to another) to transform A into the identity matrix I.
    • The augmented matrix will become [I | X], and the right-hand column will be the solution vector X.
  3. Cramer's Rule:
    • Used when det(A) ≠ 0.
    • The solution for each variable xᵢ is given by: xᵢ = det(Aᵢ) / det(A)
    • where Aᵢ is the matrix A with its i-th column replaced by the vector B.
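Cramer's rule is the easiest of the three methods to sketch for a 2×2 system (helper names `det2` and `cramer2` are illustrative):

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer2(A, B):
    """Solve the 2x2 system AX = B by Cramer's rule; requires det(A) != 0."""
    d = det2(A)
    if d == 0:
        raise ValueError("det(A) = 0: no unique solution")
    A1 = [[B[0], A[0][1]], [B[1], A[1][1]]]  # column 1 replaced by B
    A2 = [[A[0][0], B[0]], [A[1][0], B[1]]]  # column 2 replaced by B
    return [det2(A1) / d, det2(A2) / d]

# 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
print(cramer2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

The matrix-inverse method gives the same answer via X = A⁻¹B; Cramer's rule simply avoids forming the inverse explicitly.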

Eigenvalues and Eigenvectors of a Matrix

Definition

For a given square matrix A, a non-zero vector X is an eigenvector of A if it satisfies the following equation for some scalar λ:

AX = λX

The scalar λ is called the eigenvalue corresponding to the eigenvector X. This means that when matrix A "acts" on its eigenvector X, it only scales it by a factor λ, without changing its direction.

How to Find Eigenvalues and Eigenvectors

  1. Step 1: Find the Eigenvalues (λ)
    • Rearrange the equation: AX - λX = 0 → AX - λIX = 0
    • (A - λI)X = 0
    • This is a homogeneous system of linear equations. For a non-trivial solution (X ≠ 0) to exist, the determinant of the coefficient matrix must be zero.
    • The Characteristic Equation: det(A - λI) = 0
    • Solving this equation (which will be a polynomial in λ) gives the eigenvalues λ₁, λ₂, ...
  2. Step 2: Find the Eigenvectors (X)
    • For each eigenvalue λi found in Step 1, substitute it back into the equation: (A - λiI)X = 0
    • Solve this homogeneous system for the vector X = [x₁, x₂, ...]. This will typically involve using Gaussian elimination.
    • The solution will have at least one free variable, leading to an infinite number of solutions (all scalar multiples of the base eigenvector). We usually state one normalized eigenvector.
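For a 2×2 matrix the characteristic equation det(A − λI) = 0 reduces to λ² − tr(A)λ + det(A) = 0, so both steps can be sketched directly (the helper name `eig2` is illustrative; this sketch assumes real eigenvalues):

```python
import math

def eig2(A):
    """Eigenvalues of a real 2x2 matrix from its characteristic equation:
    det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A) = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes a real spectrum
    return (tr + disc) / 2, (tr - disc) / 2

A = [[4.0, 1.0],
     [2.0, 3.0]]
l1, l2 = eig2(A)
print(l1, l2)  # 5.0 2.0

# Step 2 for lambda = 5: (A - 5I)X = 0 gives rows [-1, 1] and [2, -2],
# so X is proportional to [1, 1]; check that AX = 5X:
X = [1.0, 1.0]
AX = [A[0][0] * X[0] + A[0][1] * X[1],
      A[1][0] * X[0] + A[1][1] * X[1]]
print(AX)  # [5.0, 5.0], i.e. 5 * X
```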
Properties of Eigenvalues: