
Linear Algebra: Vectors, Matrices, and Transforms


Linear algebra is the most useful branch of mathematics you will ever learn. That is not an exaggeration. Every time an engineer analyzes a circuit, a roboticist programs a joint, a graphics programmer renders a scene, or a data scientist trains a model, they are using linear algebra. It is the common language of applied mathematics, and once you speak it, entire fields open up. #LinearAlgebra #Engineering #AppliedMath

Vectors: Arrows with Meaning

A vector is a quantity that has both magnitude and direction. Your phone’s accelerometer, for example, outputs a 3D vector: it measures gravity (always pointing down) plus the phone’s own acceleration. Subtracting the gravity vector from the reading gives you just the phone’s motion, which is how your phone knows the difference between “tilted” and “shaken.”

Forces, velocities, electric fields, and displacements are all vectors. Think of a vector in 2D as an instruction: “go 3 steps east and 4 steps north.” We write this as:

$$\mathbf{v} = (3, 4)$$

The magnitude (length) is $|\mathbf{v}| = \sqrt{3^2 + 4^2} = 5$, and the direction is $\theta = \arctan(4/3) \approx 53.1^\circ$ from east.

Vector Addition: Combining Effects

When two forces act on an object, the result is their vector sum. If a wind pushes a boat east at 3 m/s and a current pushes it north at 4 m/s, the boat moves northeast at 5 m/s.

```
4 +           .
  |          /|
  |         / |
  |        /  |
  |   a+b /   |  b = (0, 4)
  |      /    |
  |     /     |
0 +-----------+----
  0  a = (3, 0)  3

a + b = (3, 4), magnitude = 5
```

Algebraically, you just add the components:

$$\mathbf{a} + \mathbf{b} = (a_x + b_x,\ a_y + b_y) = (3 + 0,\ 0 + 4) = (3, 4)$$

This is why vectors are so powerful: complex combinations of forces, velocities, or fields reduce to simple component-wise arithmetic.
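The boat example above can be checked with a few lines of plain Python (a minimal sketch; the tuples hold the east and north components):

```python
import math

# Wind pushes the boat east at 3 m/s, current pushes it north at 4 m/s.
a = (3, 0)   # wind contribution (east)
b = (0, 4)   # current contribution (north)

# Component-wise addition gives the resultant velocity.
resultant = (a[0] + b[0], a[1] + b[1])
speed = math.hypot(*resultant)   # Pythagorean magnitude

print(resultant)  # (3, 4)
print(speed)      # 5.0
```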

The Dot Product: How Aligned Are Two Vectors?

The dot product of two vectors $\mathbf{a}$ and $\mathbf{b}$ is:

$$\mathbf{a} \cdot \mathbf{b} = a_x b_x + a_y b_y = |\mathbf{a}||\mathbf{b}|\cos\theta$$

The dot product measures how much two vectors point in the same direction.

  • If $\theta = 0^\circ$ (parallel), $\cos\theta = 1$: maximum dot product.
  • If $\theta = 90^\circ$ (perpendicular), $\cos\theta = 0$: zero dot product.
  • If $\theta = 180^\circ$ (opposite), $\cos\theta = -1$: negative dot product.

Engineering use: Work. When you push a box with force $\mathbf{F}$ through displacement $\mathbf{d}$, the work done is $W = \mathbf{F} \cdot \mathbf{d}$. Only the component of force along the direction of motion does work. Push at an angle, and you waste effort.

Engineering use: Projection. The projection of vector $\mathbf{a}$ onto vector $\mathbf{b}$ is:

$$\mathrm{proj}_{\mathbf{b}}\,\mathbf{a} = \frac{\mathbf{a} \cdot \mathbf{b}}{|\mathbf{b}|^2}\,\mathbf{b}$$

This is how you decompose a force into components along and perpendicular to a surface, which is the first step in most structural analysis problems.
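As a sketch of that decomposition, here is the projection formula applied to a hypothetical force and ramp direction (the numbers are made up for illustration):

```python
import numpy as np

F = np.array([10.0, -5.0])   # applied force (hypothetical)
d = np.array([1.0, 1.0])     # direction along the ramp (need not be unit length)

# Projection of F onto d: (F . d / |d|^2) * d
F_along = (F @ d) / (d @ d) * d
F_perp = F - F_along          # the remainder is perpendicular to d

print(F_along)               # [2.5 2.5]  component along the ramp
print(F_perp)                # [ 7.5 -7.5]  component normal to the ramp
print(F_along @ F_perp)      # 0.0: the two pieces are orthogonal
```

Subtracting the projection from the original vector is exactly the “decompose into along and perpendicular” step described above.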

The Cross Product: Perpendicular Results

The cross product of two 3D vectors gives a new vector perpendicular to both inputs:

$$\mathbf{a} \times \mathbf{b} = (a_y b_z - a_z b_y,\ a_z b_x - a_x b_z,\ a_x b_y - a_y b_x)$$

The magnitude is $|\mathbf{a} \times \mathbf{b}| = |\mathbf{a}||\mathbf{b}|\sin\theta$, which equals the area of the parallelogram formed by the two vectors.

Engineering use: Torque. Torque is the cross product of the position vector and the force vector: $\boldsymbol{\tau} = \mathbf{r} \times \mathbf{F}$. The torque is perpendicular to both the lever arm and the force, which is why a wrench turns a bolt in the direction it does.

Engineering use: Surface normals. In 3D graphics and finite element analysis, you find the normal to a surface by taking the cross product of two vectors that lie in that surface.
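Both uses can be sketched with NumPy’s `np.cross` (the wrench length, force, and triangle vertices below are illustrative assumptions):

```python
import numpy as np

# Torque: tau = r x F. A hypothetical 0.3 m wrench along +x,
# pushed with 50 N in the +y direction.
r = np.array([0.3, 0.0, 0.0])
F = np.array([0.0, 50.0, 0.0])
tau = np.cross(r, F)
print(tau)  # [ 0.  0. 15.]: 15 N·m about +z, counterclockwise

# Surface normal: cross two edge vectors lying in a triangle's plane.
p0, p1, p2 = np.array([0, 0, 0]), np.array([1, 0, 0]), np.array([0, 1, 0])
n = np.cross(p1 - p0, p2 - p0)
n = n / np.linalg.norm(n)   # normalize to a unit normal
print(n)  # [0. 0. 1.]
```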

Matrices: Machines That Transform Vectors



A matrix is not just a grid of numbers. A matrix is a machine that takes a vector as input and produces a new vector as output. When you multiply a matrix by a vector, you are applying a transformation: rotation, scaling, reflection, shearing, or projection.

The matrix $A$ transforms the input vector $\mathbf{x}$ into the output vector $\mathbf{y} = A\mathbf{x}$.

What Matrices Do

A diagonal matrix scales each component independently:

$$\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 2x \\ 3y \end{pmatrix}$$

This stretches the x-component by 2 and the y-component by 3. In engineering, this is how you apply different gains to different channels.
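A two-line NumPy sketch of that per-channel gain idea (`np.diag` builds the diagonal matrix):

```python
import numpy as np

# A diagonal matrix applies an independent gain to each channel.
S = np.diag([2.0, 3.0])      # stretch x by 2, y by 3
v = np.array([1.0, 1.0])
print(S @ v)                 # [2. 3.]
```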

Worked Example: 2D Rotation

Rotate the point $(3, 1)$ by 45 degrees counterclockwise:

$$R = \begin{pmatrix} \cos 45^\circ & -\sin 45^\circ \\ \sin 45^\circ & \cos 45^\circ \end{pmatrix}, \qquad R \begin{pmatrix} 3 \\ 1 \end{pmatrix} = \frac{1}{\sqrt{2}} \begin{pmatrix} 3 - 1 \\ 3 + 1 \end{pmatrix} = \begin{pmatrix} \sqrt{2} \\ 2\sqrt{2} \end{pmatrix} \approx \begin{pmatrix} 1.414 \\ 2.828 \end{pmatrix}$$

The point has been rotated. Its distance from the origin is preserved ($\sqrt{3^2 + 1^2} = \sqrt{(\sqrt{2})^2 + (2\sqrt{2})^2} = \sqrt{10}$). Rotation matrices always preserve distances. That is a defining property.

Matrix Multiplication as Function Composition

If matrix $A$ rotates by 30 degrees and matrix $B$ scales by 2, then $BA$ rotates by 30 degrees and then scales by 2. The combined operation is a single matrix, the product $BA$.

Note the order: $BA\mathbf{x}$ means “apply $A$ first, then $B$.” Matrix multiplication is not commutative ($AB \neq BA$ in general), but it is associative. This is why matrix multiplication is defined the way it is. It is function composition.

In robotics, this is how you chain joint transformations. Each joint has a rotation matrix, and the total transformation from base to end-effector is the product of all the individual matrices.
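A short NumPy sketch of that chaining (the `rot` helper below is a hypothetical convenience function, not part of the listing later in this article):

```python
import numpy as np

def rot(deg):
    """2D rotation matrix for a counterclockwise angle in degrees."""
    t = np.radians(deg)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

# Two joints rotating 30 and 45 degrees: the total transform is the product.
total = rot(45) @ rot(30)            # apply rot(30) first, then rot(45)
print(np.allclose(total, rot(75)))   # True: rotations compose by adding angles
```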

Systems of Linear Equations: Ax = b



The most common use of linear algebra in engineering is solving systems of linear equations. Consider a circuit with two loops analyzed using Kirchhoff’s voltage law. For concreteness, take a 10 V source and a 3 Ω resistor in loop 1, a 5 Ω resistor in loop 2, and a 2 Ω resistor shared between the loops:

$$10 = 3I_1 + 2(I_1 - I_2)$$

$$0 = 5I_2 + 2(I_2 - I_1)$$

Rearranging:

$$5I_1 - 2I_2 = 10$$

$$-2I_1 + 7I_2 = 0$$

In matrix form:

$$\begin{pmatrix} 5 & -2 \\ -2 & 7 \end{pmatrix} \begin{pmatrix} I_1 \\ I_2 \end{pmatrix} = \begin{pmatrix} 10 \\ 0 \end{pmatrix}$$

This is $A\mathbf{x} = \mathbf{b}$, where $A$ is the coefficient matrix, $\mathbf{x}$ is the unknown vector (currents), and $\mathbf{b}$ is the source vector (voltages). GPS works the same way: your phone receives distances from multiple satellites and solves a system of equations to find your position. Trilateration is just $A\mathbf{x} = \mathbf{b}$ with geometry.

Solving Ax = b

If the matrix $A$ is invertible (its determinant is nonzero), the system has a unique solution:

$$\mathbf{x} = A^{-1}\mathbf{b}$$

For our circuit example:

$$\det(A) = (5)(7) - (-2)(-2) = 35 - 4 = 31$$

The determinant is nonzero, so a unique solution exists.

$$\begin{pmatrix} I_1 \\ I_2 \end{pmatrix} = \frac{1}{31} \begin{pmatrix} 7 & 2 \\ 2 & 5 \end{pmatrix} \begin{pmatrix} 10 \\ 0 \end{pmatrix} = \frac{1}{31} \begin{pmatrix} 70 \\ 20 \end{pmatrix}$$

So $I_1 = 70/31 \approx 2.258$ A and $I_2 = 20/31 \approx 0.645$ A.

When Does Ax = b Have a Solution?

det(A) != 0

Unique solution exists. The matrix is invertible. The system has exactly one answer. This is the usual case in well-posed engineering problems.

det(A) = 0

No unique solution. Either there are infinitely many solutions (the equations are redundant) or no solution (the equations are contradictory). Physically, this means your system is under-constrained or over-constrained.
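A minimal sketch of the singular case, using a deliberately redundant system (the second equation is just twice the first):

```python
import numpy as np

# x + y = 2 and 2x + 2y = 4 carry the same information: det(A) = 0.
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
print(np.linalg.det(A))   # effectively zero: A is singular

try:
    np.linalg.solve(A, np.array([2.0, 4.0]))
except np.linalg.LinAlgError as e:
    print("solve failed:", e)   # NumPy refuses to solve singular systems
```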

Eigenvalues and Eigenvectors



An eigenvector of a matrix $A$ is a special vector $\mathbf{v}$ that, when transformed by $A$, only gets scaled (not rotated):

$$A\mathbf{v} = \lambda\mathbf{v}$$

The scalar $\lambda$ is the eigenvalue. The eigenvector points in a direction that the transformation preserves.

Why Eigenvalues Matter

Google’s PageRank algorithm, which decides the order of your search results, is fundamentally an eigenvalue problem. The “importance” of each webpage is the dominant eigenvector of a matrix representing the link structure of the entire web.

Eigenvalues appear everywhere in engineering because they describe the natural behavior of systems:

Vibration modes. A bridge, a guitar string, and an electronic circuit all have natural frequencies. These are the eigenvalues of the system’s dynamic matrix. Each eigenvalue gives a frequency, and its eigenvector gives the shape of the vibration at that frequency.

Stability analysis. A control system is stable if all eigenvalues of its state matrix have negative real parts. If any eigenvalue has a positive real part, the system is unstable and will blow up. This is the single most important test in control theory.

Principal stress directions. In structural analysis, the eigenvalues of the stress tensor give the principal stresses, and the eigenvectors give the directions they act in. This tells you where a structure is most likely to fail.
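As an illustrative sketch, a hypothetical 2D plane-stress state can be diagonalized with `np.linalg.eigh` (the stress values are made up; `eigh` is the variant of `eig` for symmetric matrices, and the stress tensor is always symmetric):

```python
import numpy as np

# Hypothetical plane-stress state (MPa): sigma_xx = 80, sigma_yy = 20,
# shear tau_xy = 30. The tensor is symmetric by construction.
stress = np.array([[80.0, 30.0],
                   [30.0, 20.0]])

# Eigenvalues = principal stresses; eigenvectors = principal directions.
principal, directions = np.linalg.eigh(stress)
print("Principal stresses (MPa):", principal)
print("Principal directions (columns):\n", directions)
```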

Finding Eigenvalues

Eigenvalues satisfy $\det(A - \lambda I) = 0$. For a 2x2 matrix:

$$A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}, \qquad \det(A - \lambda I) = (4 - \lambda)(3 - \lambda) - (1)(2) = \lambda^2 - 7\lambda + 10 = (\lambda - 5)(\lambda - 2) = 0$$

The eigenvalues are $\lambda_1 = 5$ and $\lambda_2 = 2$.

For $\lambda_1 = 5$: $(A - 5I)\mathbf{v} = 0$ gives $-x + y = 0$, so $\mathbf{v}_1 = (1, 1)$.

For $\lambda_2 = 2$: $(A - 2I)\mathbf{v} = 0$ gives $2x + y = 0$, so $\mathbf{v}_2 = (1, -2)$.

Python: NumPy for Linear Algebra



NumPy makes all of this practical. Here is a complete example that covers the key operations.

```python
import numpy as np

# --- Vectors ---
a = np.array([3, 4])
b = np.array([1, 2])
print("Vector addition:", a + b)
print("Dot product:", np.dot(a, b))
print("Magnitude of a:", np.linalg.norm(a))

# Cross product (3D)
a3 = np.array([1, 0, 0])
b3 = np.array([0, 1, 0])
print("Cross product:", np.cross(a3, b3))

# --- 2D Rotation ---
theta = np.radians(45)
R = np.array([
    [np.cos(theta), -np.sin(theta)],
    [np.sin(theta),  np.cos(theta)]
])
point = np.array([3, 1])
rotated = R @ point
print(f"\n({point[0]}, {point[1]}) rotated 45 degrees: ({rotated[0]:.3f}, {rotated[1]:.3f})")

# --- Solving Ax = b (circuit example) ---
A = np.array([
    [ 5, -2],
    [-2,  7]
])
b_vec = np.array([10, 0])
x = np.linalg.solve(A, b_vec)
print(f"\nCircuit currents: I1 = {x[0]:.3f} A, I2 = {x[1]:.3f} A")
print(f"Determinant: {np.linalg.det(A):.1f}")

# --- Eigenvalues ---
M = np.array([
    [4, 1],
    [2, 3]
])
eigenvalues, eigenvectors = np.linalg.eig(M)
print(f"\nEigenvalues: {eigenvalues}")
print(f"Eigenvectors (columns):\n{eigenvectors}")

# Verify: M @ v = lambda * v for each eigenpair
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    print(f"  M @ v{i+1} = {M @ v}, lambda * v{i+1} = {lam * v}")
```

A Note on np.linalg.solve vs. Inverse

Always use np.linalg.solve(A, b) instead of np.linalg.inv(A) @ b. The solve function is faster, more numerically stable, and works even for large, poorly conditioned systems. Computing the inverse explicitly is almost never necessary in practice.
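A quick sanity check of that advice, reusing the circuit matrix from earlier (on this small, well-conditioned system both routes agree; the gap shows up as wasted work and accumulated rounding error on large or poorly conditioned systems):

```python
import numpy as np

A = np.array([[ 5.0, -2.0],
              [-2.0,  7.0]])
b = np.array([10.0, 0.0])

x_solve = np.linalg.solve(A, b)   # factorizes A directly, no explicit inverse
x_inv = np.linalg.inv(A) @ b      # forms the inverse first, then multiplies

print(np.allclose(x_solve, x_inv))      # True: same answer here
print(np.linalg.norm(A @ x_solve - b))  # residual is essentially zero
```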

Application: Force Balance in a Truss



Consider a simple truss with two members meeting at a pin joint, with an external load applied:

```
        P (pin joint)
       / | \
      /  |  \
     /   |   \
    /    F    \
   /  (down)   \
  /      |      \
 A-------+-------B
  (ground supports)
```

The equilibrium condition at point P gives two equations (horizontal and vertical force balance), with $T_1$ and $T_2$ the member tensions and $\alpha$, $\beta$ the member angles from horizontal:

$$T_1 \cos\alpha + T_2 \cos\beta = 0$$

$$T_1 \sin\alpha + T_2 \sin\beta = F$$

In matrix form:

$$\begin{pmatrix} \cos\alpha & \cos\beta \\ \sin\alpha & \sin\beta \end{pmatrix} \begin{pmatrix} T_1 \\ T_2 \end{pmatrix} = \begin{pmatrix} 0 \\ F \end{pmatrix}$$

This is just $A\mathbf{x} = \mathbf{b}$ again. The same linear algebra that solves circuits also solves structural problems. That universality is the real power of the framework.

```python
import numpy as np

# Truss with members at 60 and 120 degrees from horizontal
alpha = np.radians(60)
beta = np.radians(120)
F = 1000  # Newtons (downward load)

A = np.array([
    [np.cos(alpha), np.cos(beta)],
    [np.sin(alpha), np.sin(beta)]
])
b = np.array([0, F])
T = np.linalg.solve(A, b)
print(f"Member 1 tension: {T[0]:.1f} N")
print(f"Member 2 tension: {T[1]:.1f} N")
```

Summary



Linear algebra gives you a unified framework for problems that might look very different on the surface.

| Concept | What it means | Where it appears |
| --- | --- | --- |
| Vector addition | Combining effects | Forces, velocities, fields |
| Dot product | Alignment, projection | Work, signal correlation |
| Cross product | Perpendicular result | Torque, surface normals |
| Matrix-vector product | Transformation | Rotation, scaling, coordinate change |
| Ax = b | System of equations | Circuit analysis, force balance |
| Determinant | Invertibility test | Does a unique solution exist? |
| Eigenvalues | Natural modes | Vibration, stability, principal stress |

The next lesson on complex numbers and phasors will use vectors in the complex plane, building directly on the vector intuition developed here.


