
📘 Chapter 11: Linear Algebra with NumPy — Powering Data Science and Machine Learning

🧠 “In the heart of every machine learning model lies a matrix. And NumPy makes that matrix manageable.”

Welcome to Chapter 11 of your NumPy learning journey!

So far, you’ve created arrays, then filtered, reshaped, and sorted them, and simulated randomness. Now we’re diving into the brain of scientific computing: Linear Algebra, the foundation of:

  • Machine learning algorithms

  • Physics simulations

  • Engineering calculations

  • Data transformations

  • Neural networks

Luckily, NumPy provides a powerful linear algebra module called np.linalg, which helps you work with matrices, equations, and decompositions.


🎯 What You’ll Learn:

  • Matrix multiplication using dot() and matmul()

  • Finding the inverse, transpose, and determinant

  • Calculating eigenvalues and eigenvectors

  • Solving systems of linear equations

Let’s turn numbers into vectors and vectors into insight. 🔍


🧱 1. Matrix Multiplication with dot() and matmul()

Let’s start with the basics.

🔹 Create Two Matrices:

import numpy as np

A = np.array([[1, 2],
              [3, 4]])

B = np.array([[5, 6],
              [7, 8]])

🔸 Using np.dot():

result = np.dot(A, B)
print("Dot Product:\n", result)

Output:

[[19 22]
 [43 50]]

🔸 Using np.matmul() (or the @ operator):

result = np.matmul(A, B)
# or simply:
result = A @ B
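
For 2-D arrays like these, dot(), matmul(), and the @ operator all compute the same matrix product. A quick check, reusing the A and B defined above:

print(np.array_equal(np.dot(A, B), np.matmul(A, B)))  # True
print(np.array_equal(np.matmul(A, B), A @ B))         # True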

🔄 2. Matrix Transpose

To swap rows and columns:

print("Transpose of A:\n", A.T)


Output:

[[1 3]
 [2 4]]

Useful in:

  • Gram matrices

  • Inner product checks

  • Data alignment
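
For example, a Gram matrix is simply a matrix's transpose multiplied by the matrix itself. A minimal sketch, reusing the A defined above:

gram = A.T @ A   # each entry is the dot product of two columns of A
print("Gram matrix:\n", gram)
# [[10 14]
#  [14 20]]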


πŸ” 3. Matrix Inverse with np.linalg.inv()

Only square matrices with a non-zero determinant can be inverted.

inv_A = np.linalg.inv(A)
print("Inverse of A:\n", inv_A)

Output:

[[-2.   1. ]
 [ 1.5 -0.5]]

Check by multiplying:

print(np.dot(A, inv_A))  # Should give (approximately) the identity matrix
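
Because of floating-point rounding, the product is only approximately the identity, so a tolerance-based check is more reliable:

print(np.allclose(A @ inv_A, np.eye(2)))  # True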

πŸ“ 4. Determinant with np.linalg.det()

The determinant tells you if a matrix is invertible.

det_A = np.linalg.det(A)
print("Determinant of A:", det_A)

Output:

-2.0000000000000004

If the determinant is 0 (or numerically indistinguishable from 0), the matrix is singular (non-invertible).
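
For example, a matrix whose rows are linearly dependent has determinant 0 and cannot be inverted. A small illustration (S here is just a made-up singular matrix):

S = np.array([[1, 2],
              [2, 4]])        # second row is 2 x the first row
print(np.linalg.det(S))       # 0.0 -> singular
# np.linalg.inv(S) would raise np.linalg.LinAlgError: Singular matrix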


📈 5. Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are used in:

  • Principal Component Analysis (PCA)

  • Quantum mechanics

  • Structural engineering

  • System stability

eigvals, eigvecs = np.linalg.eig(A)

print("Eigenvalues:", eigvals)
print("Eigenvectors:\n", eigvecs)

You get:

  • eigvals: the scalar values (λ) such that Ax = λx

  • eigvecs: a matrix whose columns are the corresponding eigenvectors, i.e. vectors whose direction is unchanged when transformed by A
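
A quick way to see this is to check the defining equation Ax = λx for the first eigenpair:

x = eigvecs[:, 0]                    # first eigenvector (first column)
lam = eigvals[0]                     # matching eigenvalue
print(np.allclose(A @ x, lam * x))   # True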


🧠 6. Solving Systems of Linear Equations

Let’s say we have the following system:

2x + 3y = 8  
3x + 4y = 11

Convert to matrix form:

AX = B

Where:

A = np.array([[2, 3],
              [3, 4]])

B = np.array([8, 11])


Solve with np.linalg.solve():

solution = np.linalg.solve(A, B)
print("Solution [x, y]:", solution)

Output:

[1. 2.]
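
You can verify the result by substituting it back into the system:

print(np.allclose(A @ solution, B))  # True: 2x + 3y = 8 and 3x + 4y = 11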

📦 Bonus: Pseudo-Inverse with np.linalg.pinv()

Used when the matrix is not square or not invertible.

M = np.array([[1, 2, 3], [4, 5, 6]])
pseudo_inv = np.linalg.pinv(M)
print("Pseudo-Inverse:\n", pseudo_inv)

Useful in:

  • Linear regression

  • Over-determined systems (more equations than unknowns)
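
For instance, an over-determined system usually has no exact solution, but pinv() gives the least-squares one. A minimal sketch, where the 3x2 matrix C and the right-hand side b are made up for illustration:

C = np.array([[1, 1],
              [1, 2],
              [1, 3]])           # 3 equations, 2 unknowns
b = np.array([1, 2, 2])

x_ls = np.linalg.pinv(C) @ b     # least-squares solution of Cx = b
print("Least-squares solution:", x_ls)

In practice, np.linalg.lstsq() solves the same problem and is usually the first choice.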


πŸ” Real-Life Applications of Linear Algebra in NumPy

Task | Function
Multiply weight matrices in neural nets | @, dot(), matmul()
Normalize or project vectors | norm(), dot()
Solve equation systems | solve()
Perform PCA | eig(), cov()
Simulate physical forces | matmul()
3D transformations | dot() with rotation matrices
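
As one example from the table, a 2-D rotation is just a matrix product with a rotation matrix. A small sketch rotating the point (1, 0) by 90 degrees:

theta = np.pi / 2                              # 90-degree rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
point = np.array([1.0, 0.0])
print(R @ point)                               # approximately [0. 1.]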

⚠️ Common Mistakes to Avoid

Mistake | Correction
Trying to invert a non-square matrix | Only square matrices are invertible
Using inv() to solve equations | Prefer solve(): it's faster and more stable
Ignoring transpose in multiplication | Use .T when aligning matrix shapes
Confusing dot() with element-wise product | Use * for element-wise, dot() for matrix multiplication
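
The last mistake is worth seeing side by side (X and Y here are just small example matrices):

X = np.array([[1, 2],
              [3, 4]])
Y = np.array([[5, 6],
              [7, 8]])

print(X * Y)   # element-wise product: [[ 5 12], [21 32]]
print(X @ Y)   # matrix product:       [[19 22], [43 50]]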

🧾 Summary Table: Linear Algebra Essentials

Function | Description
dot(), matmul(), @ | Matrix multiplication
.T or transpose() | Transpose of a matrix
inv() | Matrix inverse
det() | Matrix determinant
eig() | Eigenvalues and eigenvectors
solve() | Solve AX = B
pinv() | Pseudo-inverse

🔚 Wrapping Up Chapter 11

Bravo! 🥳 You’ve entered the world of linear algebra, where numbers gain structure and logic becomes geometry.

With these tools, you can:

  • Build the mathematical backbone of ML models

  • Analyze real-world phenomena

  • Solve complex systems

  • Transform and compress data

From vector spaces to PCA, linear algebra in NumPy is your ticket to professional-level data science.


🧭 What’s Next?

Now that you’ve completed the NumPy Foundations Series, your next adventures might include:

  • Pandas for data manipulation

  • Matplotlib / Seaborn for visualization

  • SciPy for scientific computing

  • scikit-learn for machine learning
