🧠 "In the heart of every machine learning model lies a matrix. And NumPy makes that matrix manageable."
Welcome to Chapter 11 of your NumPy learning journey!
So far, you've worked with arrays, filtered, reshaped, sorted, and simulated randomness. Now we're diving into the brain of scientific computing: Linear Algebra, the foundation for:
- Machine learning algorithms
- Physics simulations
- Engineering calculations
- Data transformations
- Neural networks
Luckily, NumPy provides a powerful linear algebra module, np.linalg, which helps you work with matrices, equations, and decompositions.
🎯 What You'll Learn:
- Matrix multiplication using dot() and matmul()
- Finding the inverse, transpose, and determinant
- Calculating eigenvalues and eigenvectors
- Solving systems of linear equations
Let's turn numbers into vectors and vectors into insight. 🚀
🧱 1. Matrix Multiplication with dot() and matmul()
Letโs start with the basics.
🔹 Create two matrices, then multiply them.
🔸 Using np.dot():
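A minimal sketch (the matrix values here are illustrative; any shape-compatible matrices work):

```python
import numpy as np

# Two small 2x2 matrices (illustrative values)
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Matrix product: each entry is a row-of-A dot column-of-B
result = np.dot(A, B)
print(result)
# [[19 22]
#  [43 50]]
```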
🔸 Using np.matmul() (or the @ operator):
result = np.matmul(A, B)  # or simply: result = A @ B
🔄 2. Matrix Transpose
To swap rows and columns, use .T:
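For example (values chosen just to show the shape change):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # shape (2, 3)

print(A.T)                  # shape (3, 2): rows become columns
# [[1 4]
#  [2 5]
#  [3 6]]
```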
Useful in:
- Gram matrices
- Inner product checks
- Data alignment
🔁 3. Matrix Inverse with np.linalg.inv()
Only square matrices with a non-zero determinant can be inverted.
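A small sketch with an invertible 2x2 matrix (values are illustrative):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])   # det = -2, so the inverse exists

inv_A = np.linalg.inv(A)
print(inv_A)
# [[-2.   1. ]
#  [ 1.5 -0.5]]   (up to floating-point rounding)
```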
Check by multiplying:
print(np.dot(A, inv_A)) # Should return identity matrix
📏 4. Determinant with np.linalg.det()
The determinant tells you whether a matrix is invertible: if det == 0, the matrix is singular (non-invertible).
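A quick sketch contrasting an invertible matrix with a singular one (both matrices are illustrative):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
print(np.linalg.det(A))        # ≈ -2.0 (floating-point result)

singular = np.array([[1, 2],
                     [2, 4]])  # second row is 2x the first
print(np.linalg.det(singular)) # ≈ 0.0 — singular, cannot be inverted
```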
📈 5. Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors are used in:
- Principal Component Analysis (PCA)
- Quantum mechanics
- Structural engineering
- System stability
You get:
- eigvals: scalar values λ such that Ax = λx
- eigvecs: vectors that keep their direction (are only scaled) when transformed by A
🧮 6. Solving Systems of Linear Equations
Suppose we have a system of linear equations. Convert it to matrix form AX = B, where A holds the coefficients, X the unknowns, and B the constants. Then solve with np.linalg.solve():
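For instance, take the (illustrative) system 2x + y = 5 and x + 3y = 10:

```python
import numpy as np

# Coefficient matrix and constants vector for:
#   2x +  y = 5
#    x + 3y = 10
A = np.array([[2., 1.],
              [1., 3.]])
B = np.array([5., 10.])

X = np.linalg.solve(A, B)
print(X)   # [1. 3.]  ->  x = 1, y = 3
```

Plugging back in confirms it: 2(1) + 3 = 5 and 1 + 3(3) = 10.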
📦 Bonus: Pseudo-Inverse with np.linalg.pinv()
Used when the matrix is not square or not invertible.
Useful in:
- Linear regression
- Over-determined systems (more equations than unknowns)
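A sketch of the over-determined case (the data points are illustrative; pinv gives the least-squares solution):

```python
import numpy as np

# Over-determined system: 3 equations, 2 unknowns (A is not square)
# Columns: [intercept term, x value] for the points (1,1), (2,2), (3,3)
A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 2., 3.])

# Least-squares fit via the Moore-Penrose pseudo-inverse
coeffs = np.linalg.pinv(A) @ b
print(coeffs)   # ≈ [0. 1.] -> intercept 0, slope 1 (the points lie on y = x)
```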
🌍 Real-Life Applications of Linear Algebra in NumPy

| Task | Function |
|---|---|
| Multiply weight matrices in neural nets | @, dot(), matmul() |
| Normalize or project vectors | norm(), dot() |
| Solve equation systems | solve() |
| Perform PCA | eig(), cov() |
| Simulate physical forces | matmul() |
| 3D transformations | dot() with rotation matrices |
⚠️ Common Mistakes to Avoid

| Mistake | Correction |
|---|---|
| Trying to invert a non-square matrix | Only square matrices with non-zero determinant are invertible |
| Using inv() to solve equations | Prefer solve(); it's faster and more numerically stable |
| Ignoring transpose in multiplication | Use .T to align matrix shapes |
| Confusing dot() with element-wise product | Use * for element-wise, dot() for matrix multiplication |
🧾 Summary Table: Linear Algebra Essentials

| Function | Description |
|---|---|
| dot(), matmul(), @ | Matrix multiplication |
| .T or transpose() | Transpose of a matrix |
| inv() | Matrix inverse |
| det() | Matrix determinant |
| eig() | Eigenvalues and eigenvectors |
| solve() | Solve AX = B |
| pinv() | Pseudo-inverse |
🏁 Wrapping Up Chapter 11
Bravo! 🥳 You've entered the world of linear algebra, where numbers gain structure and logic becomes geometry.
With these tools, you can:
- Build the mathematical backbone of ML models
- Analyze real-world phenomena
- Solve complex systems
- Transform and compress data
From vector spaces to PCA, linear algebra in NumPy is your ticket to professional-level data science.
🧭 What's Next?
Now that you've completed the NumPy Foundations Series, your next adventures might include:
- Pandas for data manipulation
- Matplotlib / Seaborn for visualization
- SciPy for scientific computing
- scikit-learn for machine learning