🧠 "In the heart of every machine learning model lies a matrix. And NumPy makes that matrix manageable."
Welcome to Chapter 11 of your NumPy learning journey!
So far, you've worked with arrays, filtered, reshaped, sorted, and simulated randomness. Now we're diving into the brain of scientific computing: Linear Algebra, the foundation for:
- Machine learning algorithms
- Physics simulations
- Engineering calculations
- Data transformations
- Neural networks
Luckily, NumPy provides a powerful linear algebra module, np.linalg, which helps you work with matrices, equations, and decompositions.
🎯 What You'll Learn:
- Matrix multiplication using dot() and matmul()
- Finding the inverse, transpose, and determinant
- Calculating eigenvalues and eigenvectors
- Solving systems of linear equations
Let's turn numbers into vectors and vectors into insight. 🚀
🧱 1. Matrix Multiplication with dot() and matmul()
Let's start with the basics.
🔹 Create Two Matrices:
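The original example values aren't shown here, so the sketch below assumes two small 2×2 matrices, which the rest of this section reuses:

```python
import numpy as np

# Two small 2x2 matrices (illustrative values)
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])
```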
🔸 Using np.dot():
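With the assumed A and B above:

```python
result = np.dot(A, B)
print(result)
```

Output:

```
[[19 22]
 [43 50]]
```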
🔸 Using np.matmul() (or the @ operator):
result = np.matmul(A, B) # or simply: result = A @ B
🔄 2. Matrix Transpose
To swap rows and columns:
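Using the same assumed A:

```python
print(A.T)        # or np.transpose(A)
```

Output:

```
[[1 3]
 [2 4]]
```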
Useful in:
- Gram matrices
- Inner product checks
- Data alignment
🔁 3. Matrix Inverse with np.linalg.inv()
Only square matrices with a non-zero determinant can be inverted.
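Sticking with the assumed A = [[1, 2], [3, 4]], whose determinant is non-zero:

```python
inv_A = np.linalg.inv(A)
print(inv_A)
```

Output:

```
[[-2.   1. ]
 [ 1.5 -0.5]]
```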
Check by multiplying:
print(np.dot(A, inv_A)) # Should return identity matrix
📏 4. Determinant with np.linalg.det()
The determinant tells you if a matrix is invertible.
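For the assumed A, the exact determinant is 1*4 - 2*3 = -2; expect a tiny floating-point round-off in the printed value:

```python
det_A = np.linalg.det(A)
print(det_A)
```

Output:

```
-2.0000000000000004
```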
If det == 0, the matrix is singular (non-invertible).
📈 5. Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors are used in:
- Principal Component Analysis (PCA)
- Quantum mechanics
- Structural engineering
- System stability
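A minimal sketch with the assumed A from section 1 (the signs of the eigenvectors can vary between LAPACK builds):

```python
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)
print(eigvecs)
```

Output:

```
[-0.37228132  5.37228132]
[[-0.82456484 -0.41597356]
 [ 0.56576746 -0.90937671]]
```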
You get:
- eigvals: scalar values (λ) such that Ax = λx
- eigvecs: vectors that remain in the same direction when transformed by A
🧠 6. Solving Systems of Linear Equations
Let's say we have a system of linear equations. Convert it to matrix form Ax = b, where A is the coefficient matrix, x is the vector of unknowns, and b is the vector of constants.
Solve with np.linalg.solve():
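The original system isn't shown, so here is a small illustrative one: 2x + 3y = 8 and x + 2y = 5, whose solution is x = 1, y = 2:

```python
A = np.array([[2, 3],
              [1, 2]])
b = np.array([8, 5])

x = np.linalg.solve(A, b)
print(x)
```

Output:

```
[1. 2.]
```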
📦 Bonus: Pseudo-Inverse with np.linalg.pinv()
Used when the matrix is not square or not invertible.
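A minimal sketch, assuming a tall 3×2 matrix (more equations than unknowns), where pinv() gives the least-squares solution:

```python
import numpy as np

# Tall (3x2) system: three equations, two unknowns (illustrative values)
M = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 2.0])

# Least-squares solution via the Moore-Penrose pseudo-inverse
coeffs = np.linalg.pinv(M) @ y
print(coeffs)   # approximately [0.667, 0.5]; matches np.linalg.lstsq(M, y, rcond=None)[0]
```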
Useful in:
- Linear regression
- Over-determined systems (more equations than unknowns)
📊 Real-Life Applications of Linear Algebra in NumPy
| Task | Function |
|---|---|
| Multiply weight matrices in neural nets | @, dot(), matmul() |
| Normalize or project vectors | eig(), inv() |
| Solve equation systems | solve() |
| Perform PCA (see the sketch below) | eig(), cov() |
| Simulate physical forces | matmul() |
| 3D transformations | dot() with rotation matrices |
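As a concrete example of the PCA row above, here is a minimal sketch using cov() together with eigh() (the symmetric-matrix variant of eig()), on a small made-up data matrix:

```python
import numpy as np

# Small made-up data set: 5 samples, 2 features
X = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9],
              [1.9, 2.2],
              [3.1, 3.0]])

# 1. Center the data
X_centered = X - X.mean(axis=0)

# 2. Covariance matrix of the features
C = np.cov(X_centered, rowvar=False)

# 3. Eigen-decomposition (eigh suits symmetric matrices like C)
eigvals, eigvecs = np.linalg.eigh(C)

# 4. Sort components by decreasing eigenvalue and project the data
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
X_pca = X_centered @ components

print(eigvals[order])   # variance explained by each principal component
print(X_pca[:2])        # first two samples in the new coordinate system
```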
⚠️ Common Mistakes to Avoid
| Mistake | Correction |
|---|---|
| Trying to invert a non-square matrix | Only square matrices are invertible |
| Using inv() to solve equations | Prefer solve(); it's faster and more stable |
| Ignoring transpose in multiplication | Use .T when aligning matrix shapes |
| Confusing dot() with the element-wise product | Use * for element-wise, dot() for matrix multiplication (example below) |
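To make the last row concrete, a quick comparison using the same assumed 2×2 matrices from section 1:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A * B)    # element-wise product: [[ 5 12], [21 32]]
print(A @ B)    # matrix product:       [[19 22], [43 50]]
```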
🧾 Summary Table: Linear Algebra Essentials
| Function | Description |
|---|---|
| dot(), matmul(), @ | Matrix multiplication |
| T or transpose() | Transpose of a matrix |
| inv() | Matrix inverse |
| det() | Matrix determinant |
| eig() | Eigenvalues and eigenvectors |
| solve() | Solve AX = B |
| pinv() | Pseudo-inverse |
🎓 Wrapping Up Chapter 11
Bravo! 🥳 You've entered the world of linear algebra, where numbers gain structure and logic becomes geometry.
With these tools, you can:
- Build the mathematical backbone of ML models
- Analyze real-world phenomena
- Solve complex systems
- Transform and compress data
From vector spaces to PCA, linear algebra in NumPy is your ticket to professional-level data science.
🧭 What's Next?
Now that you’ve completed the NumPy Foundations Series, your next adventures might include:
- Pandas for data manipulation
- Matplotlib / Seaborn for visualization
- SciPy for scientific computing
- scikit-learn for machine learning