Welcome back to our ongoing journey through Python Foundations for Machine Learning! In the previous article, we explored the basics of Linear Algebra, focusing on vector addition, matrix multiplication, eigenvalues, singular value decomposition, and linear equations. In this article, we’ll continue our exploration with a set of new problems to further solidify your grasp of these fundamental concepts of linear algebra for machine learning.
Problem 1: Vector Subtraction
Given two vectors v = [3, 7, 1] and w = [1, 4, 2], write a Python function to compute their difference v - w. Provide both the result and an explanation of the process.
Python code:
def vector_subtraction(v, w):
    # Subtract corresponding components element-wise
    result = [vi - wi for vi, wi in zip(v, w)]
    return result

v = [3, 7, 1]
w = [1, 4, 2]
result = vector_subtraction(v, w)
print("Vector Subtraction:", result)
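For larger vectors, the same operation is usually done with NumPy, which subtracts arrays element-wise. A minimal sketch, assuming NumPy is installed:

```python
import numpy as np

v = np.array([3, 7, 1])
w = np.array([1, 4, 2])

# NumPy subtracts element-wise, matching the list-comprehension version above
result = v - w
print("Vector Subtraction (NumPy):", result)  # [ 2  3 -1]
```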
Problem 2: Matrix Transpose
The transpose of a matrix is an operation that swaps its rows and columns, effectively reflecting the matrix along its main diagonal. This transformation results in a new matrix where the rows of the original become columns, and vice versa, providing a fundamental operation used in various mathematical and computational contexts.
Create a 2×3 matrix A with the following values:

A = [
    [4, 8, 2],
    [1, 5, 7]
]

Write a Python function to transpose matrix A.
Python code:
def transpose_matrix(matrix):
    # Row i of the result is column i of the input
    result = [[matrix[j][i] for j in range(len(matrix))] for i in range(len(matrix[0]))]
    return result

A = [
    [4, 8, 2],
    [1, 5, 7]
]
transposed_A = transpose_matrix(A)
print("Transposed Matrix A:", transposed_A)
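With NumPy, the transpose is available directly via the `.T` attribute, which returns a view with rows and columns swapped. A short sketch, assuming NumPy is installed:

```python
import numpy as np

A = np.array([
    [4, 8, 2],
    [1, 5, 7]
])

# .T swaps rows and columns: a 2x3 matrix becomes 3x2
print("Transposed Matrix A:")
print(A.T)
```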
Problem 3: Matrix Inversion
The inverse of a matrix is a mathematical operation that, when applied to a square matrix, yields another matrix that, when multiplied with the original matrix, results in the identity matrix.
- Check for Squareness: Ensure that the matrix is square (having an equal number of rows and columns) since only square matrices have the potential to possess an inverse.
- Compute the Determinant: Calculate the determinant of the matrix. If the determinant is zero, the matrix is singular and does not have an inverse.
- Check for Singularity: If the determinant is non-zero, proceed to find the inverse. If singular, the matrix lacks an inverse.
- Find the Adjoint Matrix: Form the adjoint matrix by taking the transpose of the matrix of cofactors. The cofactor of an element is the signed minor determinant obtained by removing the row and column containing that element.
- Calculate the Inverse: Divide each element of the adjoint matrix by the determinant of the original matrix.
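The steps above can be sketched for the 2×2 case, where the adjoint has a simple closed form: swap the diagonal entries and negate the off-diagonal ones. This is a minimal illustration of the procedure, not a general n×n implementation:

```python
def inverse_2x2(m):
    # Steps 1-2: the matrix is 2x2 (square), so compute its determinant ad - bc
    a, b = m[0]
    c, d = m[1]
    det = a * d - b * c
    # Step 3: a zero determinant means the matrix is singular and has no inverse
    if det == 0:
        raise ValueError("Matrix is singular and has no inverse")
    # Step 4: for a 2x2 matrix, the adjoint swaps the diagonal and negates the off-diagonal
    adjoint = [[d, -b], [-c, a]]
    # Step 5: divide each adjoint entry by the determinant
    return [[entry / det for entry in row] for row in adjoint]

D = [[3, 5], [1, 2]]
print(inverse_2x2(D))  # [[2.0, -5.0], [-1.0, 3.0]]
```

Here the determinant is 3·2 − 5·1 = 1, so the inverse equals the adjoint itself.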
Given a 2×2 matrix D:

D = [
    [3, 5],
    [1, 2]
]

Write a Python function to find the inverse of matrix D using NumPy.
Python code:
import numpy as np

D = np.array([
    [3, 5],
    [1, 2]
])
inverse_D = np.linalg.inv(D)
print("Inverse of Matrix D:")
print(inverse_D)
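As a quick sanity check, multiplying D by its inverse should reproduce the identity matrix, up to floating-point rounding:

```python
import numpy as np

D = np.array([
    [3, 5],
    [1, 2]
])
inverse_D = np.linalg.inv(D)

# D @ D^{-1} should equal the 2x2 identity matrix (within rounding error)
product = D @ inverse_D
print(np.allclose(product, np.eye(2)))  # True
```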
Problem 4: Norm of a Vector
The Euclidean norm of a vector is the square root of the sum of its squared components, and it measures the vector's length. Compute the Euclidean norm of the vector G = [1, 2, 3] using Python.
Python code:
import numpy as np

G = np.array([1, 2, 3])
norm_G = np.linalg.norm(G)
print("Euclidean Norm of Vector G:", norm_G)
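The same value can be computed directly from the definition, without NumPy, by summing the squared components and taking the square root:

```python
import math

G = [1, 2, 3]

# Euclidean norm: sqrt(1^2 + 2^2 + 3^2) = sqrt(14)
norm_G = math.sqrt(sum(x * x for x in G))
print("Euclidean Norm of Vector G:", norm_G)
```

For G = [1, 2, 3] this gives √14 ≈ 3.742.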
Problem 5: Matrix Trace
Calculate the trace of the following matrix H:

H = [
    [4, 2, 1],
    [0, 5, 2],
    [3, 1, 7]
]

The trace of a matrix is the sum of its diagonal elements. In this case, the trace of matrix H would be 4 + 5 + 7 = 16.
Python code:
import numpy as np

H = np.array([
    [4, 2, 1],
    [0, 5, 2],
    [3, 1, 7]
])
trace_H = np.trace(H)
print("Trace of Matrix H:", trace_H)
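Because the trace is just the sum of the diagonal elements H[i][i], it can also be computed in plain Python, which makes the definition explicit:

```python
H = [
    [4, 2, 1],
    [0, 5, 2],
    [3, 1, 7]
]

# Sum the diagonal elements: 4 + 5 + 7
trace_H = sum(H[i][i] for i in range(len(H)))
print("Trace of Matrix H:", trace_H)  # 16
```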
These problems aim to reinforce your understanding of linear algebra concepts and their application in machine learning using Python. Work through them patiently, and you’ll find yourself more confident in leveraging linear algebra in your machine learning endeavours. Happy coding!