## Definition of Determinant

The determinant is a scalar value associated with every square matrix. It is denoted by **|A|** and is read as "the determinant of A".

### Remarks

- For matrix A, |A| is read as determinant of A and not modulus of A.
- Only square matrices have determinants.

## Determinant of a Matrix of Order One

The determinant of a matrix of order 1 is simply the single element present in the matrix. If you have a matrix A of order 1:

**$A=[a]$**

Then, the determinant of A, denoted as **det(A)** or **|A|**, is equal to the single element 'a'. Mathematically, this can be expressed as:

**$det(A) = |A| = a$**

## Determinant of a Matrix of Order Two

A matrix of order two is a 2x2 matrix with four elements. For

**$A=\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}$**

the determinant is calculated using the following formula:

**$|A| = a_{11}a_{22} - a_{12}a_{21}$**
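As a quick illustration, here is a minimal Python sketch of the 2x2 rule $|A| = a_{11}a_{22} - a_{12}a_{21}$ (matrices represented as nested lists; the function name `det2` is just a convention chosen here):

```python
# |A| = a11*a22 - a12*a21 for a 2x2 matrix given as nested lists.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[3, 8],
     [4, 6]]
print(det2(A))  # 3*6 - 8*4 → -14
```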

## Determinant of a Matrix of Order 3x3

A matrix of order 3x3 is a matrix with three rows and three columns. The determinant of a matrix of order 3x3 can be calculated using the following formula:

**$|A| = a_{11}(a_{22}a_{33} - a_{32}a_{23}) - a_{12}(a_{21}a_{33} - a_{31}a_{23}) + a_{13}(a_{21}a_{32} - a_{31}a_{22})$**
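The 3x3 formula translates directly into a short Python sketch (expansion along the first row; nested-list matrices, an illustrative convention):

```python
# 3x3 determinant by expansion along the first row:
# |A| = a11(a22*a33 - a32*a23) - a12(a21*a33 - a31*a23) + a13(a21*a32 - a31*a22)
def det3(m):
    (a11, a12, a13), (a21, a22, a23), (a31, a32, a33) = m
    return (a11 * (a22 * a33 - a32 * a23)
            - a12 * (a21 * a33 - a31 * a23)
            + a13 * (a21 * a32 - a31 * a22))

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(det3(A))  # → -3
```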

## Area of a Triangle

The determinant can also be used to calculate the area of a triangle. The area of a triangle in a 2D plane formed by three vertices **A$(x_1,y_1)$**, **B$(x_2,y_2)$**, and **C$(x_3,y_3)$** is given by:

**$Area = \frac{1}{2}\left|\det\begin{pmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{pmatrix}\right|$**

Expanding the determinant, this equals $\frac{1}{2}\,|x_1(y_2 - y_3) + x_2(y_3 - y_1) + x_3(y_1 - y_2)|$.

### Remarks

- Since area is a positive quantity, we always take the absolute value of the determinant.
- If the area is given, we can use both positive and negative values of the determinant for calculation.
- The area of the triangle formed by three collinear points is zero.
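The remarks above can be checked with a small Python sketch of the expanded formula (the function name `triangle_area` is just illustrative):

```python
# Area of triangle ABC from the expanded determinant formula:
# Area = (1/2) * |x1(y2 - y3) + x2(y3 - y1) + x3(y1 - y2)|
def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    det = x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)
    return abs(det) / 2  # absolute value, since area is always positive

print(triangle_area((0, 0), (4, 0), (0, 3)))  # → 6.0
print(triangle_area((0, 0), (1, 1), (2, 2)))  # → 0.0 (collinear points)
```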

## Minors and Cofactors

**Minors:** For a square matrix $A=[a_{ij}]$, the minor $M_{ij}$ of the element $a_{ij}$ is the determinant of the submatrix obtained by deleting the $i$-th row and $j$-th column of $A$.

For a 3x3 matrix, to find the minor $M_{ij}$ associated with an element $a_{ij}$, remove the $i$-th row and $j$-th column and calculate the determinant of the resulting 2x2 submatrix.

The minors for the elements of the first row are as follows:

- $M_{11}$ = determinant of the 2x2 submatrix obtained by removing the first row and first column.
- $M_{12}$ = determinant of the 2x2 submatrix obtained by removing the first row and second column.
- $M_{13}$ = determinant of the 2x2 submatrix obtained by removing the first row and third column.
- And so on...

**Cofactors:** The cofactor $A_{ij}$ associated with the **$i$-th** row and **$j$-th** column of a matrix $A$ is given by

**$A_{ij} = (-1)^{i+j} \cdot M_{ij}$**

- It is essentially the minor multiplied by the sign $(-1)^{i+j}$, where **$i$** is the row index and **$j$** is the column index.
- The cofactor matrix $Cof(A)$ is obtained by replacing each element of $A$ with its corresponding cofactor $A_{ij}$.

For a 3x3 matrix $A$, the cofactors of the first row are, for example:

- $A_{11} = +M_{11}$
- $A_{12} = -M_{12}$
- $A_{13} = +M_{13}$

And so on for all elements in the cofactor matrix.

Minors and cofactors are important concepts related to determinants. They are used in finding the adjoint of a matrix.
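A minimal Python sketch of minors and cofactors for a 3x3 matrix (1-based indices `i`, `j` to match the notation above; the helper names are illustrative):

```python
# Minor M_ij: determinant of the 2x2 submatrix left after deleting
# row i and column j of a 3x3 matrix (i, j counted from 1, as in the text).
# Cofactor A_ij = (-1)^(i+j) * M_ij.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def minor(m, i, j):
    sub = [[m[r][c] for c in range(3) if c != j - 1]
           for r in range(3) if r != i - 1]
    return det2(sub)

def cofactor(m, i, j):
    return (-1) ** (i + j) * minor(m, i, j)

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(minor(A, 1, 1))     # det of [[5, 6], [8, 10]] → 2
print(cofactor(A, 1, 2))  # (-1)^3 * det of [[4, 6], [7, 10]] → 2
```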

## Adjoint of a Matrix

The adjoint of a matrix is obtained by taking the transpose of the matrix of cofactors. The adjoint of a matrix A is denoted as **adj(A)**.

- The adjoint is used in the process of finding the inverse of a matrix.
- For a given $n \times n$ matrix **$A=[a_{ij}]$**, the adjoint **$adj(A)$** is the transpose of the cofactor matrix $Cof(A)$.
- The entry $adj(A)_{ij}$ is obtained by swapping the row and column indices of the corresponding cofactor, i.e. $adj(A)_{ij} = A_{ji}$.

Mathematically, the adjoint is defined as:

**$adj(A) = Cof(A)^T$**

Here, $A_{ji}$ represents the cofactor associated with the element $a_{ji}$ of matrix $A$.

In summary, the steps to find the adjoint of a matrix are:

- Find the cofactor matrix **$Cof(A)$**.
- Transpose the cofactor matrix to obtain **$adj(A)$**.

The adjoint is useful in the formula for finding the inverse of a matrix:

**$A^{-1} = \frac{1}{det(A)}\,adj(A)$**, provided $det(A) \neq 0$.
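The two steps — build the cofactor matrix, then transpose — can be sketched in Python for a 3x3 matrix (0-based indices here; helper names are illustrative, not a library API):

```python
# adj(A): compute the cofactor matrix, then transpose it.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(m, i, j):  # 0-based row/column indices
    sub = [[m[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(sub)

def adjoint(m):
    # Transpose of the cofactor matrix: adj(A)[i][j] = cofactor at (j, i).
    return [[cofactor(m, j, i) for j in range(3)] for i in range(3)]

A = [[1, 0, 2],
     [0, 3, 0],
     [4, 0, 5]]
print(adjoint(A))  # → [[15, 0, -6], [0, -3, 0], [-12, 0, 3]]
```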

## Singular matrix

A singular matrix is a square matrix whose determinant is zero, meaning it lacks a matrix inverse. In mathematical terms, for an $n×n$ matrix $A$, it is singular if and only if $det(A)=0$.

Mathematically, a matrix $A$ is singular if:

**$A \text{ is singular} \iff det(A) = 0$**

Here's an example of a 2x2 singular matrix:

**$A=\begin{pmatrix} 2 & 4 \\ 1 & 2 \end{pmatrix}$**

To check if it is singular, calculate its determinant:

**$det(A)=(2×2)−(4×1)=4−4=0$**

Since the determinant is zero, matrix $A$ is singular. The absence of a matrix inverse is a characteristic of singular matrices.
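This check is easy to sketch in Python using the 2x2 determinant formula (an illustrative snippet, with a second, nonsingular matrix `B` added for contrast):

```python
# Singularity check: A is singular exactly when det(A) = 0.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[2, 4],
     [1, 2]]
print(det2(A) == 0)  # → True  (singular: no inverse exists)

B = [[2, 4],
     [1, 3]]
print(det2(B) == 0)  # → False (nonsingular)
```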

## Theorem 1

If A is any given square matrix of order n, then **A(adj A)** = **(adj A)A** = **|A| I**, where I is the identity matrix of order n.

**Proof:**

Consider the $(i, k)$-th entry of the product **A(adj A)**:

**$\sum_{j} a_{ij}\,(adj\,A)_{jk} = \sum_{j} a_{ij} A_{kj}$**

Since the sum of the products of the elements of a row (or a column) with the corresponding cofactors of the same row (or column) is equal to **|A|**, and with the cofactors of any other row (or column) is zero, every diagonal entry of the product is $|A|$ and every off-diagonal entry is 0.

Finally, we can express this as **$det(A) \cdot I$**:

**$A(adj\,A) = (adj\,A)A = |A| \cdot I$**. Hence proved.
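Theorem 1 can be verified numerically for a sample 3x3 matrix (a Python sketch; the 3x3-only helper functions are illustrative):

```python
# Numerical check of Theorem 1 for a sample 3x3 matrix:
# A(adj A) = (adj A)A = |A| * I.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(m, i, j):  # 0-based indices
    sub = [[m[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(sub)

def adjoint(m):
    return [[cofactor(m, j, i) for j in range(3)] for i in range(3)]

def det3(m):
    return sum(m[0][j] * cofactor(m, 0, j) for j in range(3))

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
d = det3(A)
expected = [[d if i == j else 0 for j in range(3)] for i in range(3)]
print(matmul(A, adjoint(A)) == expected)  # → True
print(matmul(adjoint(A), A) == expected)  # → True
```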

## Theorem 2

If A is a square matrix of order n, then **|adj(A)| = |A|^(n-1)**.

Consider a 3x3 matrix $A$ with elements $a_{ij}$.

**Calculate the adjoint matrix ($adj(A)$):**

- Find the cofactors $A_{ij}$ for each element $a_{ij}$.
- Transpose the matrix of cofactors to get $adj(A)$.

By Theorem 1, $A \cdot adj(A) = |A| \cdot I$. Taking determinants on both sides:

**$|A| \cdot |adj(A)| = |A|^3$**

so that **$|adj(A)| = |A|^2$** when $|A| \neq 0$.

In general, if A is a square matrix of order n, then **|adj(A)| = |A|^{n-1}**.
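A numerical spot-check of the identity for n = 3 (a Python sketch; the sample matrix is chosen arbitrarily for illustration):

```python
# Spot-check of |adj(A)| = |A|^(3-1) = |A|^2 for a sample 3x3 matrix.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(m, i, j):  # 0-based indices
    sub = [[m[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(sub)

def adjoint(m):
    return [[cofactor(m, j, i) for j in range(3)] for i in range(3)]

def det3(m):
    return sum(m[0][j] * cofactor(m, 0, j) for j in range(3))

A = [[1, 2, 0],
     [3, 1, 4],
     [0, 2, 1]]
print(det3(A))                           # → -13
print(det3(adjoint(A)) == det3(A) ** 2)  # → True
```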

## Theorem 3

A square matrix A is invertible if and only if A is a nonsingular matrix.

**Proof:**

**Part 1: If $A$ is invertible, then $A$ is nonsingular.**

Suppose $A$ is invertible. This means there exists a matrix $B$ such that

**$AB=BA=I$**, where $I$ is the identity matrix.

Taking determinants on both sides of **$AB = I$** and using the multiplicative property of determinants:

**$det(A) \cdot det(B) = det(I) = 1$**

Since the product of the two determinants is 1, neither factor can be zero. In particular, $det(A) \neq 0$, so $A$ is nonsingular.

**Part 2: If $A$ is nonsingular, then $A$ is invertible.**

Suppose $A$ is nonsingular. This means that the determinant of $A$, denoted as $det(A)$, is non-zero ($det(A) \neq 0$).

Since $det(A) \neq 0$, the matrix **$B = \frac{1}{det(A)}\,adj(A)$** is well defined, and by Theorem 1, $AB = BA = I$. Hence $B = A^{-1}$, and $A$ is invertible.

Thus, we have proved both directions of the statement.

Therefore, we conclude that a square matrix $A$ is invertible if and only if $A$ is a nonsingular matrix.

## Inverse of a Matrix

The inverse of a square matrix $A$ is denoted as **A^{-1}**. It is the matrix that, when multiplied by the original matrix, gives the identity matrix: $A A^{-1} = A^{-1} A = I$. It exists if and only if $A$ is nonsingular, and is given by **$A^{-1} = \frac{1}{det(A)}\,adj(A)$**.
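Putting the pieces together, here is a Python sketch of the inverse via the adjoint formula (3x3-only illustrative helpers; raises an error on a singular input):

```python
# Inverse via the adjoint: A^{-1} = (1 / det(A)) * adj(A),
# defined only when det(A) != 0.  0-based indices.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(m, i, j):
    sub = [[m[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(sub)

def det3(m):
    return sum(m[0][j] * cofactor(m, 0, j) for j in range(3))

def inverse(m):
    d = det3(m)
    if d == 0:
        raise ValueError("matrix is singular; no inverse exists")
    # adj(A)[i][j] is the cofactor at (j, i); divide each entry by det(A).
    return [[cofactor(m, j, i) / d for j in range(3)] for i in range(3)]

A = [[2, 0, 0],
     [0, 4, 0],
     [0, 0, 5]]
print(inverse(A))  # → [[0.5, 0.0, 0.0], [0.0, 0.25, 0.0], [0.0, 0.0, 0.2]]
```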

## Applications of Determinants and Matrices: Solution of System of Linear Equations using Inverse of a Matrix

**Step 1: Represent the System in Matrix Form**

Given system of equations:

**$a_1x+b_1y+c_1z=d_1$**

**$a_2x+b_2y+c_2z=d_2$**

**$a_3x+b_3y+c_3z=d_3$**

This can be written as **$A \cdot X = B$**, where

**$A = \begin{pmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{pmatrix}$**, **$X = \begin{pmatrix} x \\ y \\ z \end{pmatrix}$**, **$B = \begin{pmatrix} d_1 \\ d_2 \\ d_3 \end{pmatrix}$**

**Step 2: Check if $A$ is Invertible**

Calculate the determinant of $A$, denoted as $det(A)$. If **$det(A) \neq 0$**, proceed with finding the inverse. If **$det(A)=0$**, the system may not have a unique solution.

**Step 3: Premultiply by the Inverse of $A$**

To isolate $X$, premultiply both sides of the equation by the inverse of matrix $A$ (if it exists):

**$A \cdot X = B$**

**$A^{-1} \cdot A \cdot X = A^{-1} \cdot B$**

Since **$A^{-1} \cdot A$** is the identity matrix **($I$)**, the equation simplifies to:

**$I \cdot X = A^{-1} \cdot B$**

**$X = A^{-1} \cdot B$**

This gives you the solution vector $X$, which contains the values for $x$, $y$, and $z$ in the system of equations.
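The three steps can be sketched end to end in Python for a sample system (chosen for illustration: $x+y+z=6$, $2x+y+z=7$, $x+2y+z=8$, whose solution is $x=1$, $y=2$, $z=3$; the helper functions are illustrative, not a library API):

```python
# Solve A·X = B via X = A^{-1}·B = (1/det A)·adj(A)·B for a 3x3 system.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cofactor(m, i, j):  # 0-based indices
    sub = [[m[r][c] for c in range(3) if c != j]
           for r in range(3) if r != i]
    return (-1) ** (i + j) * det2(sub)

def det3(m):
    return sum(m[0][j] * cofactor(m, 0, j) for j in range(3))

def solve(a, b):
    d = det3(a)
    if d == 0:
        raise ValueError("det(A) = 0: no unique solution")
    # X_i = (1/det A) * sum_j adj(A)[i][j] * b[j], with adj(A)[i][j] = cofactor(j, i)
    return [sum(cofactor(a, j, i) * b[j] for j in range(3)) / d
            for i in range(3)]

# System: x + y + z = 6,  2x + y + z = 7,  x + 2y + z = 8
A = [[1, 1, 1],
     [2, 1, 1],
     [1, 2, 1]]
B = [6, 7, 8]
print(solve(A, B))  # → [1.0, 2.0, 3.0]
```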