Let *A* be a square matrix. An **LU factorization** refers to the factorization of *A*, with proper row and/or column orderings or permutations, into two factors, a lower triangular matrix *L* and an upper triangular matrix *U*:

    A = LU

In the lower triangular matrix all elements above the diagonal are zero; in the upper triangular matrix, all the elements below the diagonal are zero. For example, for a 3-by-3 matrix *A*, its LU decomposition looks like this:

    [ a11 a12 a13 ]   [ l11  0   0  ] [ u11 u12 u13 ]
    [ a21 a22 a23 ] = [ l21 l22  0  ] [  0  u22 u23 ]
    [ a31 a32 a33 ]   [ l31 l32 l33 ] [  0   0  u33 ]
Without a proper ordering or permutations in the matrix, the factorization may fail to materialize. For example, it is easy to verify (by expanding the matrix multiplication) that a₁₁ = l₁₁u₁₁. If a₁₁ = 0, then at least one of l₁₁ and u₁₁ has to be zero, which implies that either *L* or *U* is singular. This is impossible if *A* is nonsingular. This is a procedural problem. It can be removed by simply reordering the rows of *A* so that the first element of the permuted matrix is nonzero. The same problem in subsequent factorization steps can be removed the same way; see the basic procedure below.
It turns out that a proper permutation in rows (or columns) is sufficient for the LU factorization. **LU factorization with partial pivoting** often refers to the LU factorization with row permutations only:

    PA = LU

where *L* and *U* are again lower and upper triangular matrices, and *P* is a permutation matrix which, when left-multiplied to *A*, reorders the rows of *A*. It turns out that all square matrices can be factorized in this form,^{[2]} and the factorization is numerically stable in practice.^{[3]} This makes LUP decomposition a useful technique in practice.
An **LU factorization with full pivoting** involves both row and column permutations:

    PAQ = LU

where *L*, *U* and *P* are defined as before, and *Q* is a permutation matrix that reorders the columns of *A*.^{[4]}
An **LDU decomposition** is a decomposition of the form

    A = LDU

where *D* is a diagonal matrix and *L* and *U* are *unit* triangular matrices, meaning that all the entries on the diagonals of *L* and *U* are one.
Above we required that *A* be a square matrix, but these decompositions can all be generalized to rectangular matrices as well. In that case, *L* and *P* are square matrices which each have the same number of rows as *A*, while *U* is exactly the same shape as *A*. *Upper triangular* should be interpreted as having only zero entries below the main diagonal, which starts at the upper left corner.

### Output of this program

/* OUTPUT OF MAIN METHOD */

++++++++++++++++ Start Q2(a) Answer ++++++++++++++++++++++++

Enter the order of matrix [ <= 10 ]:

3

Enter the upper triangular matrix elements

Enter all coefficients of matrix :

Row 1

2

3

4

Row 2

0

2

8

Row 3

0

0

3

Enter elements of b matrix

16

18

18

Set of result is :

x[1] = 18.5

x[2] = -15

x[3] = 6

++++++++++++++++ End Q2(a) Answer ++++++++++++++++++++++++

++++++++++++++++ Start Q2(b) Answer ++++++++++++++++++++++++

Enter the order of matrix [ <= 10 ]:

3

Enter the lower triangular matrix elements

Enter all coefficients of matrix :

Row 1

2

0

0

Row 2

3

4

0

Row 3

2

3

5

Enter elements of b matrix

18

16

20

Set of result is :

x[1] = 9

x[2] = -2.75

x[3] = 2.05

++++++++++++++++ End Q2(b) Answer ++++++++++++++++++++++++

++++++++++++++++ Start Q4 Answer +++++++++++++++++++++++

Enter the order of matrix [ <= 10 ]: 4

Enter all coefficients of matrix :

Row 1

2

1

-1

2

Row 2

4

5

-3

6

Row 3

-2

5

-2

6

Row 4

4

11

-4

8

Enter elements of b matrix

5

9

4

2

L matrix is

1 0 0 0

2 1 0 0

-1 2 1 0

2 3 -1 1

U matrix is

2 1 -1 2

0 3 -1 2

0 0 -1 4

0 0 0 2

Set of result is :

x[1] = 1

x[2] = -2

x[3] = 1

x[4] = 3

++++++++++++++++ End Q4 Answer ++++++++++++++++++++++++

Press any key to continue . . .
