MATRIX AND ITS APPLICATIONS



ABSTRACT:   

This project examines matrices and three of their applications. Matrix methods are used to solve economic problems involving how goods can be produced efficiently, and to encode and decode very sensitive information. The project also applies matrices to solve a 3 x 3 linear system of equations using row-reduction methods.

TABLE OF CONTENTS

Title Page

Certification

Dedication

Acknowledgement

Table of Contents

CHAPTER ONE

INTRODUCTION AND LITERATURE REVIEW

1.1    BACKGROUND OF THE STUDY

1.2    SCOPE OF STUDY

1.3    SIGNIFICANCE OF STUDY

1.4    TYPES OF MATRICES

1.4.1    Row Matrix

1.4.2    Column Matrix

1.4.3    Single Element Matrix

1.4.4    Double Suffix Matrix

1.4.5    Matrix Notation

1.5    SPECIAL MATRICES

1.5.1    Square Matrix

1.5.2    Diagonal Matrix

1.5.3    Unit or Identity Matrix

1.5.4    Null or Zero Matrix

1.5.5    Equal Matrix

1.5.6    Singular Matrix

1.5.7    Triangular Matrix (Echelon Form)

1.5.8    Orthogonal Matrix

1.5.9    Non-Singular or Invertible Matrix

1.5.10    Conjugate of a Matrix

1.5.11    Idempotent Matrix

1.5.12    Periodic Matrix

1.5.13    Nilpotent Matrix

1.5.14    Involutory Matrix

1.5.15    Transpose  of a Matrix

1.5.16    Conjugate Transpose of a Matrix

1.5.17    Unitary Matrix

CHAPTER TWO

LAWS AND ALGEBRAS OF MATRICES

2.1    Introduction

2.2    Addition of Matrices

2.3    Multiplication of Matrices

2.4    SUBTRACTION OF MATRICES

CHAPTER THREE

DETERMINANTS

3.1        DETERMINANT OF ORDER 2

3.1.1        DETERMINANT OF ORDER 3

3.2        DETERMINANT OF A SQUARE MATRIX    

3.2.1        MINORS AND CO-FACTORS

3.2.2        CLASSICAL ADJOINT OF A MATRIX

3.3        PROPERTIES OF DETERMINANT

CHAPTER FOUR

INVERSE OF A SQUARE MATRIX

4.1        INVERSE OF A MATRIX

4.1.1        INVERSE OF A 2 X 2 MATRIX

4.1.2        INVERSE OF A 3 X 3 MATRIX

4.2        PRODUCT OF A SQUARE MATRIX AND ITS INVERSE

4.3        INVERSE THEOREM

CHAPTER ONE

INTRODUCTION AND LITERATURE REVIEW

1.1    BACKGROUND OF THE STUDY

    The introduction and development of the notion of a matrix and the subject of linear algebra followed the development of determinants, which arose from the study of coefficients of systems of linear equations. Leibniz, one of the founders of calculus, used determinants in 1693, and Cramer presented his determinant-based formula for solving systems of linear equations (today known as Cramer's rule) in 1750.

    The first implicit use of matrices occurred in Lagrange's work on bilinear forms in the late 1700s. Lagrange wanted to characterize the maxima and minima of multivariate functions. His method is now known as the method of Lagrange multipliers. To do this he first required the first-order partial derivatives to be zero, and additionally required that a condition on the matrix of second-order partial derivatives hold; this condition is today called positive or negative definiteness, although Lagrange did not use matrices explicitly.

    Gauss developed elimination around 1800 and used it to solve least squares problems in celestial computations, and later in computations to measure the Earth and its surface (the branch of applied mathematics concerned with measuring or determining the shape of the Earth, or with locating points exactly on the Earth's surface, is called geodesy). Even though Gauss's name is associated with this technique of eliminating variables from systems of linear equations, there was earlier work on the subject.

    Chinese manuscripts from several centuries earlier have been found that explain how to solve a system of three equations in three unknowns by "Gaussian" elimination. For years Gaussian elimination was considered part of the development of geodesy, not mathematics. The first appearance of Gauss-Jordan elimination in print was in a handbook on geodesy written by Wilhelm Jordan. Many people incorrectly assume that the famous mathematician Camille Jordan is the Jordan in "Gauss-Jordan elimination".

    For matrix algebra to develop fruitfully, one needed both proper notation and a proper definition of matrix multiplication. Both needs were met at about the same time and in the same place. In 1848 in England, J. J. Sylvester first introduced the term "matrix", the Latin word for "womb", as a name for an array of numbers.

    Matrix algebra was nurtured by the work of Arthur Cayley in 1855. Cayley studied compositions of linear transformations and defined matrix multiplication so that the matrix of coefficients for the composite transformation ST is the product of the matrix S times the matrix T. He went on to study the algebra of these compositions, including matrix inverses. The famous Cayley-Hamilton theorem, which asserts that a square matrix is a root of its characteristic polynomial, was given by Cayley in his 1858 memoir on the theory of matrices. The use of a single letter A to represent a matrix was crucial to the development of matrix algebra. Early in the development, the formula det(AB) = det(A)det(B) provided a connection between matrix algebra and determinants. Cayley wrote, "There would be many things to say about this theory of matrices which should, it seems to me, precede the theory of determinants".

    Mathematicians also attempted to develop an algebra of vectors, but there was no natural definition of the product of two vectors that held in arbitrary dimensions. The first vector algebra that involved a non-commutative vector product (that is, V x W need not equal W x V) was proposed by Hermann Grassmann in his book Ausdehnungslehre (1844). Grassmann's text also introduced the product of a column matrix and a row matrix, which results in what is now called a simple, or rank one, matrix. In the late 19th century the American mathematical physicist Willard Gibbs published his famous treatise on vector analysis. In that treatise Gibbs represented general matrices, which he called dyadics, as sums of simple matrices, which he called dyads. Later the physicist P. A. M. Dirac introduced the term "bracket" for what we now call the scalar product of a "bra" (row) vector times a "ket" (column) vector, and the term "ket-bra" for the product of a ket times a bra, resulting in what we now call a simple matrix, as above. Our convention of identifying column matrices and vectors was introduced by physicists in the 20th century.

    Matrices continued to be closely associated with linear transformations. By 1900, they were just a finite-dimensional special case of the emerging theory of linear transformations. The modern definition of a vector space was introduced by Peano in 1888. Abstract vector spaces whose elements were functions soon followed. There was renewed interest in matrices, particularly in the numerical analysis of matrices. John von Neumann and Herman Goldstine introduced condition numbers in analyzing round-off errors in 1947. Alan Turing and von Neumann were 20th-century giants in the development of stored-program computers. Turing introduced the LU decomposition of a matrix in 1948: the L is a lower triangular matrix with 1's on the diagonal, and the U is an echelon matrix. It is common to use the LU decomposition in the solution of a sequence of systems of linear equations, each having the same coefficient matrix. The benefit of the QR decomposition was realized a decade later: the Q is a matrix whose columns are orthonormal vectors, and R is a square upper triangular invertible matrix with positive entries on its diagonal.

    The QR factorization is used in computer algorithms for various computations, such as solving equations and finding eigenvalues. Frobenius in 1878 wrote an important work on matrices, on linear substitutions and bilinear forms, although he seemed unaware of Cayley's work. He proved important results on canonical matrices as representatives of equivalence classes of matrices. He cites Kronecker and Weierstrass as having considered special cases of his results in 1868 and 1874 respectively.

    Frobenius also proved the general result that a matrix satisfies its characteristic equation. This 1878 paper by Frobenius also contains the definition of the rank of a matrix, which he used in his work on canonical forms, and the definition of orthogonal matrices.

    An axiomatic definition of a determinant was used by Weierstrass in his lectures, and after his death it was published in 1903 in a note on determinant theory. In the same year Kronecker's lectures on determinants were also published posthumously. With these two publications the modern theory of determinants was in place, but matrix theory took slightly longer to become a fully accepted theory. An important early text which brought matrices into their proper place within mathematics was Bocher's Introduction to Higher Algebra of 1907. Turnbull and Aitken wrote influential texts in the 1930s, and Mirsky's An Introduction to Linear Algebra of 1955 saw matrix theory reach its present major role as one of the most important undergraduate mathematics topics.

1.2    SCOPE OF STUDY

    In this study we focus on m x n matrices of different orders, i.e. 2 x 3, 3 x 2, 3 x 3, etc.; on the algebra of matrices, i.e. the operations of addition, subtraction, scalar multiplication and matrix multiplication (under which we consider powers of matrices), and on whether division is defined for matrices; and on determinants of different orders, starting with 2, 3, etc.

    We also consider the determinant of a square matrix (under which we consider cofactors and adjoints) and the different properties of determinants; the inverse of a square matrix; the product of a square matrix and its inverse; special types of square matrices; and the different applications of matrices.

1.3    SIGNIFICANCE OF STUDY

    Matrices are key tools in linear algebra. One use of matrices is to represent linear transformations, which are higher-dimensional analogues of linear functions. Matrix multiplication corresponds to composition of linear transformations, which is used in computer graphics to project 3-dimensional space onto a 2-dimensional screen.

    A major branch of numerical analysis is devoted to the development of efficient algorithms for matrix computations. For a square matrix, the determinant and the inverse matrix (when it exists) govern the behaviour of solutions of the corresponding system of linear equations, and eigenvalues and eigenvectors provide insight into the geometry of the associated linear transformation. The study of matrices is applicable to every aspect of human endeavour.

1.4    TYPES OF MATRICES

1.4.1    Row Matrix

A row matrix consists of a single row, e.g. (3 2 4) is a row matrix of order 1 x 3.

1.4.2    Column Matrix

A column matrix is a matrix having only one column, e.g.

    3
    2
    4    is a column matrix of order 3 x 1.

    To conserve space in printing, a column matrix is sometimes written on one line in "curly" brackets, e.g. {3 2 4}; this is the same matrix, still of order 3 x 1.

1.4.3    Single Element Matrix

A single number may be regarded as a matrix; e.g. [x] is a matrix having 1 row and 1 column, i.e. of order 1 x 1.

1.4.4    Double Suffix Matrix

Each element in a matrix has its own particular address or location which can be defined by a system of double suffixes, the first indicating the row and the second the column, thus:

  a32 indicates the element in the third row and the second column.

1.4.5    Matrix Notation

    A whole matrix can be denoted by a single general element enclosed in brackets, or by a single letter printed in bold types. This is a very neat shorthand and saves much space, for example:

  a matrix with general element aij can be denoted by (aij) or by A.

1.5    SPECIAL MATRICES

1.5.1    Square Matrix

    A square matrix is a matrix of order m x m, i.e. one having the same number of rows and columns, e.g.

    1    2   5

    6    8   9    is a 3 x 3 matrix

    1    7   4

A square matrix (aij) is symmetric if aij = aji, e.g.

    1    2   5
    2    8   9
    5    9   4

i.e. it is symmetrical about the leading diagonal.

Note: in this case A = AT.

    A square matrix (aij) is skew-symmetric if aij = -aji (so the diagonal elements must all be zero), e.g.

     0     2   -5
    -2     0    9
     5    -9    0

in this case AT = -A.
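Both definitions can be checked mechanically by comparing a matrix with its transpose. A minimal sketch in plain Python, using the symmetric example above and a zero-diagonal skew-symmetric example:

```python
# Check symmetry definitions: A is symmetric if A == A^T,
# skew-symmetric if A^T == -A.
def transpose(A):
    return [list(row) for row in zip(*A)]

def is_symmetric(A):
    return A == transpose(A)

def is_skew_symmetric(A):
    return transpose(A) == [[-x for x in row] for row in A]

S = [[1, 2, 5],
     [2, 8, 9],
     [5, 9, 4]]       # equal to its transpose

K = [[0, 2, -5],
     [-2, 0, 9],
     [5, -9, 0]]      # transpose equals its negative

assert is_symmetric(S)
assert is_skew_symmetric(K)
```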

1.5.2    Diagonal Matrix

    A square matrix is called a diagonal matrix if all its non-diagonal elements are zero, e.g.

1     0    0

0     3    0

0     0    4

1.5.3    Unit or Identity Matrix

    A square matrix is called a unit matrix if all the diagonal elements are unity and non-diagonal elements are zero e.g.

(i)    1     0    0

    0     1    0

    0     0    1

(ii)     1  0

     0  1

1.5.4    Null or Zero Matrix

    Any matrix in which all the elements are zero is called a zero matrix or null matrix, e.g.

0      0      0

0      0      0

0      0      0

1.5.5    Equal Matrix

    Two matrices are said to be equal if:

(i)    They are of the same order

(ii)    The elements in the corresponding position are equal.

    Thus if

        A  =  2    3            B  =  2    3
              1    4                  1    4

    then A = B.

1.5.6    Singular Matrix

    If the determinant of a matrix is zero, then the matrix is known as a singular matrix, e.g. if

        A  =  2    4
              1    2

    then |A| = (2)(2) - (4)(1) = 0, so A is a singular matrix.
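A quick sketch of this test in plain Python for the 2 x 2 case, using a matrix with proportional rows (so its determinant is zero) as the singular example:

```python
# A matrix is singular exactly when its determinant is zero.
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

A = [[2, 4],
     [1, 2]]                          # second row is half the first

assert det2(A) == 0                   # singular
assert det2([[1, 2], [3, 4]]) == -2   # non-singular
```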

1.5.7    Triangular Matrix (Echelon Form)

    A square matrix, all of whose elements below the leading diagonal are zero, is called an upper triangular matrix. A square matrix, all of whose elements above the leading diagonal are zero, is called a lower triangular matrix. e.g.

    1    3   2                    1   0   0

    0    4   1                    4   1   0

    0    0   0                    6   8   5

Upper triangular matrix           Lower triangular matrix

1.5.8    Orthogonal Matrix

    A square matrix A is called an orthogonal matrix if the product of the matrix A and its transpose AT is a unit matrix, i.e.

    A.AT = I
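As an illustration of the condition A.AT = I, the following sketch checks it for a 2 x 2 permutation matrix (an example of my choosing, since it is orthogonal with integer entries):

```python
# Verify A · A^T = I for a simple orthogonal matrix.
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[0, 1],
     [1, 0]]          # swaps the two coordinates; its own transpose

I = [[1, 0],
     [0, 1]]

assert matmul(A, transpose(A)) == I
```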

1.5.9    Non-Singular or Invertible Matrix

    A matrix A is called a non-singular matrix if its inverse exists.

How to get the inverse of a 2 x 2 matrix A:

    To get the inverse of matrix A the following rules must be observed or followed:

1.    Interchange the two elements on the diagonal.

2.    Take the negative of the other two elements.

3.    Multiply the resulting matrix by 1/|A| or, equivalently, divide each element by |A|. In case |A| = 0, the matrix A is not invertible, i.e. it is singular.
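The three rules above can be applied literally in code. A minimal sketch using exact fractions (the example matrix is illustrative):

```python
# 2 x 2 inverse by the swap/negate/divide rules.
from fractions import Fraction

def inverse2(A):
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    if det == 0:
        raise ValueError("matrix is singular: no inverse")
    d = Fraction(1, det)
    # 1. swap the diagonal elements, 2. negate the other two,
    # 3. divide every element by the determinant.
    return [[ A[1][1] * d, -A[0][1] * d],
            [-A[1][0] * d,  A[0][0] * d]]

A = [[2, 3],
     [1, 4]]                      # det = 2*4 - 3*1 = 5
assert inverse2(A) == [[Fraction(4, 5), Fraction(-3, 5)],
                       [Fraction(-1, 5), Fraction(2, 5)]]
```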

Expressing the process of inverting a square matrix as a rule, we do the following:

i.    We get the minors of the matrix.

ii.    We sign the minors with the rule (-1)^(i+j) to obtain the cofactors.

iii.    We transpose the matrix of cofactors.

iv.    We multiply the result by the reciprocal of the determinant of the original matrix, i.e. by 1/|A|.

Writing C for the matrix of cofactors, the transpose CT is the adjoint of A, i.e. Adj A = CT, so that A^-1 = (1/|A|) Adj A.
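The four steps (minors, cofactors, transpose, divide by the determinant) can be written out directly for the 3 x 3 case. A sketch using exact fractions, tested on the 3 x 3 example matrix from section 1.5.1:

```python
# 3 x 3 inverse via minors -> cofactors -> adjoint -> divide by det.
from fractions import Fraction

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def minor(A, i, j):
    # delete row i and column j
    return [[A[r][c] for c in range(3) if c != j] for r in range(3) if r != i]

def inverse3(A):
    cof = [[(-1) ** (i + j) * det2(minor(A, i, j)) for j in range(3)]
           for i in range(3)]
    det = sum(A[0][j] * cof[0][j] for j in range(3))   # expand along row 1
    if det == 0:
        raise ValueError("singular matrix")
    adj = [list(row) for row in zip(*cof)]             # transpose of cofactors
    return [[Fraction(adj[i][j], det) for j in range(3)] for i in range(3)]

A = [[1, 2, 5],
     [6, 8, 9],
     [1, 7, 4]]
inv = inverse3(A)
# check A · A^-1 = I
prod = [[sum(A[i][k] * inv[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
assert prod == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```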

1.5.10    Conjugate of a Matrix

    Let A be a matrix with complex elements. The matrix obtained by replacing each element of A by its complex conjugate is called the conjugate of matrix A, denoted Ā.

1.5.11    Idempotent Matrix

    A matrix A such that A^2 = A is called an idempotent matrix.
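A quick illustration with a projection matrix, a standard idempotent example of my choosing:

```python
# A^2 = A for a projection matrix.
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

P = [[1, 0],
     [0, 0]]          # projects onto the first coordinate axis

assert matmul(P, P) == P   # applying the projection twice changes nothing
```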

1.5.12    Periodic Matrix

    A matrix A will be called a periodic matrix if A^(K+1) = A, where K is a positive integer. If K is the least positive integer for which A^(K+1) = A, then K is said to be the period of A. If we choose K = 1 we get A^2 = A, and we call A an idempotent matrix.

1.5.13    Nilpotent Matrix

    A matrix A will be called a nilpotent matrix if A^K = 0 (the null matrix), where K is a positive integer; if K is the least positive integer for which A^K = 0, then K is the index of the nilpotent matrix. e.g.

    A  =   ab    b^2        A^2  =   ab    b^2      ab    b^2     =    0   0    =  0
          -a^2  -ab                 -a^2  -ab     -a^2  -ab            0   0
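The example above can be verified numerically; choosing a = 2, b = 3 for concreteness, A squares to the zero matrix, so A is nilpotent of index 2:

```python
# A = [[ab, b^2], [-a^2, -ab]] satisfies A^2 = 0.
def matmul(A, B):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

a, b = 2, 3
A = [[a * b, b * b],
     [-a * a, -a * b]]

assert matmul(A, A) == [[0, 0], [0, 0]]   # nilpotent of index 2
```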

1.5.14    Involutory Matrix

    A matrix A will be called an involutory matrix if A^2 = I (the unit matrix). Since I^2 = I always, it follows that a unit matrix is an involutory matrix.
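A reflection matrix is a natural non-trivial example (my choice for illustration): reflecting twice returns every vector to itself, so A^2 = I.

```python
# A reflection across the x-axis is involutory: R^2 = I.
def matmul(A, B):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

R = [[1, 0],
     [0, -1]]         # flips the sign of the second coordinate

assert matmul(R, R) == [[1, 0], [0, 1]]
```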

1.5.15    Transpose  of a Matrix

    In a given matrix A, if we interchange the rows and the corresponding columns, the new matrix obtained is called the transpose of matrix A, and is denoted by AT or A'.

1.5.16    Conjugate Transpose of a Matrix

    The transpose of the conjugate of matrix A is called the conjugate transpose of A, denoted (Ā)T.

1.5.17    Unitary Matrix

    A square matrix A is said to be unitary if (Ā)T A = I, where Ā is the conjugate of matrix A.
