IDENTITY MATRIX (Social Science)

The identity matrix In is an n × n matrix with 1s along the main diagonal and 0s in the off-diagonal elements. It can be written as In = diag(1, 1, …, 1). For instance, for n = 3, the matrix looks like

     | 1 0 0 |
I3 = | 0 1 0 |
     | 0 0 1 |
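As a quick illustrative sketch, the 3 × 3 case above can be reproduced with NumPy:

```python
import numpy as np

# The 3 x 3 identity matrix I3 shown above.
I3 = np.eye(3)
print(I3)

# Its columns are the unit vectors e1, e2, e3.
e1, e2, e3 = I3[:, 0], I3[:, 1], I3[:, 2]
```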

The columns of the identity matrix are known as the unit vectors. For the above example, these are e1 = (1 0 0)′, e2 = (0 1 0)′, and e3 = (0 0 1)′. If the dimension of the matrix is 1 × 1, the matrix reduces to the scalar 1. The identity matrix has the following properties:

1. It is square, that is, it has the same number of rows and columns.

2. It is symmetric, that is, transposing rows with columns (or vice versa) we obtain the matrix itself; in symbols, I = I′, where I′ is the transpose matrix.

3. It is idempotent, that is, I² = I; in the scalar case this is equivalent to 1² = 1.

4. For any n × n matrix A, multiplication by the identity matrix delivers the matrix A itself, that is, AI = A; in the scalar case this is equivalent to a × 1 = a.


5. It has the commutative property, that is, for any n × n matrix A, AI = IA = A; in the scalar case, this is equivalent to a × 1 = 1 × a = a.

6. For any nonsingular n × n matrix A, there exists a matrix A⁻¹ such that AA⁻¹ = A⁻¹A = I, where A⁻¹ is called the inverse matrix of A. In the scalar case, this property is equivalent to the inverse operation a × a⁻¹ = a⁻¹ × a = 1 for any scalar a ≠ 0.

7. It has full rank; the n columns (or the n rows) of the matrix are linearly independent vectors and consequently the determinant is different from zero. The only symmetric, idempotent, and full rank matrix is the identity matrix.

8. Because I is a diagonal matrix, its determinant is equal to the product of the elements in the main diagonal, which in this case is equal to 1 regardless of the dimension of the matrix. The identity matrix is also positive definite, since all of its eigenvalues are positive. The trace of the identity matrix is tr In = n, the sum of the elements in the main diagonal.

9. The unit vectors are eigenvectors of the identity matrix, and all n eigenvalues are equal to 1.
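The properties above can be checked numerically. A minimal NumPy sketch, using an arbitrary nonsingular matrix A chosen for illustration:

```python
import numpy as np

n = 4
I = np.eye(n)

# Property 2: symmetric, I equals its transpose.
assert np.array_equal(I, I.T)

# Property 3: idempotent, I @ I equals I.
assert np.array_equal(I @ I, I)

# Properties 4 and 5: multiplicative identity and commutativity, AI = IA = A.
# A is an arbitrary nonsingular matrix used only for this check.
A = np.array([[2., 1., 0., 0.],
              [1., 3., 1., 0.],
              [0., 1., 4., 1.],
              [0., 0., 1., 5.]])
assert np.array_equal(A @ I, A) and np.array_equal(I @ A, A)

# Property 6: the inverse, A A^{-1} = I (up to floating-point rounding).
assert np.allclose(A @ np.linalg.inv(A), I)

# Properties 8 and 9: determinant 1, trace n, all eigenvalues equal to 1.
assert np.isclose(np.linalg.det(I), 1.0)
assert np.trace(I) == n
assert np.allclose(np.linalg.eigvals(I), np.ones(n))
```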

Matrix algebra is a fundamental tool for the econometric analysis of general regression models. Classical estimation methodologies such as Ordinary Least Squares (OLS), Nonlinear Least Squares, Generalized Least Squares, Maximum Likelihood, and the Method of Moments rely on matrix algebra to derive their estimators and their properties in an elegant and compact format. The identity matrix shows up in several technical proofs. For instance, the identity matrix is an integral part of a projection matrix. In the OLS regression of y on X with a sample of size n, the projection matrix is P = X(X′X)⁻¹X′, and the identity matrix enters through the complementary residual-maker matrix M = In − P = In − X(X′X)⁻¹X′. The projection matrix is important because when P is applied to a vector such as y, the result is the vector of fitted values of the regression, that is, ŷ = Py; applying M instead yields the residuals.
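The role of the identity matrix in the OLS decomposition can be sketched as follows; the regressors X, coefficients, and noise below are made-up data for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3

# Simulated design matrix (intercept plus two regressors) and response.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])   # illustrative coefficients
y = X @ beta + rng.normal(size=n)

# Projection ("hat") matrix: P = X (X'X)^{-1} X'.
P = X @ np.linalg.inv(X.T @ X) @ X.T

# Residual-maker matrix: M = I_n - P, built from the identity matrix.
M = np.eye(n) - P

y_hat = P @ y        # fitted values
residuals = M @ y    # residuals

# Because P + M = I_n, fitted values plus residuals recover y exactly.
assert np.allclose(y_hat + residuals, y)
```

Note that both P and M are symmetric and idempotent, and they sum to the identity matrix, which is what splits y cleanly into fitted values and residuals.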
