Term
Theorem 1.1 - Uniqueness of the Reduced Echelon Form |
|
Definition
Each matrix is row equivalent to one and only one reduced echelon matrix |
|
|
Term
Theorem 1.2 - Existence and Uniqueness Theorem |
|
Definition
A linear system is consistent if and only if the rightmost column of the augmented matrix is not a pivot column - that is, if and only if an echelon form of the augmented matrix has no row of the form [0 ... 0 b] with b nonzero.
If a linear system is consistent, then the solution set contains either (i) a unique solution, when there are no free variables, or (ii) infinitely many solutions, when there is at least one free variable. |
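A quick way to apply this in practice (an illustrative sketch; the system below is made up): row reduce the augmented matrix and check whether its rightmost column is a pivot column.

# Consistency check with sympy (example system is made up)
from sympy import Matrix

aug = Matrix([[1, 2, 3],
              [2, 4, 7]])          # augmented matrix [A | b]
rref, pivots = aug.rref()
print(rref)                        # echelon form; note the row [0 0 1], i.e. 0 = 1
print(2 not in pivots)             # False: the last column IS a pivot column, so inconsistent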
|
|
Term
Theorem 1.3 |
|
Definition
If A is an m x n matrix, with columns a1,...,an, and if b is in R^m, the matrix equation Ax = b has the same solution set as the vector equation x1a1 + x2a2 + ... + xnan = b, which, in turn, has the same solution set as the system of linear equations whose augmented matrix is [a1 a2 ... an b] |
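For a concrete feel (values made up for illustration), Ax is exactly the linear combination of the columns of A weighted by the entries of x:

# Ax equals x1*a1 + x2*a2 + x3*a3
import numpy as np

A = np.array([[1, 2, 0],
              [0, 1, 3]])                      # columns a1, a2, a3
x = np.array([2, -1, 4])
lin_comb = x[0]*A[:, 0] + x[1]*A[:, 1] + x[2]*A[:, 2]
print(np.allclose(A @ x, lin_comb))            # True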
|
|
Term
Theorem 1.4 |
|
Definition
Let A be an m x n matrix. Then the following statements are logically equivalent. That is, for a particular A, either they are all true statements or they are all false.
a. For each b in R^m, the equation Ax = b has a solution.
b. Each b in R^m is a linear combination of the columns of A
c. The columns of A span R^m
d. A has a pivot position in every row |
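Statement (d) gives a mechanical test, sketched below (matrices made up for illustration): row reduce A and check for a pivot in every row, i.e. rank A = m.

# Columns of A span R^m exactly when A has a pivot position in every row
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [0, 1, 3]])            # pivot in both rows -> columns span R^2
B = Matrix([[1, 2],
            [2, 4]])               # no pivot in row 2 -> columns do not span R^2
for M in (A, B):
    print(M.rank() == M.rows)      # True for A, False for B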
|
|
Term
Theorem 1.5 |
|
Definition
If A is an m x n matrix, u and v are vectors in R^n, and c is a scalar, then: a. A(u+v) = Au + Av b. A(cu) = c(Au) |
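A numerical spot-check of both properties on made-up data:

# Verify A(u+v) = Au + Av and A(cu) = c(Au)
import numpy as np

A = np.array([[1.0, 2.0, -1.0],
              [0.0, 3.0,  4.0]])
u = np.array([1.0, -2.0, 0.5])
v = np.array([3.0,  0.0, 2.0])
c = 2.5
print(np.allclose(A @ (u + v), A @ u + A @ v))   # True
print(np.allclose(A @ (c * u), c * (A @ u)))     # True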
|
|
Term
Theorem 1.6 |
|
Definition
Suppose the equation Ax = b is consistent for some given b, and let p be a solution. Then the solution set of Ax = b is the set of all vectors of the form w = p + vh, where vh is any solution of the homogeneous equation Ax = 0. |
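Concretely (system made up for illustration): take one particular solution p of Ax = b; adding any homogeneous solution vh still gives a solution.

# p solves Ax = b; p + t*vh also solves it whenever A vh = 0
import numpy as np

A = np.array([[1, 2, 1],
              [0, 1, 1]])
b = np.array([4, 1])
p  = np.array([2, 1, 0])                 # particular solution: A @ p = b
vh = np.array([1, -1, 1])                # homogeneous solution: A @ vh = 0
print(A @ p, A @ vh)                     # [4 1] and [0 0]
print(np.allclose(A @ (p + 3*vh), b))    # True for any multiple of vh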
|
|
Term
Theorem 1.7 - Characterization of Linearly Dependent Sets |
|
Definition
An indexed set S = {v1,...,vp} of two or more vectors is linearly dependent if and only if at least one of the vectors in S is a linear combination of the others. In fact, if S is linearly dependent and v1 ≠ 0, then some vj (with j > 1) is a linear combination of the preceding vectors, v1,...,vj-1. |
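For instance (vectors made up for illustration), in the dependent set below v3 is a combination of the preceding vectors v1 and v2:

# A dependent set where some vj is a combination of v1,...,v(j-1)
import numpy as np

v1 = np.array([1, 0, 1])
v2 = np.array([0, 1, 1])
v3 = 2*v1 - v2                         # v3 depends on the preceding vectors
S = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(S))        # 2 < 3 columns, so the set is dependent
print(np.allclose(v3, 2*v1 - v2))      # True: v3 = 2*v1 - v2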
|
|
Term
Theorem 1.8 |
|
Definition
If a set contains more vectors than there are entries in each vector, then the set is linearly dependent. That is, any set {v1, … , vp} in R^n is linearly dependent if p > n. |
|
|
Term
Theorem 1.9 |
|
Definition
If a set S = {v1, … , vp} in R^n contains the zero vector, then the set is linearly dependent. |
|
|
Term
Theorem 1.10 |
|
Definition
Let T: R^n->R^m be a linear transformation. Then there exists a unique matrix A such that T(x) = Ax for all x in R^n. In fact, A is the m x n matrix whose jth column is the vector T(ej), where ej is the jth column of the identity matrix in R^n: A = [ T(e1) … T(en) ] |
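A small sketch (the transformation below is made up for illustration): build the standard matrix column by column from T(e1), ..., T(en).

# Standard matrix of T: its columns are T(e1), ..., T(en)
import numpy as np

def T(x):
    # example linear map R^2 -> R^3: T(x1, x2) = (x1 + x2, 2*x2, -x1)
    return np.array([x[0] + x[1], 2*x[1], -x[0]])

e1, e2 = np.eye(2)
A = np.column_stack([T(e1), T(e2)])    # A = [ T(e1) T(e2) ]
x = np.array([3.0, -1.0])
print(np.allclose(A @ x, T(x)))        # True: T(x) = Ax for every x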
|
|
Term
Theorem 1.12 |
|
Definition
Let T: R^n->R^m be a linear transformation and let A be the standard matrix for T. Then: a. T maps R^n onto R^m if and only if the columns of A span R^m b. T is one-to-one if and only if the columns of A are linearly independent. |
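In terms of the standard matrix (example matrix made up), onto means a pivot in every row (rank = m) and one-to-one means a pivot in every column (rank = n):

# onto:       columns of A span R^m        (rank = m)
# one-to-one: columns of A are independent (rank = n)
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [2, 3]])               # standard matrix of some T: R^2 -> R^3
m, n = A.shape
r = np.linalg.matrix_rank(A)
print("onto:", r == m)               # False: two columns cannot span R^3
print("one-to-one:", r == n)         # True: the columns are independent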
|
|
Term
Theorem 2.3 |
|
Definition
Let A and B denote matrices whose sizes are appropriate for the following sums and products. a. (A^T)^T = A b. (A + B)^T = A^T + B^T c. For any scalar r, (rA)^T = rA^T d. (AB)^T = B^T A^T |
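A quick numerical check of parts a and d (note the order reversal in d), with made-up matrices:

# (A^T)^T = A and (AB)^T = B^T A^T
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])     # 3 x 2
B = np.array([[1, 0, 2],
              [4, 1, 3]])  # 2 x 3
print(np.allclose(A.T.T, A))               # True
print(np.allclose((A @ B).T, B.T @ A.T))   # True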
|
|
Term
Theorem 2.4 |
|
Definition
Let A = [ a b ][ c d ]. If ad - bc ≠ 0, then A is invertible and A^-1 = 1/(ad - bc) * [ d -b ][ -c a ]. If ad - bc = 0, then A is not invertible. |
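A quick check of the formula on a made-up 2 x 2 matrix:

# 2x2 inverse via the ad - bc formula, compared with numpy's inverse
import numpy as np

a, b, c, d = 2.0, 5.0, 1.0, 3.0
A = np.array([[a, b],
              [c, d]])
det = a*d - b*c                               # 1.0, so A is invertible
A_inv = (1.0/det) * np.array([[ d, -b],
                              [-c,  a]])
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
print(np.allclose(A @ A_inv, np.eye(2)))      # True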
|
|
Term
Theorem 2.5 |
|
Definition
If A is an invertible n x n matrix, then for each b in R^n, the equation Ax = b has the unique solution x = A^-1b. |
|
|
Term
Theorem 2.6 |
|
Definition
a. If A is an invertible matrix, then A^-1 is invertible and (A^-1)^-1 = A b. If A and B are n x n invertible matrices, then so is AB, and the inverse of AB is the product of the inverses of A and B in the reverse order. That is, (AB)^-1 = B^-1 A^-1 c. If A is an invertible matrix, then so is A^T, and the inverse of A^T is the transpose of A^-1. That is, (A^T)^-1 = (A^-1)^T |
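Numerical spot-checks of parts b and c on made-up invertible matrices:

# (AB)^-1 = B^-1 A^-1 (reverse order) and (A^T)^-1 = (A^-1)^T
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])
inv = np.linalg.inv
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))   # True
print(np.allclose(inv(A.T), inv(A).T))            # True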
|
|
Term
Theorem 2.7 |
|
Definition
An n x n matrix A is invertible if and only if A is row equivalent to I_n, and in this case, any sequence of elementary row operations that reduces A to I_n also transforms I_n into A^-1. |
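This is the usual algorithm for computing A^-1: row reduce the block matrix [A I_n]. A sketch with sympy on a made-up matrix:

# Row reduce [A | I]; if the left block becomes I, the right block is A^-1
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])
aug = A.row_join(eye(2))          # the n x 2n block matrix [A  I]
rref, _ = aug.rref()
A_inv = rref[:, 2:]               # right half after the reduction
print(A_inv)                      # Matrix([[1, -1], [-1, 2]])
print(A * A_inv == eye(2))        # True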
|
|
Term
Theorem 2.8 - The Invertible Matrix Theorem |
|
Definition
Let A be a square n x n matrix. Then the following statements are equivalent. That is, for a given A, the statements are either all true or all false.
a. A is an invertible matrix.
b. A is row equivalent to the n x n identity matrix.
c. A has n pivot positions.
d. The equation Ax = 0 has only the trivial solution.
e. The columns of A form a linearly independent set.
f. The linear transformation x->Ax is one-to-one.
g. The equation Ax = b has at least one solution for each b in R^n.
h. The columns of A span R^n.
i. The linear transformation x->Ax maps R^n onto R^n.
j. There is an n x n matrix C such that CA = I.
k. There is an n x n matrix D such that AD = I.
l. A^T is an invertible matrix. |
|
|
Term
Column-Row Expansion of AB |
|
Definition
If A is m x n and B is n x p, then AB = [col 1(A) col 2(A) … col n(A)][row 1(B); row 2(B); … ; row n(B)] (the rows of B stacked as a block column) = col 1(A)row 1(B) + … + col n(A)row n(B) |
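A numerical check (matrices made up): AB equals the sum of the outer products col k(A) row k(B).

# AB as a sum of outer products: col_k(A) * row_k(B), summed over k
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])        # m x n = 3 x 2
B = np.array([[1, 0, 2],
              [4, 1, 3]])     # n x p = 2 x 3
expansion = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
print(np.allclose(A @ B, expansion))   # True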
|
|
Term
|
Definition
The null space of an m x n matrix A is a subspace of R^n. Equivalently, the set of all solutions to a system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of R^n |
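sympy can produce a spanning set for Nul A; a quick made-up example, also checking that solutions of Ax = 0 stay in the set under scaling and addition:

# Nul A = all solutions of Ax = 0; closed under addition and scalar multiples
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [0, 1, 1]])
basis = A.nullspace()              # list of column vectors spanning Nul A
v = basis[0]
print(A * v)                       # the zero vector
print(A * (3*v), A * (v + v))      # still zero: closed under scaling and addition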
|
|
Term
|
Definition
The pivot columns of a matrix A form a basis for the column space of A |
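A sketch with sympy (matrix made up): the pivot columns of A itself, not of its echelon form, give a basis for Col A.

# Basis for Col A: take the columns of A at the pivot positions
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 5],
            [3, 6, 1, 6]])
_, pivots = A.rref()
basis = [A[:, j] for j in pivots]     # pivot columns of A
print(pivots)                         # (0, 2): columns 1 and 3 of A
print(A.columnspace() == basis)       # True: sympy picks the same columns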
|
|