### Test 3 Study Guide: Selections from 3.1-3.3, 2.8, 5.1 and 5.6:

This test will be closed to notes and books, but a calculator will be allowed (no cell phones or other calculators bundled with additional technologies). There will be various types of questions on the test, and your grade will be based on the quality of your responses in a timed environment (turned in by the end of class):
• Definitions
• Calculations and Interpretations / Analyses
• Conceptual Questions
• I suggest that you review tests 1 and 2 and any related material that you feel you need to brush up on, as this test is cumulative. In addition, review your class notes from the material since test 2 and go over the ASULearn solutions to the practice exercises and problem sets as well as the clicker questions. Questions will be typeset using LaTeX (like this) and will typically be formatted as follows:
• Fill in the Blank
• Short Answer Calculations and Interpretations / Analyses
• True/False Questions: You may be asked to identify true statements [no need to recall page #s nor Theorem #s like in problem sets], provide counterexamples, and/or correct statements
• Here are the topics we have been focusing on:
• Test 1 material [test 1 study guide]
• Test 2 material [test 2 study guide]

Recent material, including selections from Chapter 3 (3.1-3.3) and Sections 2.8, 5.1 and 5.6:
• Algebra of determinants, including the Laplace expansion
• The connection of determinants to the square matrix theorem 8
• Geometry of determinants, including the impact of row operations
• A deeper understanding of the algebra and geometry of three topics we had previously covered:
1. replacement Gaussian reductions t (row 1) + (row 2) [the result lies on the line parallel to row 1 through the tip of row 2. The corresponding elementary matrix is a shear (as a linear transformation) that preserves the area (or volume in higher dimensions), which is the same as the determinant!]
2. The span of the columns of a matrix A, which is the set of all linear combinations of them, is now seen as the column space. For example, if A has 2 columns, then c(column 1) + d(column 2) is a plane through the origin if the columns are l.i. and a line through the origin otherwise. If A has 3 or more columns, we need to do some computational work (like in problem set 4) to examine the column space.
3. The solutions to the linear independence equation (for the columns of a matrix A) Ax = 0, which form the intersection of the rows of this homogeneous augmented matrix, make up the nullspace.
• Algebra and geometry of subspaces, basis, column space and null space and the connections to previous material like linear independence and span (see above)
• Algebra and geometry of eigenvalues and eigenvectors [Ax = lambda x, (A - lambda I)x = 0; matrix multiplication turns into scalar multiplication for eigenvectors, so they are vectors that stay on the same line through the origin]
• Showing that solving Ax = lambda x is equivalent to solving the system (A - lambda I)x = 0 (i.e. the eigenvectors of A are the nullspace of (A - lambda I)). We concentrated on 2x2 matrices.
• Understanding that since we want nontrivial solutions of (A - lambda I)x = 0, we solve for lambda using determinant(A - lambda I) = 0 (since otherwise the system would have only the trivial 0-vector solution; so we want the columns of (A - lambda I) to not be linearly independent), and being able to solve for the lambdas given a 2x2 matrix.
• 2x2 matrices which have a certain number of real eigenvalues (0, 1, or 2) and eigenvectors (0 or infinitely many) or linearly independent eigenvectors (0, 1, or 2).
• linear transformations of the plane and their corresponding eigenvalues and eigenvectors (projections, dilations, reflections, rotations, and shear matrices)
• Eigenvector decomposition of a 2x2 matrix A when the eigenvectors form a basis for R2, and the long-term behavior of dynamical systems, including the limit and population ratios in situations of asymptotic behavior as well as stability (lambda = 1) for various initial conditions (if a2 = 0... otherwise we die off, grow, stabilize..., including the equations of the lines we come in along) and predation parameters, like in problem set 4.
• Filling in blanks like: If ___ equals 0 then we die off along the line ____ [corresponding to the eigenvector ____], and in all other cases we [choose one: die off, or grow, or hit and then stay fixed] along the line ____ [corresponding to the eigenvector ____].
• Identifying population ratios in the long term (from the dominant eigenvector)
• Rough sketch of a graph of the trajectory diagram when you begin in the first quadrant and start off the eigenvectors (and on them too). For example, in stability situations when one eigenvalue is 1: say x_k = a1 (1)^k Vector([2,1]) + a2 (0.7)^k Vector([-1,1]); then as long as a1 is nonzero, we will stabilize to the y = (1/2)x line via the population ratio of 2:1. Graphically you should be able to draw pictures like in the Dynamical Systems demo and problem set solutions. In this specific example, you can tell from the algebra that given a starting position, you will come in parallel to Vector([-1,1]) (i.e. x + y = 1) until we eventually hit the stability line, where we stay forever, and that the contribution from Vector([-1,1]) gets smaller and smaller with each k, which is also represented in the picture. In other situations we approach asymptotically.
• Fractions in Maple, versus errors with decimals in Maple or errors in other Maple outputs (i.e. the importance of critical reasoning/by-hand reasoning along with Maple).
• Maple output of eigenvectors giving one basis representative for each line, or basis representatives for R2, or Maple outputting a column of 0s as an eigenvector, like for a shear matrix (which tells us that the eigenvectors will not form a basis for R2, because there won't be 2 linearly independent ones).
• Also think about some broader connections of the class - applications that we have covered, algebraic and geometric perspectives, and how calculus has played a role
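To make the eigenvector decomposition and long-term behavior above concrete, here is a small Python sketch (an illustration only; Maple is our class tool, and this particular matrix is a made-up example, not from class). It builds a 2x2 matrix with eigenvalues 1 and 0.7 and eigenvectors [2,1] and [-1,1], finds the eigenvalues from the characteristic polynomial lambda^2 - (trace)lambda + det = 0, and iterates x_{k+1} = A x_k to watch the population ratio stabilize to 2:1:

```python
import math

# Hypothetical example matrix: eigenvalues 1 and 0.7,
# eigenvectors [2, 1] and [-1, 1], matching the x_k decomposition above.
A = [[0.9, 0.2],
     [0.1, 0.8]]

# Characteristic polynomial of a 2x2 matrix: lambda^2 - (trace)lambda + det = 0
trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = trace**2 - 4 * det
lam1 = (trace + math.sqrt(disc)) / 2   # dominant eigenvalue (here 1)
lam2 = (trace - math.sqrt(disc)) / 2   # here 0.7

# Iterate x_{k+1} = A x_k from a start with a1 nonzero; the (0.7)^k
# contribution dies off, so x/y approaches the 2:1 eigenvector ratio.
x = [1.0, 0.0]
for _ in range(60):
    x = [A[0][0] * x[0] + A[0][1] * x[1],
         A[1][0] * x[0] + A[1][1] * x[1]]

ratio = x[0] / x[1]
print(lam1, lam2, ratio)
```

Running this shows the eigenvalues near 1 and 0.7 and the ratio settling to 2, the same stability picture as in the trajectory diagrams.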
• Some Sample Test Questions
• See the sample test questions for test 1 and test 2, as well as your tests themselves. Like the previous tests, almost all of test 3 will be from the practice hw, clicker questions and problem sets, as well as definitions, big picture ideas or computations that are listed on any of the study guides.

Fill in the blank
• An eigenvector _________(satisfies the equation A x = lambda x and allows us to turn matrix multiplication into scalar multiplication)
• Set up the augmented matrix corresponding to the definition of span for these vectors... __________
• ________ is a real-life application of a linear combination of vectors

Computations and Interpretations/Analysis: As in the first 2 tests, the questions in this section come from questions you have seen before. Be sure to review the questions on the prior tests, the problem set and practice solutions, and the clickers.

True/False
True/False statements and counterexamples have been a recurring theme in all of the chapters, so you can expect problems like we have seen before in practice problems, problem sets and clickers. Be sure that you know how to find counterexamples. They will (again) either be formatted as "circle true" or "correct after the word..." or "provide a counterexample" (or could show up as fill in the blanks too). Here I have given you samples that connect to how many (and what types of) eigenvectors a matrix can have, just for review's sake:
• A 2x2 matrix can never have just 1 linearly independent eigenvector (False: the shear Matrix([[1,0],[1,1]]) would work. Notice though that it has infinitely many eigenvectors that are not linearly independent, because constant multiples of an eigenvector produce other eigenvectors, as anything on that same line through the origin still stays on that line. This is a vertical shear matrix with just the y-axis having eigenvalue 1; a basis for that line can be represented by [0,1].)
• A 2x2 matrix can never have just 0 linearly independent (real) eigenvectors (False: Matrix([[0,-1],[1,0]]) is a rotation by 90 degrees, which has no nontrivial real eigenvectors since everything but the origin is moved off the original line we began on.)
• A 2x2 matrix can never have 2 linearly independent eigenvectors (False: the projection Matrix([[1,0],[0,0]]) has the x-axis, with basis representative the first column, corresponding to an eigenvalue of 1 (on the line of projection, vectors are fixed), while the y-axis, which is perpendicular to the line of projection, has an eigenvalue of 0 (no shadow); all other vectors are moved off their line as they are projected. A reflection, like in the practice problem, which has eigenvalues 1 and -1 corresponding to the line of reflection and the line perpendicular to it, would have worked here too, as would a rotation by 180 degrees or a dilation matrix.)
• A 2x2 matrix can never have 3 linearly independent eigenvectors (True: linearly independent eigenvectors would form a basis for the eigenspace, but in R2 we have at most 2 l.i. vectors before achieving redundancy.)
• A 2x2 matrix can never have just 2 eigenvectors (True: if we have any nontrivial eigenvector, then anything on that line is also an eigenvector, so we'll have infinitely many eigenvectors. We CAN have just 2 linearly independent eigenvectors (these are basis representatives, like Maple gives), but they span an eigenspace that has infinitely many eigenvectors.)
• Eigenvectors allow us to turn matrix multiplication into addition (false - should be scalar multiplication)
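The counting facts in the samples above can be double-checked numerically. Here is a short Python sketch (just an illustration; the helper name is hypothetical and this is not class code) that counts the linearly independent real eigenvectors of a 2x2 matrix from the discriminant of the characteristic polynomial, handling the repeated-eigenvalue case by testing whether A - lambda I is the zero matrix:

```python
def li_eigenvector_count(A):
    """Number of linearly independent real eigenvectors of a 2x2 matrix A."""
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = trace**2 - 4 * det      # discriminant of lambda^2 - trace*lambda + det
    if disc < 0:
        return 0                   # no real eigenvalues (e.g. rotation by 90 degrees)
    if disc > 0:
        return 2                   # two distinct real eigenvalues, two eigenlines
    lam = trace / 2                # repeated eigenvalue
    # If A - lam*I is the zero matrix (a dilation), every vector is an eigenvector.
    if all(abs(entry) < 1e-12 for entry in (a - lam, b, c, d - lam)):
        return 2
    return 1                       # e.g. a shear: only one eigenline

shear      = [[1, 0], [1, 1]]   # vertical shear: only the y-axis is fixed
rotation90 = [[0, -1], [1, 0]]  # rotation by 90 degrees: no real eigenvectors
projection = [[1, 0], [0, 0]]   # projection onto the x-axis: two eigenlines
print(li_eigenvector_count(shear),
      li_eigenvector_count(rotation90),
      li_eigenvector_count(projection))
```

This reproduces the three counterexamples above: the shear gives 1, the rotation gives 0, and the projection gives 2.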

A reminder of types of possible true/false formatting on the test:
• Correct after the word then or write true:
If A is an invertible nxn matrix, and x and b are nx1 vectors, then the matrix-vector equation A*x=b has a unique solution. (True)
• Correct after the word has or write true:
A consistent augmented matrix with a row of 0s has infinite solutions (False: correction is 1 or infinite solutions)
• Provide a counterexample or write true:
A consistent augmented matrix with a row of 0s has infinite solutions (False: Matrix([[1,0,5],[0,1,2],[0,0,0]]))
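Counterexamples like the one above can also be sanity-checked by machine. Here is a Python sketch (hypothetical helper names, not class code) that row reduces with exact fractions, so no decimal roundoff, and classifies an augmented matrix by comparing ranks, mirroring the pivot-in-every-column reasoning:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination with exact fractions."""
    rows = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        rows[r], rows[piv] = rows[piv], rows[r]
        pivval = rows[r][col]
        rows[r] = [x / pivval for x in rows[r]]
        for i in range(len(rows)):        # clear the column above and below
            if i != r and rows[i][col] != 0:
                factor = rows[i][col]
                rows[i] = [x - factor * p for x, p in zip(rows[i], rows[r])]
        r += 1
    return r

def classify(aug):
    """Classify an augmented matrix: 'unique', 'infinite', or 'inconsistent'."""
    nvars = len(aug[0]) - 1
    rank_coeff = rank([row[:-1] for row in aug])  # coefficient part only
    rank_aug = rank([list(row) for row in aug])
    if rank_aug > rank_coeff:
        return "inconsistent"             # a row like [0 ... 0 | nonzero]
    return "unique" if rank_coeff == nvars else "infinite"

# The counterexample above: consistent, a row of 0s, yet a unique solution (5, 2).
print(classify([[1, 0, 5], [0, 1, 2], [0, 0, 0]]))
```

The row of zeros does not force infinitely many solutions: both variable columns still have pivots, so the answer printed is "unique".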
• Some Maple Commands
Here are some Maple commands you should be pretty familiar with by now for this test, i.e. I will at times show a command, and it may be with or without its output:
> with(LinearAlgebra): with(plots):
> A:=Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]);
> ReducedRowEchelonForm(A);
> GaussianElimination(A);
(only for augmented matrices with unknown variables like k or a, b, c in the augmented matrix)
> Transpose(A);
> ConditionNumber(A);
(only for square matrices)
> Determinant(A);
> Eigenvalues(A);
> Eigenvectors(A);
> evalf(Eigenvectors(A));
> Vector([1,2,3]);
> B:=MatrixInverse(A);
> A.B;
> A+B;
> B-A;
> 3*A;
> A^3;
> evalf(M);
> spacecurve({[4*t,7*t,3*t,t=0..1],[-1*t,2*t,6*t,t=0..1]},color=red, thickness=2);
plot vectors as line segments in R3 (columns of matrices) to show whether the columns are in the same plane, etc.
> implicitplot({2*x+4*y-2,5*x-3*y-1}, x=-1..1, y=-1..1);
> display(a,b,c);
> implicitplot3d({x+2*y+3*z-3,2*x-y-4*z-1,x+y+z-2},x=-4..4,y=-4..4,z=-4..4);
plot equations of planes in R^3 (rows of augmented matrices) to look at the geometry of the intersection of the rows (i.e. 3 planes intersect in a point, a line, a plane, or have no common points)
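For by-hand checking of Maple's Determinant output, the Laplace (cofactor) expansion from 3.1 can be sketched in a few lines of Python (again just an illustration, not class code), expanding recursively along the first row with the alternating + - + sign pattern:

```python
def det(M):
    """Determinant via Laplace (cofactor) expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# A 3x3 check, expanding 2*det(minor) - 0 + 1*det(minor) across the first row:
print(det([[2, 0, 1],
           [1, 3, -1],
           [0, 2, 2]]))
```

For the 3x3 shown, the expansion gives 2(3·2 - (-1)·2) - 0 + 1(1·2 - 3·0) = 16 + 2 = 18, which is what a Maple Determinant call should confirm.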