Test 3 Study Guide: Cumulative + 2.8(146-150), 3.1(163-167), 3.2(169-172), 3.3(180-181), 5.1(265-269), 5.6(301-304) & apps

This test will be closed notes/books, but a calculator will be allowed (though not a cell phone or any other calculator bundled with additional technologies). There will be various types of questions on the test, and your grade will be based on the quality of your responses in a timed environment.

The formatting will be just like tests 1 and 2. The test is cumulative but will focus on new material. Here are

  • sample test questions and
  • solutions

As before, you will see three sections that are typeset formally (using LaTeX):

  • Fill in the Blank
  • Computations and Interpretations / Analyses
  • True/False Questions
As such, the test will be a mixture of computational and definition questions as well as critical reasoning and questions involving the "big picture." Most questions are adapted from or taken directly from homework exercises, problem set questions, and clicker questions, although they may be rephrased or repackaged to further develop critical thinking and problem-solving skills (I'm trying to help you develop your linear independence), so I suggest that you review those solutions and any related material you need to brush up on. Partial credit will be given, so (if you have time) showing your reasoning or thoughts on questions you are unsure of can help your grade.

    Here are the topics we have been focusing on: (and here is the brief overview from class)
  • Test 1 [test 1 study guide], Test 2 [test 2 study guide]. Test 3 will be mostly new material but will still include portions of previous material - questions from tests 1 and 2 may appear on test 3, and you should review any sections you need to brush up on that connect to new material (Theorem 8 with determinants, parameterizing homogeneous equations in 1.5 that are now the nullspace, the span of vectors in 1.3 that is now the column space, span in 1.3 and l.i. in 1.7 that combine to give a basis...)
  • Algebra of determinants, including the Laplace (cofactor) expansion (a worked sketch of a cofactor expansion appears after the Maple command list below)
  • The connection of determinants to the square matrix theorem 8
  • Geometry of determinants, including the impact of row operations
  • A deeper understanding of the algebra and geometry of three topics we had previously covered:
    1. Replacement Gaussian reductions t(row 1) + (row 2) [geometrically, a line parallel to row 1 through the tip of row 2]. The corresponding elementary matrix is a shear (as a linear transformation), so it preserves the area of the parallelogram formed by the vectors in the original matrix; replacement row operations eventually turn that parallelogram into a rectangle (or the parallelepiped into a rectangular prism in higher dimensions), and that area (or volume) is the determinant! (A short Maple sketch of this appears after the command list.)
    2. The span of the columns of a matrix A, which is the set of their linear combinations, is now seen as the column space. For example, if A has 2 columns, then c(column 1) + d(column 2) is a plane through the origin if the columns are l.i. and a line through the origin otherwise. If A has 3 or more columns, we need to do some computational work (like in problem set 4) to examine the column space and find a basis (l.i. + span) by using the original pivot columns.
    3. The solutions of the linear independence equation Ax = 0 (for the columns of a matrix A), which geometrically are the intersection of the rows of the homogeneous augmented matrix, form the nullspace, and we parameterize those solutions to find a basis. (A column space/nullspace sketch appears after the Maple command list.)
  • Algebra and geometry of subspaces, basis, column space and null space and the connections to previous material like linear independence and span (see above)
  • Algebra and geometry of eigenvalues and eigenvectors [Ax = lambda x, (A - lambda I)x = 0; matrix multiplication turns into scalar multiplication for eigenvectors, so an eigenvector and its image are on the same line through the origin]
  • Showing that solving Ax = lambda x is equivalent to solving the system (A - lambda I)x = 0 (i.e., the eigenvectors of A for a given lambda are the nonzero vectors in the nullspace of (A - lambda I)). We concentrated on 2x2 matrices.
  • Understanding that since we want nontrivial solutions of (A - lambda I)x = 0, we solve for lambda using determinant(A - lambda I) = 0 (otherwise the system would have only the trivial solution, the 0 vector - so we want the columns of (A - lambda I) to be linearly dependent), and being able to solve for the lambdas given a 2x2 matrix (a worked 2x2 sketch appears after the Maple command list).
  • 2x2 matrices which have a certain number of real eigenvalues (0, 1, or 2) and eigenvectors (0 or infinitely many) or linearly independent eigenvectors (0, 1, or 2).
  • Maple output of the Eigenvalues and Eigenvectors command
  • linear transformations of the plane and their corresponding eigenvalues and eigenvectors (projections, dilations, reflections, rotations, and shear matrices)
  • Eigenvector decomposition of a 2x2 matrix A when the eigenvectors form a basis for R^2, and the long-term behavior of dynamical systems, including the limit and population ratios in situations of asymptotic behavior as well as stability (lambda = 1) for various initial conditions (if a2 = 0..., otherwise we die off, grow, stabilize..., including the equations of the lines we come in along) or predation parameters, like in problem set 4. (A small iteration sketch appears after the Maple command list.)
  • Filling in blanks like: If ___ equals 0, then we die off along the line ____ [corresponding to the eigenvector ____], and in all other cases we [choose one: die off, grow, or hit it and then stay fixed] along the line ____ [corresponding to the eigenvector ____].
  • Identifying population ratios in the long term (from the dominant eigenvector)
  • Identifying population growth or death rate in the long term (from the largest eigenvalue - how much above or below 1 it is)
  • Rough sketch of a graph of the trajectory diagram when you begin in the first quadrant and start off the eigenvectors (and on them too). For example, in stability situations when one eigenvalue is 1: say x_k = a1*(1)^k*Vector([2,1]) + a2*(0.7)^k*Vector([-1,1]); then as long as a1 is nonzero, we will stabilize to the y = (1/2)x line with a population ratio of 2:1. Graphically you should be able to draw pictures like in the Dynamical Systems demo and problem set solutions. In this specific example, you can tell from the algebra that, given a starting position, you will come in parallel to Vector([-1,1]) (i.e., along a line where x + y is constant) until we eventually hit the stability line, where we stay forever, and that the contribution from Vector([-1,1]) is smaller and smaller with each k, which is also represented in the picture. In other situations we approach asymptotically.
  • Fractions in Maple versus errors with decimals in Maple, or errors in other Maple outputs (i.e., the importance of critical reasoning/by-hand reasoning along with Maple).
  • Maple output of eigenvectors giving one basis representative for each line of eigenvectors, or basis representatives for R^2, or Maple outputting a column of 0s as an eigenvector, like for a shear matrix (which tells us that the eigenvectors will not form a basis for R^2, because there won't be 2 linearly independent ones). A shear/rotation sketch appears after the Maple command list.
  • Some Maple Commands: here are some Maple commands you should be pretty familiar with by now for this test - i.e. I will at times show a command, and it may be with or without its output:
    > with(LinearAlgebra): with(plots):
    > A:=Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]);
    > ReducedRowEchelonForm(A);
    > GaussianElimination(A);
    (only for augmented matrices with unknown variables like k or a, b, c in the augmented matrix; a short sketch of this appears after the command list)
    > Transpose(A);
    > ConditionNumber(A);
    (only for square matrices)
    > Determinant(A);
    > Eigenvalues(A);
    > Eigenvectors(A);
    > evalf(Eigenvectors(A));
    > Vector([1,2,3]);
    > B:=MatrixInverse(A);
    > A.B;
    > A+B;
    > B-A;
    > 3*A;
    > A^3;
    > evalf(M);
    > spacecurve({[4*t,7*t,3*t,t=0..1],[-1*t,2*t,6*t,t=0..1]},color=red, thickness=2);
    plot vectors as line segments in R^3 (columns of matrices) to show whether the columns are in the same plane, etc.
    > implicitplot({2*x+4*y-2,5*x-3*y-1}, x=-1..1, y=-1..1);
    > display(a,b,c);
    > implicitplot3d({x+2*y+3*z-3,2*x-y-4*z-1,x+y+z-2},x=-4..4,y=-4..4,z=-4..4);
    plot equations of planes in R^3 (rows of augmented matrices) to look at the geometry of the intersection of the rows (i.e. 3 planes intersect in a point, a line, a plane, or have no common points)
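
To connect the commands above with the topics, here are a few worked sketches. The specific matrices and names in these sketches (Ak, M3, E, B2, A2, S, R, Adyn, x0, ...) are just illustrations I chose for this guide, not necessarily examples from class, and they assume with(LinearAlgebra): with(plots): has already been run - so try them yourself and check the output.

First, GaussianElimination keeps an unknown like k symbolic, which is why it (and not ReducedRowEchelonForm, which may divide by an expression involving k and hide a special case) is the command to use for augmented matrices with unknowns:
    > Ak := Matrix([[1, 2, 3], [2, 4, k]]);     # augmented matrix for x + 2y = 3, 2x + 4y = k (a made-up example)
    > GaussianElimination(Ak);                  # last row becomes [0, 0, k-6], so the system is consistent exactly when k = 6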
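
Determinants and the Laplace (cofactor) expansion: a quick check that expanding along the first row matches Maple's Determinant command, using a 3x3 matrix I made up:
    > M3 := Matrix([[1, 2, 0], [3, -1, 2], [0, 1, 4]]);
    > Determinant(M3);                          # -30
    > 1*Determinant(Matrix([[-1,2],[1,4]])) - 2*Determinant(Matrix([[3,2],[0,4]])) + 0*Determinant(Matrix([[3,-1],[0,1]]));
    (the cofactor expansion along row 1, with signs +, -, +, which should also give -30)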
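
Replacement row operations and shears: the elementary matrix for t(row 1) + (row 2) is a shear with determinant 1, so it does not change the determinant (area) of the matrix it multiplies. B2 is just a sample 2x2 matrix:
    > E := Matrix([[1, 0], [t, 1]]);            # elementary matrix for the replacement t*(row 1) + (row 2)
    > Determinant(E);                           # 1: a shear preserves area
    > B2 := Matrix([[2, 1], [1, 3]]);
    > Determinant(E.B2);                        # equals Determinant(B2) (namely 5), no matter what t is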
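
Column space and nullspace from the reduced row echelon form, using the same 3x4 matrix A as in the command list above (here the pivots should land in columns 1, 2, and 3 - but verify with the Maple output):
    > A := Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]);
    > ReducedRowEchelonForm(A);                 # pivots in columns 1, 2, 3; column 4 is free
    (so columns 1, 2, and 3 of the ORIGINAL A form a basis for the column space, and for Ax = 0 we set the free variable x4 = t and back-substitute to parameterize the nullspace, a line in R^4 with one basis vector)
    > NullSpace(A);                             # a LinearAlgebra command (not in the list above) you can use to check the by-hand parameterization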
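
Eigenvalues of a 2x2 by hand versus Maple: set determinant(A - lambda I) = 0 and solve for lambda, then compare with the Eigenvectors command. A2 is a sample matrix I picked because it works out nicely:
    > A2 := Matrix([[2, 1], [1, 2]]);
    > Determinant(A2 - lambda*IdentityMatrix(2));   # the characteristic polynomial, which works out to lambda^2 - 4*lambda + 3
    > solve(% = 0, lambda);                         # lambda = 1 and lambda = 3
    > Eigenvectors(A2);
    (the eigenvectors lie along y = -x for lambda = 1 and along y = x for lambda = 3; Maple's scaling of the basis representatives may differ from yours)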
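
Eigenvectors of some standard plane transformations: a shear has only one line of eigenvectors (so Maple pads the output with a column of 0s and there is no eigenvector basis of R^2), while a rotation by 90 degrees has no real eigenvalues at all:
    > S := Matrix([[1, 1], [0, 1]]);            # horizontal shear
    > Eigenvectors(S);                          # eigenvalue 1 repeated; the eigenvector matrix has a column of 0s
    > R := Matrix([[0, -1], [1, 0]]);           # rotation by 90 degrees
    > Eigenvectors(R);                          # complex eigenvalues I and -I, so no real eigenvectors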
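
Eigenvector decomposition and long-term behavior: a sample 2x2 system I set up to have eigenvalues 1 and 7/10 with eigenvectors along [2,1] and [-1,1] (an illustration, not a problem set matrix). Since the dominant eigenvalue is 1 with eigenvector [2,1], any starting population with a1 nonzero stabilizes along y = (1/2)x in a 2:1 ratio:
    > Adyn := Matrix([[9/10, 1/5], [1/10, 4/5]]);
    > Eigenvectors(Adyn);                       # eigenvalues 1 and 7/10; eigenvectors along [2,1] and [-1,1] (Maple's scaling may differ)
    > x0 := Vector([10, 10]);
    > evalf((Adyn^50).x0);                      # roughly [13.33, 6.67]: the 2:1 ratio, reached by coming in parallel to [-1,1]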