Exam 2 Study Guide: Cumulative + 1.8, 1.9, 2.7, 2.8, 3.1-3.3, 5.1, 5.2, parts of 5.6, 6.1, and applications

At the Test

  • You may make yourself some reference notes on the small card I hand out (additional cards are on my door if you need to rewrite it). The reference card must be handwritten. Think of the card as a way to include some important examples or concepts that you aren't as comfortable with. You won't have room for everything, and you should try to internalize as much as you can.
  • A calculator will be allowed (but no cell phones nor calculators bundled with other technologies), though it is not required.
  • You may have food, hydration, ear plugs, or similar out if they will help you (however, any ear plugs must be standalone--no cell phone, internet, or other technological connections).
  • There will be various types of questions on the test and your grade will be based on the quality of your responses in a timed environment.
  • Partial credit will be given, so (if you have time) showing your reasoning or thoughts on questions you are unsure of can help your grade.

    Here is a sample partial test and solutions
    so that you can see an example of the formatting and style of the questions. As listed there, you will see three sections that are typeset professionally (using LaTeX):

  • Fill in the blank
  • Computations and Interpretations / Analyses
  • True/False Questions
    As such, the test will be a mixture of computational and definition questions as well as critical reasoning and questions involving the "big picture." Some students in the past have reported that they found it helpful to go through the sample partial test, review other problems we covered, skim solutions, and read through the glossary on ASULearn. Videos are available to help you review. I'm happy to help you in office hours, on Zoom, or on the ASULearn forums--for example, you can bring in the sample test and go through it with me, or ask me any other questions you have. We'll also spend part of the class before the test on some review activities, and I will take some class time to answer any questions.

    Problems Covered

    Most questions are adapted from or taken right from exercises we had for homework, problem set questions, and clicker questions, although they may be rephrased or repackaged to further develop critical thinking and problem-solving skills (I'm trying to help you develop your linear independence!), so I suggest that you review those solutions on ASULearn and from notes you took during class. I am happy to help you with any related material you need to brush up on.

    worksheet on guess the transformation
    1.9 24 a & 24 e
    2.7 1, 3, 5, 7, 9
    1.6 7, 9, 10, 14, 15, 16, 28
    worksheet on definitions in 2.7 and 1.6
    clicker questions on linear transformations
    glossary for linear transformations
    3.1 1, 15, 21, 25, 39b, 46
    3.2 37, 42
    3.3 19, 23, 25
    worksheet on determinants
    clicker questions in chapter 3
    2.8 11, 13, 17, 21 c, d, 23, 38
    worksheet on 2.8
    clicker questions in 2.8
    5.1 2, 3, 11, 21 c, d, e, 24, 31, 33, 36
    5.2 1, 2
    5.6 1, 2, 3, 5, 7, 10
    worksheet on chapter 5
    worksheet on definitions in 3, 2.8, and 5
    clicker questions in chapter 5
    glossary for 6.1, chapter 3, and chapter 5 terms
    review worksheet #1
    review worksheet #2
    Problem Set 4

    Topics Covered

    review slides
    class highlights page
  • Brush up on any exam 1 material. Exam 2 will consist mostly of new material, but questions from exam 1 may appear on exam 2. Many newer concepts extend prior ones (theorem 8 in 2.3 with determinants; parameterizing homogeneous equations in 1.5, which now give the nullspace; the span of vectors in 1.3, which is now the column space; span in 1.3 and l.i. in 1.7, which combine to give a basis...).
  • Linear Transformations of the plane, both 2x2 as well as 3x3 homogeneous coordinates versions. Know the following (which are on the review LaTeX slide):
    general rotation matrix
    projections onto the y=x line, and the x and y axes
    reflections across the y=x line, and the x and y axes
    horizontal shear
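    As a quick self-check, here is a sketch in Python (rather than Maple) of how some of these standard matrices act on test vectors; the specific 2x2 entries are the usual ones from the review slide:

    ```python
    import math

    def mat_vec(M, v):
        """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
        return [M[0][0]*v[0] + M[0][1]*v[1],
                M[1][0]*v[0] + M[1][1]*v[1]]

    def rotation(theta):
        """General rotation matrix by angle theta (radians), counterclockwise."""
        return [[math.cos(theta), -math.sin(theta)],
                [math.sin(theta),  math.cos(theta)]]

    # Standard transformations of the plane
    reflect_y_eq_x = [[0, 1], [1, 0]]     # reflection across the line y = x
    project_x_axis = [[1, 0], [0, 0]]     # projection onto the x-axis
    shear_horizontal = [[1, 1], [0, 1]]   # horizontal shear (shear factor 1)

    # Rotating (1, 0) by 90 degrees should give (0, 1)
    rotated = mat_vec(rotation(math.pi / 2), [1, 0])
    print([round(c, 10) for c in rotated])   # [0.0, 1.0]
    print(mat_vec(reflect_y_eq_x, [3, 5]))   # [5, 3]
    print(mat_vec(project_x_axis, [3, 5]))   # [3, 0]
    print(mat_vec(shear_horizontal, [3, 5])) # [8, 5]
    ```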
  • Rotate about a point, like (4,9): Translate by (4,9).Rotate.Translate by (-4,-9)
  • Composition of linear transformations: right to left (ABCx means first C(x) then B applied to that then A applied to that result, just like with function composition)
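    These two ideas combine in the rotate-about-a-point composition. A sketch in Python (rather than Maple) using 3x3 homogeneous-coordinates matrices; the center (4,9) should be the fixed point, and the wrong order of composition should not fix it:

    ```python
    import math

    def mat_mul(A, B):
        """Multiply two 3x3 matrices (lists of rows)."""
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def mat_vec(M, v):
        """Multiply a 3x3 matrix by a homogeneous-coordinates vector."""
        return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

    def translate(a, b):
        """3x3 homogeneous-coordinates translation by (a, b)."""
        return [[1, 0, a], [0, 1, b], [0, 0, 1]]

    def rotate(theta):
        """3x3 homogeneous-coordinates rotation about the origin."""
        c, s = math.cos(theta), math.sin(theta)
        return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

    # Rotate about (4,9): Translate by (4,9) . Rotate . Translate by (-4,-9).
    # Reading right to left: move (4,9) to the origin, rotate, then move back.
    M = mat_mul(translate(4, 9), mat_mul(rotate(math.pi / 2), translate(-4, -9)))

    center = mat_vec(M, [4, 9, 1])
    print([round(c, 10) for c in center])  # [4.0, 9.0, 1.0] -- the center stays fixed

    # Composing in the wrong order does NOT fix the center:
    wrong = mat_mul(translate(-4, -9), mat_mul(rotate(math.pi / 2), translate(4, 9)))
    print([round(c, 10) for c in mat_vec(wrong, [4, 9, 1])])  # not (4, 9)
    ```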
  • If the composition is in the wrong order, then it won't give us the intended action, like a car flying off a track.
  • Length and angle of a vector and orthogonal vectors and their use in computer graphics applications (turning a car on a track and preserving the size)
  • Big picture ideas of Yoda and transpose of a matrix
  • Computer speed of using associativity on (AB).Large matrix=A(B.Large matrix) including counting the number of multiplications and reasoning that (AB).Large matrix is much faster
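    The multiplication count can be sketched in Python (multiplying a p x q matrix by a q x r matrix takes p*q*r scalar multiplications); the m = 1000 points here are an illustrative assumption:

    ```python
    def mult_count(p, q, r):
        """Scalar multiplications to multiply a p x q matrix by a q x r matrix."""
        return p * q * r

    # A and B are 3x3 graphics matrices; the Large matrix holds m points as columns (3 x m).
    m = 1000

    # (AB).Large: form the 3x3 product AB once, then apply it to all m points.
    cost_grouped = mult_count(3, 3, 3) + mult_count(3, 3, m)

    # A(B.Large): apply B to all m points, then apply A to all m results.
    cost_ungrouped = mult_count(3, 3, m) + mult_count(3, 3, m)

    print(cost_grouped, cost_ungrouped)  # 9027 vs 18000: (AB).Large is much faster
    ```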
  • Algebra of determinants, including the Laplace expansion
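    The Laplace (cofactor) expansion can be sketched recursively in Python (expanding along the first row); the 3x3 matrix here is just an example to check by hand:

    ```python
    def det(M):
        """Determinant via Laplace (cofactor) expansion along the first row."""
        n = len(M)
        if n == 1:
            return M[0][0]
        total = 0
        for j in range(n):
            # Minor: delete row 0 and column j
            minor = [row[:j] + row[j+1:] for row in M[1:]]
            total += (-1) ** j * M[0][j] * det(minor)
        return total

    A = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 10]]
    print(det(A))  # 1*(50-48) - 2*(40-42) + 3*(32-35) = 2 + 4 - 9 = -3
    ```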
  • The connection of determinants to the square matrix theorem 8
  • Geometry of determinants, including the impact of row operations
  • A deeper understanding of the algebra and geometry of three topics we had previously covered:
    1. replacement Gaussian reductions t (row 1) + (row 2) [the new row is parallel to row 1 and passes through the tip of row 2. The elementary matrix is a shear (as a linear transformation) that preserves the area of the parallelogram formed by the vectors in the original matrix; replacement row operations eventually turn that parallelogram into a rectangle (or turn the parallelepiped into a rectangular prism in higher dimensions), whose area (or volume) is the same as the determinant!]
    2. The span of the columns of a matrix A, which is the set of linear combinations, is now seen as the column space. For example, if A has 2 columns, then c(column 1) +d (column 2) is a plane through the origin if the columns are l.i. and a line through the origin otherwise. If A has 3 or more columns, we need to do some computational work (like in problem set 4) to examine the column space and find a basis (l.i. + span) by using the original pivot columns
    3. The solutions to Ax=0, the equation for linear independence (of the columns of a matrix A), form the nullspace--geometrically, the intersection of the rows of this homogeneous augmented matrix--and we parameterize those solutions to find a basis.
  • Algebra and geometry of subspaces, basis, column space and null space and the connections to previous material like linear independence and span (see above)
  • Algebra and geometry of eigenvalues and eigenvectors [Ax = lambda x, (A - lambda I)x = 0; matrix multiplication turns into scalar multiplication for eigenvectors, so they are vectors that stay on the same line through the origin]
  • Showing that solving Ax = lambda x is equivalent to solving the system (A - lambda I)x = 0 (i.e. the eigenvectors of A are the nonzero vectors in the nullspace of (A - lambda I)). We concentrated on 2x2 matrices.
  • Understanding that since we want nontrivial solutions of (A - lambda I)x = 0, we solve for lambda using determinant(A - lambda I) = 0 (since otherwise the system would have only the trivial 0 vector solution--so we want the columns of (A - lambda I) to be linearly dependent), and being able to solve for the lambdas given a 2x2 matrix.
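    For a 2x2 matrix, determinant(A - lambda I) = 0 expands to lambda^2 - (trace)lambda + (determinant) = 0, so the eigenvalues come from the quadratic formula. A sketch in Python (the example matrices are assumptions for illustration):

    ```python
    import math

    def eigenvalues_2x2(A):
        """Real eigenvalues of a 2x2 matrix from det(A - lambda*I) = 0,
        i.e. the characteristic equation lambda^2 - trace*lambda + det = 0."""
        (a, b), (c, d) = A
        trace, det = a + d, a * d - b * c
        disc = trace * trace - 4 * det
        if disc < 0:
            return []  # no real eigenvalues (e.g. a rotation matrix)
        root = math.sqrt(disc)
        return sorted([(trace - root) / 2, (trace + root) / 2])

    print(eigenvalues_2x2([[0.9, 0.2], [0.1, 0.8]]))  # approximately [0.7, 1.0]
    print(eigenvalues_2x2([[0, -1], [1, 0]]))         # [] (rotation by 90 degrees)
    ```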
  • 2x2 matrices which have a certain number of real eigenvalues (0,1,2) and eigenvectors (0 or infinitely many) or linearly independent eigenvectors (0,1,2).
  • Maple output of the Eigenvalues and Eigenvectors command
  • Linear transformations of the plane and their corresponding eigenvalues and eigenvectors (projections, dilations, reflections, rotations, and shear matrices)
  • Eigenvector decomposition of a 2x2 matrix A when the eigenvectors form a basis for R2, and the long-term behavior of dynamical systems: the limit and population ratios in situations of asymptotic behavior as well as stability (lambda = 1) for various initial conditions (if a2 = 0... otherwise we die off, grow, stabilize..., including the equations of the lines we come in along) or predation parameters, like in problem set 4.
  • Filling in blanks like: If ___ equals 0 then we die off along the line ____ [corresponding to the eigenvector ____], and in all other cases we [choose one: die off, grow, or hit and then stay fixed] along the line ____ [corresponding to the eigenvector ____].
  • Identifying population ratios in the long term (from the dominant eigenvector)
  • Identifying population growth or death rate in the long term (from the largest eigenvalue - how much above or below 1 it is)
  • Rough sketch of a graph of the trajectory diagram when you begin in the first quadrant and start off the eigenvectors (and on them too). For example, in stability situations when one eigenvalue is 1: say x_k = a1 (1)^k Vector([2,1]) + a2 (0.7)^k Vector([-1,1]); then as long as a1 is nonzero, we will stabilize to the y = (1/2)x line via the population ratio of 2:1. Graphically you should be able to draw pictures like those in the Dynamical Systems demo and problem set solutions. In this specific example, you can tell from the algebra that given a starting position, you will come in parallel to Vector([-1,1]) (i.e. along lines of the form x + y = c) until we eventually hit the stability line, where we stay forever, and that the contribution from Vector([-1,1]) gets smaller and smaller with each k, which is also represented in the picture. In other situations we approach asymptotically.
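    A specific example of this kind can be checked numerically. A Python sketch (the matrix A = [[0.9, 0.2], [0.1, 0.8]] is an assumption chosen so its eigenvalues are 1 and 0.7 with eigenvectors [2,1] and [-1,1], matching the decomposition above):

    ```python
    # A has eigenvalues 1 and 0.7 with eigenvectors [2, 1] and [-1, 1],
    # so x_k = a1 * 1**k * [2, 1] + a2 * 0.7**k * [-1, 1].
    A = [[0.9, 0.2], [0.1, 0.8]]
    x = [1.0, 0.0]  # initial populations; here a1 = 1/3, a2 = -1/3

    for k in range(60):
        x = [A[0][0]*x[0] + A[0][1]*x[1],
             A[1][0]*x[0] + A[1][1]*x[1]]

    # The 0.7**k contribution has died off; we sit on the stability line
    # y = (1/2)x with long-term population ratio 2:1, at a1 * [2, 1].
    print(round(x[0] / x[1], 6))     # 2.0
    print([round(c, 6) for c in x])  # [0.666667, 0.333333], i.e. (1/3)*[2, 1]
    ```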
  • Fractions in Maple, versus errors with decimals in Maple or errors in other Maple outputs (i.e. the importance of critical reasoning/by-hand reasoning along with Maple).
  • Maple output of eigenvectors giving one basis representative for each line, or basis representatives for R2, or Maple outputting a column of 0s as an eigenvector, like for a shear matrix, (which tells us that the eigenvectors will not form a basis for R2 - because they won't be 2 linearly independent ones).
  • Some Maple Commands: Here are some Maple commands you should be pretty familiar with by now for this test--i.e. I will at times show a command, and it may be with or without its output:
    > with(LinearAlgebra): with(plots):
    > A:=Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]);
    > ReducedRowEchelonForm(A);
    > GaussianElimination(A);
    (only for augmented matrices with unknown variables like k or a, b, c in the augmented matrix)
    > Transpose(A);
    > ConditionNumber(A);
    (only for square matrices)
    > Determinant(A);
    > Eigenvalues(A);
    > Eigenvectors(A);
    > evalf(Eigenvectors(A));
    > Vector([1,2,3]);
    > B:=MatrixInverse(A);
    > A.B;
    > A+B;
    > B-A;
    > 3*A;
    > A^3;
    > evalf(M);
    > spacecurve({[4*t,7*t,3*t,t=0..1],[-1*t,2*t,6*t,t=0..1]},color=red, thickness=2);
    plot vectors as line segments in R3 (columns of matrices) to show whether the columns are in the same plane, etc.
    > implicitplot({2*x+4*y-2,5*x-3*y-1}, x=-1..1, y=-1..1);
    > display(a,b,c);
    > implicitplot3d({x+2*y+3*z-3,2*x-y-4*z-1,x+y+z-2},x=-4..4,y=-4..4,z=-4..4);
    plot equations of planes in R^3 (rows of augmented matrices) to look at the geometry of the intersection of the rows (ie 3 planes intersect in a point, a line, a plane, or no common points)