Test 3 Study Guide
This test will be closed to notes and books, but a calculator will be
allowed (no cell phones or other calculators bundled with
additional technologies).
There will be various components
to the test, and your grade will be based on the
quality of your responses in a timed environment (the test must be turned
in by the end of class).
Be sure to study tests 1 and 2 and any related material
that you need to brush up on (test 3 will be comprehensive), along with
the PS 6 solutions and the
recent activities and demos on the class highlights page and ASULearn.
In the book, portions come from 7.1, 7.2, portions of 7.4, and some of
chapter 6.
Specifically, here are the topics we have been focusing on:
test 1 and 2 material (be sure to study both tests and any related material
that you feel you need to brush up on)
algebra and geometry of eigenvectors [Ax = lambda x,
(lambda I - A)x = 0,
matrix multiplication turns into scalar multiplication for eigenvectors,
so they are vectors that stay on the same line through the origin]
number of eigenvectors, number of linearly independent eigenvectors
basis of eigenvectors
diagonalizability
linear transformations of the plane and their corresponding
eigenvalues and eigenvectors (projections, dilations, reflections,
rotations, and shear matrices)
revisiting previous problems - healthy and sick workers, Brand A, B,
neither, and understanding them at a deeper level
eigenvector decomposition and what happens in the limit (directions,
ratios...)
for various initial conditions (if a2 = 0..., otherwise we die off, grow,
stabilize...), including the equations of the lines we come in along.
geometry of stability situations [for example, if
x_k = a1 (1)^k Vector([2,1]) + a2 (0.7)^k Vector([-1,1]), then
as long as a1 is nonzero, we will stabilize to the y = (1/2)x line via
the population ratio of 2:1. Graphically, you should be able to draw
pictures like those in the Dynamical Systems demo - see the second-to-last
picture in the demo for this situation. You can tell from the algebra that,
given a starting position, you will come in parallel to Vector([-1,1]) (i.e.
x+y=1) until we eventually hit the stability line, where we stay forever,
and that the contribution from Vector([-1,1]) gets smaller and smaller with
each k, which is also represented in the picture.]
fractions in Maple, versus errors with decimals in Maple
Maple output of eigenvectors giving one basis representative for each
line, or 2 basis representatives for each plane.
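The eigenvector decomposition story above can also be checked numerically. Here is a minimal Python sketch (an added illustration, not from the course materials) using an assumed 2x2 stochastic matrix chosen so that its eigenvalues are 1 and 0.7 with eigenvectors [2,1] and [-1,1]; iterating x_{k+1} = A x_k shows the 0.7^k contribution dying off and the population ratio stabilizing to 2:1.

```python
# Assumed example matrix (hypothetical, chosen so its eigenvalues are
# 1 and 0.7 with eigenvectors [2,1] and [-1,1]); its columns sum to 1,
# so it is stochastic and the total population is preserved.
A = [[0.9, 0.2],
     [0.1, 0.8]]

def mat_vec(M, v):
    # 2x2 matrix times a 2-vector
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

# Check the defining property Ax = lambda x for both eigenvectors:
print(mat_vec(A, [2, 1]))    # lambda = 1: stays on the [2,1] line
print(mat_vec(A, [-1, 1]))   # lambda = 0.7: approximately [-0.7, 0.7]

# Iterate x_{k+1} = A x_k from an initial condition with a1 nonzero;
# the 0.7^k term shrinks each step, so x/y stabilizes to the 2:1 ratio.
x = [1.0, 0.0]
for k in range(60):
    x = mat_vec(A, x)
print(x[0] / x[1])           # approaches 2
```

Changing the initial condition changes a1 and a2 but not the limiting direction, as long as a1 stays nonzero.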
In addition to the above, from
the ASULearn solutions and your old tests, besides reviewing them
in general, also be sure to carefully go over:
PS 1 k or a, b, c problem,
PS 3 Healthy/Sick Workers and Practice Problem Brand A, Brand B,
Neither matrix: setting up stochastic matrices and stability as eigenvalue
of 1
PS 4 and PS 5 cement mixing problems,
PS 6 rotation matrix eigenvalue problem,
PS 6 fox problem
Summary of Equivalent Conditions for Square Matrices on p. 239 (don't worry about Rank as we did not cover this.)
vector space axioms 1 and 6 and their negations (i.e., the subspace axioms),
definitions and examples of basis, linear independence and span
definitions and examples of eigenvalues and eigenvectors
You can expect to see problems which are similar to these and/or
problems that begin with a question like you've seen in an earlier section,
but use some of our recent language to explore further.
You can also expect problems which ask you to give examples,
such as
examples from tests 1 and 2, like a matrix A and vector b so that
Ax=b has a certain number of solutions, and span and linear independence
examples in R^2 and R^3
(see the test 2 study guide for comments on span and
linear independence, along with test 2).
examples of 2x2 matrices which have a certain number of
real eigenvalues (0,1,2) and eigenvectors (0 or infinitely many) or
linearly independent eigenvectors (0,1,2).
For example, if I asked you to produce an
example of a matrix with 1 eigenvalue and 1 linearly independent eigenvector,
then Matrix([[1,0],[1,1]]) would work (notice, though, that
it has infinitely many
eigenvectors that are not linearly independent, because constant multiples of
an eigenvector produce other eigenvectors, as anything on that same line
through the origin stays on that line; this is a vertical shear matrix
with just the y-axis having eigenvalue 1, and a basis for that line
can be represented by [0,1]).
For 2x2 matrices, we cannot have more than 2 linearly
independent eigenvectors because they would form a basis for R^2,
which has at most 2 linearly independent vectors in a basis, [but we can find
examples of 3x3 matrices that have exactly 3 linearly independent
eigenvectors].
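To make the shear example above concrete, here is a small Python check (my own illustrative sketch) that [0,1] is an eigenvector of Matrix([[1,0],[1,1]]) with eigenvalue 1, that its multiples stay on the y-axis, and that a vector off that line is not an eigenvector:

```python
# Vertical shear matrix from the text, Matrix([[1,0],[1,1]])
A = [[1, 0],
     [1, 1]]

def mat_vec(M, v):
    # 2x2 matrix times a 2-vector
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

print(mat_vec(A, [0, 1]))   # [0, 1]: fixed, so eigenvalue 1
print(mat_vec(A, [0, 3]))   # [0, 3]: multiples stay on the y-axis
print(mat_vec(A, [1, 0]))   # [1, 1]: not a multiple of [1, 0],
                            # so [1, 0] is not an eigenvector
```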
Using geometric intuition can help quite a bit for problems too. For example,
we can use the geometry of a rotation, projection, etc., in order to explain
what (if any) the eigenvalues and eigenvectors are, and to generate examples
quickly, as we discussed in class; you should know these from the class notes.
Review why most rotation matrices have no real eigenvalues or eigenvectors
(same line through the origin arguments), why a projection matrix has
an eigenvalue 1 corresponding to [cos(theta), sin(theta)] and
an eigenvalue 0 corresponding to [-sin(theta), cos(theta)], etc.
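As a sanity check on the projection geometry just described, the sketch below (an added illustration; it uses the standard matrix for projection onto the line at angle theta, with an arbitrary choice of theta) verifies the eigenvalue 1 and eigenvalue 0 directions:

```python
import math

theta = math.pi / 6   # arbitrary angle; any choice works

c, s = math.cos(theta), math.sin(theta)
# Standard matrix for projection onto the line at angle theta
P = [[c*c, c*s],
     [c*s, s*s]]

def mat_vec(M, v):
    # 2x2 matrix times a 2-vector
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

on_line  = [c, s]    # direction along the line of projection
off_line = [-s, c]   # perpendicular direction

print(mat_vec(P, on_line))   # ~[cos(theta), sin(theta)]: eigenvalue 1
print(mat_vec(P, off_line))  # ~[0, 0]: eigenvalue 0
```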
Also think about some broader connections of the class - applications
that we have covered, algebraic and geometric perspectives, how
calculus has played a role, some of the historical perspectives...
Some Maple Commands
Here are some Maple commands you should be pretty familiar with by now
for this test - i.e., I will at times show a command,
with or without
its output:
> with(LinearAlgebra): with(plots):
> A:=Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]);
> ReducedRowEchelonForm(A);
> GaussianElimination(A); (only for augmented
matrices with unknown variables like
k or a, b, c)
> B:=MatrixInverse(A);
> A.B;
> A+B;
> B-A;
> 3*A;
> A^3;
> evalf(A^100); or evalf((A^100).U); (be
careful to use fractions for stochastic matrices)
> Determinant(A);
> Vector([1,2,3]);
> Eigenvectors(M);
> Eigenvalues(M);
> evalf(Eigenvectors(M));
> spacecurve({[4*t,7*t,3*t,t=0..1],[-1*t,2*t,6*t,t=0..1]},color=red, thickness=2);
plot vectors as line segments in R^3
(columns of matrices) to show whether the columns are in the same plane,
etc.
> implicitplot({2*x+4*y-2,5*x-3*y-1}, x=-1..1, y=-1..1);
> implicitplot3d({x+2*y+3*z-3,2*x-y-4*z-1,x+y+z-2},x=-4..4,y=-4..4,z=-4..4);
plot equations of planes in R^3 (rows of augmented matrices) to look
at the geometry of the intersection of the rows (i.e., 3 planes intersect in
a point, a line, a plane, or have no common points)
There will be some fill in the blank short answer questions, such as
providing:
definitions related to any of the above topics, including test 1 and 2
material
real-life applications, like ____ is a real-life application of matrix
inversion (where the natural answer would be the Hill cipher)
fill in the blank related to computations, examples and
interpretations
There will be some by-hand computations and interpretations,
like those you have had previously for homework, clicker questions
and in the problem sets.
Derivations
There will be some
short derivations - the same as we've seen before, like:
Those on test 1 study guide and
test 2 study guide
Derivation that solving Ax = lambda x is equivalent to
solving the system
(lambda I - A)x = 0
Derivation that for eigenvectors x of A, A^k x =
lambda^k x
Derivation that
A P = P times the diagonal matrix of eigenvalues [which is how we showed that
MatrixInverse(P).A.P = Diag]
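Those derivations can also be confirmed numerically. Here is a Python sketch (my own added check, using an assumed matrix A = [[4,1],[2,3]] whose eigenvalues are 5 and 2 with eigenvectors [1,1] and [1,-2]) of both A^k x = lambda^k x and AP = PD:

```python
# Assumed example: A has eigenvalues 5 and 2 with eigenvectors [1,1], [1,-2]
A = [[4, 1],
     [2, 3]]

def mat_vec(M, v):
    # 2x2 matrix times a 2-vector
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

def mat_mul(M, N):
    # 2x2 matrix times a 2x2 matrix
    return [[M[0][0]*N[0][0] + M[0][1]*N[1][0], M[0][0]*N[0][1] + M[0][1]*N[1][1]],
            [M[1][0]*N[0][0] + M[1][1]*N[1][0], M[1][0]*N[0][1] + M[1][1]*N[1][1]]]

# A^k x = lambda^k x: apply A three times to the eigenvector [1,1] (lambda = 5)
x = [1, 1]
for _ in range(3):
    x = mat_vec(A, x)
print(x)                 # [125, 125] = 5^3 * [1, 1]

# AP = PD, with the eigenvectors as columns of P and eigenvalues on D's diagonal
P = [[1, 1],
     [1, -2]]
D = [[5, 0],
     [0, 2]]
print(mat_mul(A, P))     # the two products agree
print(mat_mul(P, D))
```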
True/False
True/False statements and counterexamples have been a recurring theme
in all of the chapters, so you can expect problems like those we have seen
before in practice problems and problem sets. Be sure that you know
how to find counterexamples.
There will be questions where you answer true or false and, if
false, you will either correct the text after the
word "then" (note that this does not mean simply changing "equal" to
"not equal", for example) or provide a counterexample.
For example,
It is possible to find a matrix with 2 linearly independent eigenvectors
It is not possible to find a 2x2 matrix with 3 linearly independent
eigenvectors
It is false that eigenvectors allow us to turn matrix multiplication
into addition [should be scalar multiplication]
algebra and geometry of objects like Gaussian reductions, such as
t(row 1) + (row 2) [parallel to row 1, through the tip of row 2, and the
operation preserves area or volume, which is the determinant], or linear
combinations like c(column 1) + d(column 2) [a plane through the origin if
the columns are linearly independent, and a line through the origin
otherwise].
Others mentioned above before the Maple commands
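For the determinant remark in the Gaussian reduction item above, a quick Python check (an added illustration with an arbitrary matrix and a few arbitrary multipliers) shows that replacing row 2 with t(row 1) + (row 2) leaves the 2x2 determinant, and hence the area, unchanged:

```python
def det2(M):
    # determinant of a 2x2 matrix, which measures (signed) area
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

A = [[2, 1],
     [1, 3]]            # arbitrary example matrix; det2(A) = 5

for t in [-2, 0.5, 3]:  # a few arbitrary multipliers
    # Gaussian reduction step: t*(row 1) + (row 2) replaces row 2
    B = [A[0],
         [t*A[0][0] + A[1][0], t*A[0][1] + A[1][1]]]
    print(det2(B))      # stays 5: the row operation preserves the determinant
```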