- Write down and turn in your topic and name(s) [one per group]
- Fill out the "Planning for the Future of Math 2240" handout and turn it in up front in the envelope. Do NOT list your name.
- Take any questions on test revisions or the final research sessions
- Discuss the Final Research sessions - share topics with each other, what session each person is in, and peer review.
- Upper level courses I teach include Differential Geometry MAT 4140, Senior Capstone MAT 4040, Instructional Assistant MAT 3520
- Course evaluations

Take questions on the final project

Look at MatrixInverse(P).A.P, which has the eigenvalues on the diagonal - definition of diagonalizability and similarity.

Derivation that for eigenvectors placed as the columns of P,

A P = P times the diagonal matrix of eigenvalues [which is how we showed that MatrixInverse(P).A.P = Diag]

Execute in Maple:

A:=Matrix([[(cos(theta))^2,cos(theta)*sin(theta)],[cos(theta)*sin(theta), ((sin(theta))^2)]]);

h, P := Eigenvectors(A);

Diag:=simplify(MatrixInverse(P).A.P);

What geometric transformation is Diag?

Notice that P.Diag.MatrixInverse(P) = A by matrix algebra.

Writing out a transformation in terms of a P, the inverse of P, and a diagonal matrix will prove very useful in computer graphics [Recall that we read matrix composition from right to left].
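The factorization can be checked numerically; a Python/NumPy sketch rather than Maple, using theta = Pi/6 as a concrete instance of the projection matrix A above:

```python
import numpy as np

theta = np.pi / 6  # a concrete angle; the Maple version is symbolic
c, s = np.cos(theta), np.sin(theta)
A = np.array([[c**2, c*s],
              [c*s, s**2]])      # projection onto the line at angle theta

lam, P = np.linalg.eig(A)        # eigenvalues, eigenvectors (as columns)
Diag = np.linalg.inv(P) @ A @ P  # similar to A: eigenvalues 1 and 0 on the diagonal

# P.Diag.MatrixInverse(P) recovers A, as in the matrix-algebra observation above
print(np.allclose(P @ Diag @ np.linalg.inv(P), A))  # True
```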

Linear Transformations

Mention the spectrum, the spectrum of the Laplacian [divergence of gradient], heat equation...

If the reduced augmented matrix for the system (A - lambda I)x = 0 is Matrix([[0,0,0],[0,0,0]]) then the (real) eigenvectors of A are:

a) Just the 0 vector works

b) A line through the origin

c) All of R^2

d) A subspace of R^2

e) None of the above

True or False:

a) True

b) False and I have a correction

c) False and I have a counterexample

d) False and I have both a correction and a counterexample

e) False but I have neither a correction nor a counterexample

#6-8 in 2.8 clicker questions.

Chap 5 clicker questions

In Maple execute

Eigenvectors(Matrix([[1,2],[2,1]])); and

ReducedRowEchelonForm(Matrix([[1,1,1],[1,-1,5]]));

Eigenvector clicker questions.

Explain why the eigenvectors of Matrix([[1,2],[2,1]]) satisfy the definitions of span and li by setting up the corresponding equations and solving.

li := [P|Vector([0,0])]

span:=[P|Vector([a,b])]
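These two augmented-matrix set-ups can be verified numerically; a Python/NumPy sketch, taking the eigenvectors (1,1) and (1,-1) of Matrix([[1,2],[2,1]]) as the columns of P:

```python
import numpy as np

# Eigenvectors of [[1,2],[2,1]] as the columns of P
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# Linear independence: P c = 0 forces c = 0, since det(P) != 0
print(np.linalg.det(P))                # -2, nonzero
c = np.linalg.solve(P, np.zeros(2))    # only the trivial solution

# Span: P c = (a, b) is solvable for any right-hand side, e.g. (3, 5)
c = np.linalg.solve(P, np.array([3.0, 5.0]))
print(np.allclose(P @ c, [3.0, 5.0]))  # True
```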

Eigenvector decomposition for a diagonalizable matrix A [where the eigenvectors form a basis]

Foxes and Rabbits demo on ASULearn

Dynamical Systems and Eigenvectors on ASULearn

Define eigenvalues and eigenvectors [Ax = lambda x, vectors that are scaled on the same line through the origin, matrix multiplication is turned into scalar multiplication].

Geometry of Eigenvectors.

Algebra: Show that we can solve using det(lambda I - A) = 0 and (lambda I - A)x = 0.

Compute the eigenvectors of Matrix([[0,1],[1,0]]) by hand and compare with Maple's work.

Eigenvectors and eigenvalues of Matrix([[1,2],[2,1]]) in Maple.
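A quick cross-check of the by-hand and Maple computations, sketched in Python/NumPy:

```python
import numpy as np

# Reflection across y = x: eigenvalues -1 and 1
lam, V = np.linalg.eig(np.array([[0.0, 1.0],
                                 [1.0, 0.0]]))
print(np.sort(lam))                 # eigenvalues -1 and 1

# Matrix([[1,2],[2,1]]): eigenvalues -1 and 3
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, V = np.linalg.eig(A)
print(np.sort(lam))                 # eigenvalues -1 and 3

# Each column of V satisfies A v = lambda v
print(np.allclose(A @ V, V * lam))  # True
```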

Begin 2.8 in order to lead to eigenvalues and applications (2.8, 4.9 and 5.1, 5.2, 5.3 and 5.6 selections, 7.1 as time allows).

Continue determinant work via the relationship of row operations to the geometry of determinants via a demo on ASULearn. Prove that "det A is non-zero" can be added to Theorem 8 in Chapter 2. Algebraic and geometric derivations related to the determinant.

Continue clicker questions on inverses and determinants

Discuss Yoda via the file yoda2.mw, with data from Lucasfilm LTD, as on Tim's Page.

Begin chapter 3 via a google search:

application of determinants in physics

application of determinants in economics

application of determinants in chemistry

application of determinants in computer science

Eight queens and determinants. Chapter 3 in Maple via the MatrixInverse command for 2x2 and 3x3 matrices, then determinant work, including the 2x2 and 3x3 diagonals methods and Laplace's expansion method in general.
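Laplace's expansion method can be sketched as a short recursive routine (illustrative Python, not from the course materials):

```python
import numpy as np

def det_laplace(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det_laplace(minor)
    return total

M = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(det_laplace(M))                    # -3
print(round(np.linalg.det(np.array(M))))  # -3, matching NumPy
```

The recursion touches n! terms, which is why row reduction is preferred for anything beyond small matrices.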

Finish the last guess the Transformation on ASULearn [1.8, 1.9]

Review the unit circle

general geometric transformations on R^2

Computer graphics Demo on ASULearn [2.7]

Computer graphics and linear transformations (1.8, 1.9, 2.3 and 2.7)

Begin with dilations

Revisit Theorem 8 in 2.3 by incorporating the language of linear transformations [while also covering 1-1 and onto material in 1.9]

2.3 clicker questions

2.1 clicker questions #10

Catalog description:

-2.1-2.3 Applications: Coding, Condition Number and Linear Transformations (2.3, 1.8, 1.9 and 2.7)

-Chapter 3 determinants and applications

-Eigenvalues and applications (2.8, 4.9 and chap 5 selections, 7.1... as time allows)

-Final research sessions [research a topic related to the course that you are interested in]

Hill Cipher

A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z |

1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 |
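A minimal Hill-cipher sketch in Python. Two assumptions to flag: the 2x2 key Matrix([[3,3],[2,5]]) is a common textbook example, not necessarily the key used in class, and the code numbers the letters A=0, ..., Z=25 so that mod-26 arithmetic is clean, whereas the table above uses A=1, ..., Z=26.

```python
import numpy as np

KEY = np.array([[3, 3],
                [2, 5]])  # invertible mod 26: det = 9, gcd(9, 26) = 1

def hill_encrypt(plaintext, key):
    """Encrypt pairs of letters with a 2x2 Hill cipher (A=0, ..., Z=25)."""
    nums = [ord(ch) - ord('A') for ch in plaintext]
    out = []
    for i in range(0, len(nums), 2):
        block = np.array(nums[i:i+2])
        out.extend((key @ block) % 26)   # multiply the pair by the key mod 26
    return ''.join(chr(n + ord('A')) for n in out)

print(hill_encrypt('HELP', KEY))  # 'HIAT'
```

Decryption multiplies by the inverse of the key mod 26, which exists here since gcd(det, 26) = gcd(9, 26) = 1.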

Condition # of matrices

Maple file on Coding and Condition Number and PDF version
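The point of condition number, sketched in Python/NumPy (separate from the Maple file above; the matrices are illustrative): a nearly singular coefficient matrix magnifies small changes in the data.

```python
import numpy as np

good = np.eye(2)
bad = np.array([[1.0, 1.0],
                [1.0, 1.0001]])  # nearly singular

print(np.linalg.cond(good))      # 1.0
print(np.linalg.cond(bad))       # about 4 * 10^4

# A tiny change in b swings the solution of bad.x = b from (2, 0) to (1, 1)
x1 = np.linalg.solve(bad, np.array([2.0, 2.0]))
x2 = np.linalg.solve(bad, np.array([2.0, 2.0001]))
print(x1, x2)
```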

Finish 2.2. 2.2 clicker questions.

Begin 2.3.

MatrixInverse(twobytwo);

three := Matrix([[a, b, c], [d, e, f], [g, h, i]]);

scalerow3 := Matrix([[1, 0, 0], [0, 5, 0], [0, 0, 1]]);

scalerow3.three;

swaprows13 := Matrix([[0, 0, 1], [0, 1, 0], [1, 0, 0]]);

swaprows13.three;

usualrowop := Matrix([[1, 0, 0], [0, 1, 0], [-2, 0, 1]]);

usualrowop.three;

Corrections
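The same products can be checked numerically; a Python/NumPy sketch of the Maple lines above, with a concrete matrix standing in for the symbolic `three`:

```python
import numpy as np

three = np.arange(1, 10).reshape(3, 3)  # stand-in for Matrix([[a,b,c],[d,e,f],[g,h,i]])

scalerow3 = np.array([[1, 0, 0],
                      [0, 5, 0],
                      [0, 0, 1]])       # scales the middle row by 5
swaprows13 = np.array([[0, 0, 1],
                       [0, 1, 0],
                       [1, 0, 0]])      # swaps rows 1 and 3
usualrowop = np.array([[1, 0, 0],
                       [0, 1, 0],
                       [-2, 0, 1]])     # adds -2 times row 1 to row 3

print(scalerow3 @ three)
print(swaprows13 @ three)
print(usualrowop @ three)
```

Left-multiplying by an elementary matrix performs the corresponding row operation on `three`.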

Problem Set 2 clicker questions. Hand out the study guide and take questions on test 1.

Continue via 2.1 clicker questions

Powerpoint file.

Matrix multiplication

Matrix algebra

Linear maps

Algebra of matrix multiplication: AB and BA...

End of Material for Test 1

Image 1 Image 2 Image 3 Image 4 Image 5 Image 6 Image 7.

Problem Set 2 clicker questions

In R^2, span but not li, li but not span, li plus span. R^3.

Coffee mixing clicker questions

Chap 1 review clicker questions

1.5: vector parametrization equations of homogeneous and non-homogeneous equations.

1.1-1.4 clicker questions

History of linear equations and the term "linear algebra" images, including the Babylonians 2x2 linear equations, the Chinese 3x3 column elimination method over 2000 years ago, Gauss' general method arising from geodesy and least squares methods for celestial computations, and Wilhelm Jordan's contributions.

Gauss quotation. Gauss was also involved in other linear algebra, including the history of vectors, another important "linear" algebra.

vectors, scalar mult and addition, linear combinations and weights, vector equations and connection to 1.1 and 1.2 systems of equations and augmented matrix, span
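Span questions reduce to asking whether a linear system is consistent; a hypothetical Python/NumPy illustration (the vectors are chosen for the example, not taken from 1.3):

```python
import numpy as np

# Is b a linear combination of v1 and v2?  Solve [v1 v2] x = b.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2])

b_in = np.array([2.0, 3.0, 5.0])    # equals 2*v1 + 3*v2
x, *_ = np.linalg.lstsq(A, b_in, rcond=None)
print(np.allclose(A @ x, b_in))     # True: b_in is in Span{v1, v2}

b_out = np.array([2.0, 3.0, 0.0])   # off the plane z = x + y
x, *_ = np.linalg.lstsq(A, b_out, rcond=None)
print(np.allclose(A @ x, b_out))    # False: b_out is not in the span
```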

1.3 clicker questions 1, 2, 4, and 6.

1.1 and 1.2 Clicker Questions.

Go over text comments in Maple and distinguishing work as your own.

We already saw examples of systems with no solutions, via parallel planes, as well as 3 planes that do not intersect concurrently:

with(plots):
implicitplot3d({x-2*y+z-2, x+y-2*z-3, (-2)*x+y+z-1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)

implicitplot3d({x+y+z-3, x+y+z-2, x+y+z-1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)
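The inconsistency of the second system can also be confirmed by rank: Ax = b is consistent exactly when rank(A) equals the rank of the augmented matrix. A Python/NumPy sketch of the three parallel planes x+y+z = 3, 2, 1:

```python
import numpy as np

# The system x+y+z = 3, x+y+z = 2, x+y+z = 1 (three parallel planes)
A = np.ones((3, 3))
b = np.array([3.0, 2.0, 1.0])

rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
print(rank_A, rank_Ab)  # ranks differ, so the system has no solution
```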

Register the i-clickers. Collect homework. Share from the syllabus, from Tuesday's class, or from the homework (questions or what you learned).

Mention solutions on ASULearn and the fact that in solutions I often do much more than what the question asked in order to help you understand the bigger picture and/or diverse methods and perspectives.

Revisit the geometry using implicitplot3d, the number of missing pivots, and the parametrization of x+y+z=1 in R^3.

Algebraic and geometric perspectives in 3-D and solving using by-hand elimination, ReducedRowEchelonForm, and GaussianElimination.
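Maple's ReducedRowEchelonForm can be mirrored by a short Gauss-Jordan routine; a minimal Python/NumPy sketch (the augmented matrix is an illustrative example, not from the class worksheets):

```python
import numpy as np

def rref(M):
    """Reduced row echelon form by Gauss-Jordan elimination."""
    M = M.astype(float)
    rows, cols = M.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Find the largest entry in this column at or below pivot_row
        pivot = pivot_row + np.argmax(np.abs(M[pivot_row:, col]))
        if abs(M[pivot, col]) < 1e-12:
            continue                                   # no pivot: free variable
        M[[pivot_row, pivot]] = M[[pivot, pivot_row]]  # swap rows
        M[pivot_row] /= M[pivot_row, col]              # scale pivot to 1
        for r in range(rows):
            if r != pivot_row:
                M[r] -= M[r, col] * M[pivot_row]       # clear the column
        pivot_row += 1
    return M

# Augmented matrix for x+y+z = 1 together with x - y + 2z = 0
aug = np.array([[1, 1, 1, 1],
                [1, -1, 2, 0]])
print(rref(aug))
```

The result [[1, 0, 1.5, 0.5], [0, 1, -0.5, 0.5]] reads off x = 0.5 - 1.5z and y = 0.5 + 0.5z with z free.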

3 equations, 2 unknowns with one solution in the plane R^2

3 equations, 3 unknowns with infinite solutions, one solution, and no solutions in R^3

Look at the geometry, number of missing pivots, and parametrization of x+y+z=1.

Mention homework and the class webpages