2240 class highlights

  • Fri May 10 9-11:30am final research sessions


  • Thur May 2
    1. Write down and turn in your topic and name(s) [one per group]
    2. Fill out the "Planning for the Future of Math 2240" handout and turn it in up front in the envelope. Do NOT list your name.
    3. Take any questions on test revisions or the final research sessions
    4. Discuss the Final Research sessions - share topics with each other, what session each person is in, and peer review.
    5. Upper level courses I teach include Differential Geometry MAT 4140, Senior Capstone MAT 4040, Instructional Assistant MAT 3520
    6. Course evaluations
  • Tues Apr 30
    Take questions on the final project
    Look at MatrixInverse(P).A.P, which has the eigenvalues on the diagonal - definition of diagonalizability and similarity.
    Derivation that for an eigenvector x of A with eigenvalue lambda, A^k x = lambda^k x
    Derivation that AP = P times the diagonal matrix of eigenvalues [which is how we showed that MatrixInverse(P).A.P = Diag]
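The first derivation can be checked numerically. The sketch below is in Python rather than the course's Maple, and A = [[1,2],[2,1]] is an assumed example (eigenvalue 3 with eigenvector (1,1)), not necessarily the one worked in class:

```python
# Check that A^k x = lambda^k x when x is an eigenvector of A.
# Assumed example: A = [[1,2],[2,1]], eigenvector x = (1,1), lambda = 3.

def matvec(A, x):
    """Multiply a 2x2 matrix A by a length-2 vector x."""
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

A = [[1, 2], [2, 1]]
x = [1, 1]          # eigenvector for lambda = 3
lam = 3

v = x[:]
for k in range(1, 5):
    v = matvec(A, v)                        # v = A^k x
    expected = [lam**k * xi for xi in x]    # lambda^k x
    assert v == expected

print(v)   # A^4 x = 3^4 (1,1) = [81, 81]
```

Each application of A just rescales the eigenvector, which is exactly why matrix powers become eigenvalue powers.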


    Execute in Maple:
    A:=Matrix([[(cos(theta))^2,cos(theta)*sin(theta)],[cos(theta)*sin(theta), ((sin(theta))^2)]]);
    h,P:=Eigenvectors(A);
    Diag:=simplify(MatrixInverse(P).A.P);
    What geometric transformation is Diag?
    Notice that P.Diag.MatrixInverse(P) = A by matrix algebra.
    Writing out a transformation in terms of a P, the inverse of P, and a diagonal matrix will prove very useful in computer graphics [Recall that we read matrix composition from right to left].
    Geometric intuition of         P.Diag.MatrixInverse(P) = A
    If we want to project a vector onto the line y = tan(theta) x, first we can perform MatrixInverse(P), which rotates a vector clockwise by theta (taking the line down to the x-axis). Next we perform Diag, which projects onto the x-axis, and finally we perform P, which rotates counterclockwise by theta back up to the line.
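This geometric picture can be verified numerically. The following Python sketch (theta = pi/6 is an assumed choice; the class computations are in Maple) rebuilds the projection matrix A as P.Diag.MatrixInverse(P):

```python
# Verify that P . Diag . MatrixInverse(P) = A for the projection matrix
# onto the line y = tan(theta) x. Assumed choice: theta = pi/6.
import math

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = math.pi / 6
c, s = math.cos(theta), math.sin(theta)

A = [[c*c, c*s], [c*s, s*s]]   # projection onto the line y = tan(theta) x
P = [[c, -s], [s, c]]          # columns = eigenvectors (rotation by theta)
Pinv = [[c, s], [-s, c]]       # rotation by -theta
Diag = [[1, 0], [0, 0]]        # eigenvalues 1 and 0: projection onto x-axis

rebuilt = matmul(matmul(P, Diag), Pinv)
ok = all(abs(rebuilt[i][j] - A[i][j]) < 1e-12 for i in range(2) for j in range(2))
print(ok)   # True
```

Reading right to left: rotate the line down to the x-axis, project, rotate back.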
    Linear Transformations
    Mention the spectrum, the spectrum of the Laplacian [divergence of gradient], heat equation...
  • Thur Apr 25 Test 3
  • Tues Apr 23 Collect hw and discuss.
    If the reduced augmented matrix for the system (A-lambdaI)x=0 is Matrix([[0,0,0],[0,0,0]]) then the (real) eigenvectors of A are:
    a) Just the 0 vector works
    b) A line through the origin
    c) All of R2
    d) A subspace of R3 (with 3 coordinates)
    e) None of the above

    True or False:
    a) True
    b) False and I have a correction
    c) False and I have a counterexample
    d) False and I have both a correction and a counterexample
    e) False but I have neither a correction nor a counterexample
  • Thur Apr 18
    #6-8 in 2.8 clicker questions.
    Chap 5 clicker questions

  • Tues Apr 16
    In Maple execute
    Eigenvectors(Matrix([[1,2],[2,1]])); and
    ReducedRowEchelonForm(Matrix([[1,1,1],[1,-1,5]]));
    Eigenvector clicker questions.
    Explain why the eigenvectors of Matrix([[1,2],[2,1]]) satisfy the definitions of span and li by setting up the corresponding equations and solving.
    li := [P|Vector([0,0])]
    span:=[P|Vector([a,b])]
    Eigenvector decomposition for a diagonalizable matrix A [where the eigenvectors form a basis]
    Foxes and Rabbits demo on ASULearn
    Dynamical Systems and Eigenvectors on ASULearn
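The eigenvector decomposition above is what drives the dynamical systems demos: writing x0 in the eigenvector basis turns iterating x_{k+1} = A x_k into scaling by powers of the eigenvalues. A Python sketch (A = [[1,2],[2,1]] is an assumed stand-in, not the Foxes and Rabbits matrix from the ASULearn demo):

```python
# Eigenvector decomposition for a diagonalizable A: if
# x0 = c1 v1 + c2 v2, then A^k x0 = c1 lambda1^k v1 + c2 lambda2^k v2.
# Assumed example: A = [[1,2],[2,1]], eigenpairs (3, (1,1)) and (-1, (1,-1)).

def matvec(A, x):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

A = [[1, 2], [2, 1]]
lam1, v1 = 3, [1, 1]
lam2, v2 = -1, [1, -1]
c1, c2 = 1, 1                                      # weights in the eigenbasis
x0 = [c1*v1[0] + c2*v2[0], c1*v1[1] + c2*v2[1]]    # = [2, 0]

# Iterate the dynamical system x_{k+1} = A x_k for k = 1..5 ...
x = x0[:]
for _ in range(5):
    x = matvec(A, x)

# ... and compare with the closed form c1 lambda1^k v1 + c2 lambda2^k v2.
k = 5
closed = [c1*lam1**k*v1[i] + c2*lam2**k*v2[i] for i in range(2)]
assert x == closed
print(x)   # [242, 244]
```

The lambda1 = 3 term dominates for large k, which is why long-run behavior follows the dominant eigenvector.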

  • Thur Apr 11 Take questions on the homework. #5 in 2.8 clicker questions.
    Define eigenvalues and eigenvectors [Ax = lambda x: vectors that are scaled on the same line through the origin, so matrix multiplication turns into scalar multiplication].
    Geometry of Eigenvectors.
    Algebra: Show that we can solve using det(lambda I - A) = 0 and (lambda I - A)x = 0.
    Compute the eigenvectors of Matrix([[0,1],[1,0]]) by hand and compare with Maple's work.
    Eigenvectors and eigenvalues of Matrix([[1,2],[2,1]]) in Maple.
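For 2x2 matrices the by-hand method boils down to the characteristic polynomial lambda^2 - trace(A) lambda + det(A) = 0. A small Python sketch of that computation (assuming real eigenvalues; the class examples are done by hand and in Maple):

```python
# Solve det(lambda I - A) = 0 for a 2x2 matrix A = [[a,b],[c,d]]:
# the characteristic polynomial is lambda^2 - (a+d) lambda + (ad - bc).
import math

def eigenvalues_2x2(A):
    (a, b), (c, d) = A
    tr, det = a + d, a*d - b*c
    disc = math.sqrt(tr*tr - 4*det)   # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

print(eigenvalues_2x2([[0, 1], [1, 0]]))   # (1.0, -1.0)
print(eigenvalues_2x2([[1, 2], [2, 1]]))   # (3.0, -1.0)
```

The eigenvectors then come from solving (lambda I - A)x = 0 for each root.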

  • Tues Apr 9 #1 in 2.8 clicker questions.
    Begin 2.8 in order to lead to eigenvalues and applications (2.8, 4.9 and 5.1, 5.2, 5.3 and 5.6 selections, 7.1 as time allows).
  • Thur Apr 4 Take questions on the hw.
    Continue determinant work via the relationship of row operations to the geometry of determinants via a demo on ASULearn. Prove that det A non-zero can be added into Theorem 8 in Chapter 2. Algebraic and geometric derivations related to the determinant.
    Continue clicker questions on inverses and determinants
  • Thur Mar 28 Test 2
  • Tues Mar 26 Take questions on test 2. clicker questions on inverses and determinants, reviewing Laplace's expansion method, connections to the theorem in chapter 2...
  • Thur Mar 21
    Discuss Yoda via the file yoda2.mw, with data from Lucasfilm Ltd. as on Tim's Page.
    Begin chapter 3 via a google search:
    application of determinants in physics
    application of determinants in economics
    application of determinants in chemistry
    application of determinants in computer science
    Eight queens and determinants.
    Chapter 3 in Maple via the MatrixInverse command for 2x2 and 3x3 matrices, then determinant work, including the 2x2 and 3x3 diagonals methods, and Laplace's expansion method in general.
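Laplace's expansion method translates directly into recursive code. A Python sketch (not the class's Maple worksheet), expanding along the first row:

```python
# Laplace's expansion along the first row:
# det(A) = sum over j of (-1)^j * A[0][j] * det(minor obtained by
# deleting row 0 and column j).

def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in A[1:]]   # delete row 0, col j
        total += (-1)**j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                      # 1*4 - 2*3 = -2
print(det([[2, 0, 1], [1, 3, -1], [0, 5, 4]]))    # 39
```

The recursion shows why the method is general but expensive (n! terms), so row reduction wins for large matrices.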
  • Tues Mar 19 Clicker questions. Computer graphics continued, including the benefit of derivatives and unit-length vectors in keeping a car on a race track - demo on ASULearn.
  • Thur Mar 7
    Finish the last guess the Transformation on ASULearn [1.8, 1.9]
    Review the unit circle
    general geometric transformations on R2 [1.8, 1.9]
    Computer graphics Demo on ASULearn [2.7]
  • Tues Mar 5 Review guidelines for Problem Sets, including
  • You have more time to work on fewer problems than on practice exercises - Maple, interesting applications...
  • Counterexamples for false statements [If A then B counterexample: A is true but the conclusion B is false]
  • Annotated work / explanations that show your critical reasoning
  • In 2.3 # 12, in the instructions before 11, A is given as nxn
  • In the Condition Number problem, be careful of my additional instructions (inverse method with fractions...)

    Computer graphics and linear transformations (1.8, 1.9, 2.3 and 2.7)
    Begin with dilations
    Revisit Theorem 8 in 2.3 by incorporating the language of linear transformations [while also covering 1-1 and onto material in 1.9]
  • Thur Feb 28
    2.3 clicker questions
    2.1 clicker questions #10


    Catalog description: A study of vectors, matrices and linear transformations, principally in two and three dimensions, including treatments of systems of linear equations, determinants, and eigenvalues.

            - 2.1-2.3 Applications: Coding, Condition Number and Linear Transformations (2.3, 1.8, 1.9 and 2.7)
            - Chapter 3: determinants and applications
            - Eigenvalues and applications (2.8, 4.9 and chap 5 selections, 7.1... as time allows)
            - Final research sessions [research a topic related to the course that you are interested in]

    Hill Cipher
    A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
    1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26
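A minimal Hill cipher sketch in Python, using the A = 1, ..., Z = 26 table above with arithmetic mod 26 (residue 0 standing in for Z). The key matrix below is an assumption for illustration, not necessarily the one used in class:

```python
# Hill cipher on pairs of letters: encode via c = K p mod 26, decode via
# p = Kinv c mod 26. The key K = [[1,2],[3,5]] is an assumed example
# (det = -1, so K is invertible mod 26).

def num(ch):           # A -> 1, ..., Z -> 26, per the table above
    return ord(ch) - ord('A') + 1

def letter(n):         # inverse of num, with residue 0 treated as Z
    return chr((n - 1) % 26 + ord('A'))

def encode_pair(K, pair):
    p = [num(pair[0]), num(pair[1])]
    c = [(K[0][0]*p[0] + K[0][1]*p[1]) % 26,
         (K[1][0]*p[0] + K[1][1]*p[1]) % 26]
    return letter(c[0]) + letter(c[1])

K    = [[1, 2], [3, 5]]     # assumed key; det = -1, invertible mod 26
Kinv = [[21, 2], [3, 25]]   # K.Kinv = I mod 26

msg = "HI"
cipher = encode_pair(K, msg)
assert encode_pair(Kinv, cipher) == msg   # decoding recovers the message
print(cipher)   # ZQ
```

Decoding works because applying the inverse key undoes the matrix multiplication mod 26.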

    Condition # of matrices
    Maple file on Coding and Condition Number and PDF version
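As a sketch of the condition number idea (the matrix below is an assumed example, and the infinity norm - maximum absolute row sum - is one common choice), computed with exact fractions in the spirit of the "inverse method with fractions" instruction:

```python
# Condition number cond(A) = ||A|| * ||A^-1||, using the infinity norm.
# A large condition number means small changes in the data can swing the
# solution of Ax = b a lot. Assumed nearly-singular example matrix.
from fractions import Fraction

def inf_norm(A):
    """Maximum absolute row sum."""
    return max(sum(abs(x) for x in row) for row in A)

def inverse_2x2(A):
    """Inverse method for 2x2, kept exact with fractions."""
    (a, b), (c, d) = A
    det = a*d - b*c
    return [[d/det, -b/det], [-c/det, a/det]]

A = [[Fraction(1), Fraction(1)],
     [Fraction(1), Fraction(1001, 1000)]]   # rows nearly parallel

cond = inf_norm(A) * inf_norm(inverse_2x2(A))
print(cond)   # 4004001/1000, i.e. about 4004
```

A well-conditioned matrix like the identity has condition number 1; here the near-parallel rows blow it up to about 4004.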

  • Tues Feb 26
    Finish 2.2. 2.2 clicker questions.
    Begin 2.3.
  • Thur Feb 21 Continue with 2.1 and 2.2. Transpose of a matrix via Wikipedia, including Arthur Cayley. Applications including least squares estimates, such as in linear regression, with data given as rows (like Yoda). Continue 2.1 clicker questions #6-9. Inverse of a matrix. In Maple:
    twobytwo := Matrix([[a, b], [c, d]]);
    MatrixInverse(twobytwo);
    three := Matrix([[a, b, c], [d, e, f], [g, h, i]]);
    scalerow3 := Matrix([[1, 0, 0], [0, 5, 0], [0, 0, 1]]);
    scalerow3.three;
    swaprows13 := Matrix([[0, 0, 1], [0, 1, 0], [1, 0, 0]]);
    swaprows13.three;
    usualrowop := Matrix([[1, 0, 0], [0, 1, 0], [-2, 0, 1]]);
    usualrowop.three;
    corrections
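The point of the session above - that left-multiplying by an elementary matrix performs a row operation - can be checked numerically. A Python sketch with a concrete stand-in for the symbolic matrix `three` (names mirror the Maple session; note that `scalerow3` as defined actually scales row 2 by 5):

```python
# Left-multiplication by an elementary matrix performs a row operation.

def matmul(X, Y):
    """Matrix product for lists of lists."""
    rows, inner, cols = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k]*Y[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

three = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]   # numeric stand-in for symbolic a..i

scalerow3  = [[1, 0, 0], [0, 5, 0], [0, 0, 1]]    # scales row 2 by 5
swaprows13 = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]    # swaps rows 1 and 3
usualrowop = [[1, 0, 0], [0, 1, 0], [-2, 0, 1]]   # row3 -> row3 - 2*row1

assert matmul(scalerow3, three)[1] == [20, 25, 30]
assert matmul(swaprows13, three) == [[7, 8, 9], [4, 5, 6], [1, 2, 3]]
assert matmul(usualrowop, three)[2] == [5, 4, 3]
print("row operations verified")
```

This is the algebra behind writing Gaussian elimination as a product of elementary matrices.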
  • Tues Feb 19 Test 1
  • Thur Feb 14
    Problem Set 2 clicker questions. Hand out the study guide and take questions on test 1.
  • Tues Feb 12
    Continue via 2.1 clicker questions
    Powerpoint file.
    Matrix multiplication
    Matrix algebra
    Linear maps
    Algebra of matrix multiplication: AB and BA...

    End of Material for Test 1
  • Thur Feb 7 Take questions on the ASULearn solutions. Continue Chap 1 review clicker questions
    Image 1   Image 2   Image 3   Image 4   Image 5   Image 6   Image 7.
    Problem Set 2 clicker questions
  • Tues Feb 5 Collect hw and take questions. Definitions - Span: represent. Linearly Independent: efficiency.
    In R^2: span but not li, li but not span, li plus span. Similarly in R^3.
    Coffee mixing clicker questions
    Chap 1 review clicker questions
  • Thur Feb 1 Collect hw and take questions.
    1.5: vector parametrization equations of homogeneous and non-homogeneous equations.
    1.1-1.4 clicker questions
  • Tues Jan 29 Collect hw. 1.4
  • Thur Jan 24 Collect problem set 1.
    History of linear equations and the term "linear algebra" images, including the Babylonians' 2x2 linear equations, the Chinese 3x3 column elimination method over 2000 years ago, Gauss's general method arising from geodesy and least squares methods for celestial computations, and Wilhelm Jordan's contributions.
    Gauss quotation. Gauss was also involved in other linear algebra, including the history of vectors, another important "linear" algebra.
    vectors, scalar mult and addition, linear combinations and weights, vector equations and connection to 1.1 and 1.2 systems of equations and augmented matrix, span
    1.3 clicker questions 1, 2, 4, and 6.
  • Tues Jan 22 Collect homework. Take questions.
    1.1 and 1.2 Clicker Questions.
    Go over text comments in Maple and distinguishing work as your own.
    We already saw examples of systems with 0 solutions, via parallel planes, as well as 3 planes that don't intersect concurrently:
    implicitplot3d({x-2*y+z-2, x+y-2*z-3, (-2)*x+y+z-1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)
    implicitplot3d({x+y+z-3, x+y+z-2, x+y+z-1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)
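Elimination detects the "no solutions" case algebraically: a row of zero coefficients with a nonzero right-hand side. A Python sketch for the second system plotted above (three parallel planes x+y+z = 3, 2, 1):

```python
# Detect inconsistency by elimination: subtracting row 1 from rows 2 and 3
# of x+y+z=3, x+y+z=2, x+y+z=1 leaves rows like 0 0 0 | -1, so no
# point can satisfy all three equations.

aug = [[1, 1, 1, 3],
       [1, 1, 1, 2],
       [1, 1, 1, 1]]

for i in (1, 2):                      # eliminate below the first pivot
    factor = aug[i][0] / aug[0][0]
    aug[i] = [a - factor*b for a, b in zip(aug[i], aug[0])]

inconsistent = any(row[:3] == [0, 0, 0] and row[3] != 0 for row in aug)
print(inconsistent)   # True
```

Geometrically the planes never meet; algebraically the contradiction 0 = -1 appears in a reduced row.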
  • Thur Jan 17
    Register the i-clickers. Collect homework. Share from the syllabus, from Tuesday's class, or from the hw (questions or what you learned).
    Mention solutions on ASULearn and the fact that in solutions I often do much more than what the question asked you to do in order to help you understand the bigger-picture and/or diverse methods and perspectives.
    Revisit the geometry using implicitplot3d, number of missing pivots, and parametrization of x+y+z=1 in R3.
    Algebraic and geometric perspectives in 3-D and solving using by-hand elimination, and ReducedRowEchelonForm and GaussianElimination in Maple.
    3 equations 2 unknowns with one solution in the plane R2,
    3 equations 3 unknowns with infinite solutions, one solution and no solutions in R3.

  • Tues Jan 15 Fill out the information sheet and work on the introduction to linear algebra handout motivated by Evelyn Boyd Granville's favorite problem. At the same time, begin 1.1 and 1.2 including geometric perspectives, by-hand algebraic Gaussian Elimination and pivots, solutions, plotting and geometry, parametrization and GaussianElimination in Maple. In addition, do #5 with k as an unknown but constant coefficient. Prove using the geometry of lines that the number of solutions of a system with 2 equations and 2 unknowns is 0, 1 or infinite.
    Look at the geometry, number of missing pivots, and parametrization of x+y+z=1.
    Mention homework and the class webpages