2240 class highlights
Fri Jun 29 Finish final presentations
Thur Jun 28 Begin final presentations
Wed Jun 27
Discuss the final project presentations. Divide up the sessions based on topics.
Work on project and/or test revisions.
Evaluations for those who missed Monday.
Tues Jun 26
Test 2.
Mon Jun 25
second review activity
success
Test 2 review, topics to study
Practice test, problem sets, hw problems, clickers, study guide topics, glossaries. Solutions exist for you to compare and learn from, but be sure to try them on your own and make sure you can discuss the concepts and do the problems (linearly)
independently!
final project presentations
evaluations
Fri Jun 22
Clicker in Chapter 5 #14-18
course goals
uncover the mystery of inverse(P).A.P = ? (see the sketch after this list),
Diagonalization and applications to computer graphics,
Applications to mathematical physics,
quantum chemistry...,
Standing wave,
Eigenfunction,
Tacoma Narrows
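A minimal Maple sketch of the inverse(P).A.P mystery (assuming with(LinearAlgebra) is loaded; the 2x2 matrix and eigenvector matrix here are illustrative choices, not ones from class):
A := Matrix([[2, 1], [1, 2]]);
P := Matrix([[1, -1], [1, 1]]);    # its columns are eigenvectors of A, for the eigenvalues 3 and 1
MatrixInverse(P) . A . P;          # a diagonal matrix with the eigenvalues 3 and 1 on the diagonal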
MathSciNet Hill cipher. Leontief. Search within matrix/matrices.
Google Scholar: eigenvalue in mathematics education research
full guidelines, topics and sample projects; introduction to LaTeX.
review activity
Thur Jun 21
Clicker in Chapter 5 #8-13
Continue Dynamical Systems and
Eigenvectors
comic http://brownsharpie.courtneygibbons.org/comic/guest-artist-little-pete-is-emo/
THE $25,000,000,000 EIGENVECTOR by Kurt Bryan and Tanya Leise
About once a month, Google finds an eigenvector of a
matrix that represents the connectivity of the web (of size
billions-by-billions) for its PageRank algorithm.
Eigenfeet, eigenfaces, eigenlinguistics
presentation session http://hosted.jalt.org/pansig/2005/HTML/Bayne.htm,
final research presentations
Hamburger earmuffs and the pickle matrix
full guidelines, topics and sample projects
rubric for the final project
Wed Jun 20
Review algebra of eigenvalues and eigenvectors and
Eigenvector decomposition
Clicker in Chapter 5 #2-5
Then continue and highlight predator-prey, predator-predator, or cooperative systems
(where cooperation leads to sustainability) and #6-7.
Geometry of Eigenvectors
Ex1:=Matrix([[0,1],[1,0]]);   # reflection across the line y=x
Eigenvalues(Ex1);             # 1 and -1
Eigenvectors(Ex1);            # eigenvectors along y=x (for 1) and y=-x (for -1)
Ex2:=Matrix([[0,1],[-1,0]]);  # rotation by 90 degrees clockwise: no real eigenvectors
Ex3:=Matrix([[-1,0],[0,-1]]); # rotation by 180 degrees: every nonzero vector is an eigenvector for -1
Ex4:=Matrix([[1/2,1/2],[1/2,1/2]]);  # projection onto the line y=x: eigenvalues 1 and 0
Horizontal shear Matrix([[1,k],[0,1]])
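A possible Maple check on the shear, as a sketch with k left symbolic:
Shear := Matrix([[1, k], [0, 1]]);
Eigenvalues(Shear);    # both eigenvalues are 1 (the diagonal entries of a triangular matrix)
Eigenvectors(Shear);   # only the x-direction [1,0] gives eigenvectors when k is nonzero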
Tues Jun 19
Take questions on 2.8. Engagement
algebra of eigenvalues and eigenvectors and connect to geometry
eigensheep comic
Eigenvalues of triangular matrices, like the shear matrix, are on the diagonal -- via the characteristic equation.
Matrix([[2,1],[1,2]])
M := Matrix([[2,1],[1,2]]);
Eigenvectors(M);
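One way to tie this to the characteristic equation in Maple (a sketch using the matrix M above):
CharacteristicPolynomial(M, lambda);   # lambda^2 - 4*lambda + 3
factor(%);                             # (lambda-1)*(lambda-3), so the eigenvalues are 1 and 3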
Begin 5.6: Eigenvector decomposition for a diagonalizable
matrix A_nxn [where the eigenvectors form a basis for all of R^n].
M := Matrix([[6/10,4/10],[-125/1000,12/10]]);
Eigenvectors(M);
Application: Foxes and Rabbits
Also revisit the black hole matrix.
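A sketch of the long-run behavior, iterating the fox-and-rabbit matrix M above (the initial population vector is a made-up illustration, not from the notes):
x0 := Vector([10, 100]);   # hypothetical initial foxes and rabbits
Eigenvalues(M);            # 11/10 and 7/10 for this M
evalf(M^20 . x0);          # after many steps the growth factor per step approaches the dominant eigenvalue 1.1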
Clicker in Chapter 5 #1
Compare with Dynamical Systems and Eigenvectors first example
Mon Jun 18
Clicker questions in chapter 3 #10
If space is the final frontier, then what's a subspace?
subspace Paramount and CBS,
basis, null space and column space
nullspace null=me!
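A small Maple illustration of null space and column space (the matrix is an illustrative choice, not from class):
A := Matrix([[1, 2, 3], [2, 4, 6]]);   # a rank-1 matrix
NullSpace(A);     # a basis (two vectors) for the null space, a subspace of R^3
ColumnSpace(A);   # a basis (one vector) for the column space, a subspace of R^2
Rank(A);          # 1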
clickers in 2.8 1-3
algebra of eigenvalues and eigenvectors and connect to geometry
Fri Jun 15
Clicker questions in 2.7 #7-9
Review Laplace expansion of the determinant LaTeX Beamer slides
The determinator comic, which has lots of 0s,
review row operations and determinants
The relationship of row operations to the
geometry of determinants -
shear matrices preserve area, volume.
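A quick Maple check of these facts (a sketch, with k symbolic):
S := Matrix([[1, k], [0, 1]]);           # a shear, i.e. a row-replacement elementary matrix
Determinant(S);                          # 1, so shears preserve area (and volume in higher dimensions)
Determinant(Matrix([[0, 1], [1, 0]]));   # -1: a row swap flips orientation but keeps the area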
Clicker questions in chapter 3#4-9
If space is the final frontier, then what's a subspace?
subspace Paramount and CBS
Engagement and exam corrections
Thur Jun 14 Test 1
Wed Jun 13
Clicker questions in chapter 3#1-3
2x2 and 3x3 diagonals methods and Laplace's expansion method in general (1772, expanding on Vandermonde's
method). [The general history dates back to the Chinese and Leibniz.]
M:=Matrix([[a,b,c],[d,e,f],[g,h,i]]);
Determinant(M); MatrixInverse(M);
M:=Matrix([[a,b,c,d],[e,f,g,h],[i,j,k,l],[m,n,o,p]]);
Determinant(M); MatrixInverse(M);
LaTeX Beamer slides
The connection of row operations to determinants
The determinant of A transpose and A triangular (such as in Gaussian form).
The determinant of A inverse via the determinant of the product of A and A inverse, and via elementary row operations, so det A nonzero can be added to Theorem 8 in Chapter 2: What Makes a Matrix Invertible.
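These facts can be checked symbolically in Maple for a generic 2x2 matrix (a sketch, assuming with(LinearAlgebra) is loaded):
A := Matrix([[a, b], [c, d]]);
Determinant(A);                            # a*d - b*c
Determinant(Transpose(A));                 # the same, since det(A^T) = det(A)
simplify(Determinant(MatrixInverse(A)));   # 1/(a*d - b*c), since det(A)*det(A^(-1)) = det(I) = 1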
Mention Google searches: application of determinants in physics, application of determinants in economics, application of determinants in chemistry, application of determinants in computer science, Eight queens and determinants, application of determinants in geology (volumetric strain).
Moving activity: Glossary matchup
review slides,
study guide, sample partial test
Tues Jun 12
Review linear transformations of the plane,
including homogeneous coordinates
Computer graphics demo [2.7] Example 2
Clicker questions in 2.7 #1-2
rotation matrix and 6.1
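A possible Maple sketch of the rotation matrix and a homogeneous-coordinate translation (theta, h and v are symbolic placeholders, an illustration rather than the class demo):
Rot := Matrix([[cos(theta), -sin(theta)], [sin(theta), cos(theta)]]);   # rotation by theta
simplify(Determinant(Rot));                           # 1: rotations preserve area and orientation
Trans := Matrix([[1, 0, h], [0, 1, v], [0, 0, 1]]);   # translation by (h, v) in homogeneous coordinates
Trans . Vector([x, y, 1]);                            # gives [x + h, y + v, 1]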
Application of 2.7 and 6.1: Keeping a car on a
racetrack
Computer graphics demo [2.7] Examples 3-5
Begin Yoda (via the file yoda2.mw) with data from
Kecskemeti B. Zoltan (Lucasfilm LTD) as on
Tim's page
Clicker questions in 2.7 #4-6
Clicker questions in chapter 3#1-3
Mon Jun 11
List relevant examples and course overview
Clicker in 2.1-2.3 #20-22. Discuss
problem set and create a video.
Text 2 material: Glossary of terms and more glossary, clickers
Linear transformations continued.
Moving activity: Each odd person moves +4 (mod class size).
Guess the transformation.
VLA Package from Visual Linear Algebra by Herman and Pepe.
In the process, discuss that the first column of the matrix representation is the same as the output of the unit x vector, and that invertible matrices take the plane onto the plane (the range is the entire plane),
while matrices that are not invertible do not span the entire plane, so they smush the
plane (pictures in the plane, etc.).
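A minimal Maple check of the idea that the columns are the images of the unit vectors (symbolic entries, a sketch):
A := Matrix([[a, b], [c, d]]);
A . Vector([1, 0]);   # the first column [a, c]: the image of the unit x vector
A . Vector([0, 1]);   # the second column [b, d]: the image of the unit y vector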
Mirror mirror comic http://digmi.org/tag/fun/page/2/ and
Sheared Sheep comic from our book
general geometric transformations on R2 [1.8, 1.9]
In the process, review the unit circle
Computer graphics demo [2.7] Example 1
Fri Jun 8
Show that if the columns of a square nxn matrix A span all of R^n, then A is invertible.
Clicker in 2.1-2.3 #10-12
2.1 #23: Assume CA = I_n (the n x n identity). A doesn't have to be square, e.g., a 3x2 matrix A.
2.2 #21: Explain why the columns of an nxn matrix A are linearly independent when A is invertible.
Problematic reasoning: if the 2 columns of A are multiples of each other, the determinant will be 0.
Incomplete reasoning: the columns of A are li because Ax=0 has only the trivial solution when A is invertible (why?).
Theorem 8 in 2.3 [without linear transformations]:
What makes a matrix invertible
List relevant examples and course overview
2.1-2.3 Applications: Hill Cipher, Condition Number and Linear
Transformations (2.3, 1.8, 1.9 and 2.7)
Introduction to Linear Maps
Hill Cipher history
Maple file on Hill Cipher and
Condition Number and
PDF version
review of Hill cipher and condition number
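A rough Maple sketch in the spirit of the Hill cipher and condition number files (the key matrix, plaintext block and near-singular example are made-up illustrations, not the class examples):
K := Matrix([[3, 2], [5, 7]]);   # key matrix, invertible mod 26 since det = 11
p := Vector([7, 8]);             # a plaintext block, e.g. "HI" with A=0, ..., Z=25
map(modp, K . p, 26);            # the encrypted block
ConditionNumber(Matrix([[1.0, 1.0], [1.0, 1.0001]]));   # a nearly singular matrix has a very large condition number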
Clicker in 2.1-2.3 #13-19
Thur Jun 7
Comic: associativity superpowers
2.2 Algebra: Inverse of a matrix.
Steps,
The Science of Successful Learning, learn something new
Divide up using cut up comics Clicker in 2.1-2.3 #5-9
Applications of multiplication and the inverse (if it exists)
Assume that A (square) has an inverse.
What else can you say?
Theorem 8 in 2.3 [without linear transformations]:
What makes a matrix invertible
Discuss what it means for a square matrix that violates one of the statements.
Discuss what it means for a matrix that is not square (all bets are off) via
counterexamples.
Pivot and matrix multiplication arguments: A invertible, Ax=b solutions, Ax=0 solutions.
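A small Maple illustration of those arguments (the invertible matrix and right-hand side are made-up examples):
A := Matrix([[2, 1], [1, 1]]);       # invertible, det = 1
bvec := Vector([3, 2]);
MatrixInverse(A) . bvec;             # the unique solution x = A^(-1).b, here [1, 1]
LinearSolve(A, bvec);                # the same solution via elimination
LinearSolve(A, Vector([0, 0]));      # Ax = 0 has only the trivial solution when A is invertible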
Wed Jun 6
Maple commands Maple file, Clicker question in 1.3, 1.4, 1.5, 1.7 #18 onward
Begin Chapter 2:
Image 1
Image 2
Image 3
Image 4
Image 5
Image 6
Image 7.
glossary for 2.1-2.3
Then
Clicker in 2.1-2.3 #1-4
matrix multiplication and
matrix algebra. AB not BA...
Introduce transpose of a matrix
via Wikipedia, including Arthur Cayley. Applications including least squares estimates, such as in linear regression, data given as rows (like Yoda).
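A small Maple sketch of AB versus BA and the transpose rule (illustrative matrices, not from the notes):
A := Matrix([[1, 2], [3, 4]]);
B := Matrix([[0, 1], [1, 0]]);
A . B;   # [[2,1],[4,3]]: multiplying by B on the right swaps the columns of A
B . A;   # [[3,4],[1,2]]: multiplying by B on the left swaps the rows of A, so AB <> BA
Transpose(A . B) - Transpose(B) . Transpose(A);   # the zero matrix, illustrating (AB)^T = B^T A^T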
twobytwo := Matrix([[a, b], [c, d]]);
MatrixInverse(twobytwo);
MatrixInverse(twobytwo).twobytwo;
simplify(%);
Comic: find the identity of Superman
2.2 Algebra: Inverse of a matrix.
Repeated methodology: multiply by the inverse on both sides,
reorder by
associativity, cancel A by its inverse, then reduce by the identity to
simplify.
Tues Jun 5
Review 1.5 and 1.3 and 1.7 vector and matrix equations,
Theorem in 1.7
Clicker question in 1.3, 1.4, 1.5, 1.7 #8-10
dependence comic
Roll, yaw, pitch
Gimbal lock on Apollo 11.
Break up via Random sequence generators
and Review 1.1, 1.2, 1.3, 1.4, 1.5, 1.7
Clicker question in 1.3, 1.4, 1.5, 1.7 #11-
Mon Jun 4 Discuss problem sets.
Review Clicker in 1.3-1.7 # 5 in Maple
discuss what happens when we correctly use GaussianElimination(s13n15extension) - write out the equation of the plane that the vectors span.
s13n15extension:=Matrix([[1,-5,b1],[3,-8,b2],[-1,2,b3]]);
GaussianElimination(s13n15extension);
M:=Matrix([[1,-5,0,b1],[3,-8,0,b2],[-1,2,1,b3]]);
GaussianElimination(M);
a:=spacecurve({[t, 3*t, -1*t, t = 0 .. 1]}, color = red, thickness = 2):
b:=spacecurve({[-5*t, -8*t, 2*t, t = 0 .. 1]}, color = blue, thickness = 2):
diagonalparallelogram:=spacecurve({[-4*t, -5*t, -1*t, t = 0 .. 1]}, color = black, thickness = 2):
c:=spacecurve({[-4*t, -5*t, 3*t, t = 0 .. 1]}, color = magenta, thickness = 2):
display(a,b,c,diagonalparallelogram);
Modified the diagonal of the parallelogram by changing the last coordinate to get a vector out of the plane.
Review 1.3 and 1.4, and theorem 4 in 1.4.
1.5: vector parametrization equations of homogeneous and non-homogeneous equations.
parallelvectorline movie. Introduce t*vector1 + vector2 as the collection of vectors that end on the
line parallel to vector 1 and through the tip of vector 2.
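A possible Maple picture of t*vector1 + vector2 (these particular vectors are an illustrative assumption):
v1 := Vector([1, 3]):  v2 := Vector([2, 1]):            # direction vector and the position vector of a point
plot([v2[1] + t*v1[1], v2[2] + t*v1[2], t = -2 .. 2]);  # the line through the tip of v2 parallel to v1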
Clicker in 1.3-1.7 # 6 and #7 to motivate 1.7
How to express redundancy?
1.3 and 1.7 vector and matrix equations
In R^2: spans R^2 but not li, li but does not span R^2, li plus spans R^2.
Theorem in 1.7
Fri Jun 1
Review vectors, addition, scalar multiplication, linear combinations and span of them, and movie visualizations: span2dmovie, spand3dmovie.
What's your span? comic
Clicker questions in 1.3, 1.4, 1.5, 1.7 # 3
Maple
span1:=Matrix([[1, 4, b1], [2, 5, b2], [3, 6, b3]]);
GaussianElimination(span1);
Comment on the span being b1-2b2+b3=0. Notice that Vector([7,8,9])
also satisfies this equation
a1:=spacecurve({[t, 2*t, 3*t, t = 0 .. 1]}, color = red, thickness = 2):
a2:=textplot3d([1, 2, 3, ` vector [1,2,3]`], color = black):
b1:=spacecurve({[4*t,5*t,6*t,t = 0 .. 1]}, color = green, thickness = 2):
b2:=textplot3d([4, 5, 6, ` vector [4,5,6]`], color = black):
c1:=spacecurve({[7*t, 8*t, 9*t, t = 0 .. 1]},color=magenta,thickness = 2):
c2:=textplot3d([7,8,9,`vector[7,8,9]`],color = black):
display(a1,a2,b1,b2,c1,c2);
Replace with [7, 8, 10] which is not in the span.
Begin 1.4: Ax via using the weights from x on the columns of A, versus Ax via
dot products of the rows of A with x, and Ax=b as the same (using definition 1, a
linear combination of the columns) as the augmented matrix [A | b].
The matrix-vector equation and the augmented
matrix, and the connection of mixing to span and linear combinations.
Theorem 4 in 1.4
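A small Maple check of the two views of Ax (the 3x2 matrix and the weights are illustrative assumptions):
A := Matrix([[1, 2], [3, 4], [5, 6]]);
xv := Vector([10, 1]);
A . xv;                             # [12, 34, 56]
10*Column(A, 1) + 1*Column(A, 2);   # the same vector, as a weighted sum of the columns of A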
Coff:=Matrix([[.3,.4,36],[.2,.3,26],[.2,.2,20],[.3,.1,18]]);
ReducedRowEchelonForm(Coff);
Coffraction:=Matrix([[3/10,4/10,36],[2/10,3/10,26],[2/10,2/10,20],[3/10,1/10,18]]);
ReducedRowEchelonForm(Coffraction);
Decimals (avoid them in Maple) versus fractions. Geometry
of the columns as a plane in R^4, and of the rows as 4
lines in R^2 intersecting at the point (40,60).
Clicker in 1.3, 1.4, 1.5, 1.7 #4-5
Thur May 31
History of linear equations and the term "linear algebra"
images, including the Babylonians' 2x2 linear
equations, the Chinese 3x3 column elimination method over 2000 years ago, Gauss' general
method arising from geodesy and least squares methods for celestial
computations, and Wilhelm Jordan's contributions.
Gauss was also involved in
other linear algebra, including the
history of vectors, another important "linear" object.
Advice from previous students
2240 engagement
clicker questions 1.1 and 1.2 continued #6 onward
Glossary 2: More Terms for Test 1
vectors, scalar mult and addition,
Foxtrot vector addition comic by
Bill Amend. November 14, 1999.
1.3 linear combinations and weights,
vector equations and connection to 1.1 and 1.2 systems of equations and
augmented matrix. linear combination language (addition and scalar
multiplication of vectors).
Begin clicker in 1.3-1.7
Wed May 30
Turn in hw.
Gaussian and Gauss-Jordan for
3 equations and 2 unknowns in R2.
Engagement with the i-clickers (Think, Pair up, Share, Review and Add), where to get
help, solutions and glossary on ASULearn. Exam 1 questions.
Clicker in 1.1 and 1.2 #1.
Gaussian and Gauss-Jordan or reduced
row echelon form in general:
section 1.2, focusing on algebraic and geometric perspectives
and solving systems of equations with 3
unknowns using by-hand elimination. Follow up with
Maple commands and visualization: ReducedRowEchelonForm and
GaussianElimination as well as implicitplot3d in Maple (like on the
handout):
Parametrize x+y+z=1. Maple
Ex1:=Matrix([[1,-2,1,2],[1,1,-2,3],[-2,1,1,1]]);
implicitplot3d({x-2*y+z=2, x+y-2*z=3, (-2)*x+y+z=1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4);
Ex2:=Matrix([[1,2,3,3],[2,-1,-4,1],[1,1,-1,0]]);
implicitplot3d({x+2*y+3*z=3, 2*x-y-4*z=1, x+y-z=0}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4);
Ex3:=Matrix([[1,2,3,0],[1,2,4,4],[2,4,7,4]]);
implicitplot3d({x+2*y+3*z = 0, x+2*y+4*z = 4, 2*x+4*y+7*z = 4}, x = -13 .. -5, y = -1/4 .. 1/4, z = 3 .. 5, color = yellow);
Ex4:=Matrix([[1,3,4,k],[2,8,9,0],[10,10,10,5],[5,5,5,5]]);
GaussianElimination(Ex4);
Ex4a:=Matrix([[1,3,4,k],[2,8,9,0],[10,10,10,5],[5,5,5,5]]);
GaussianElimination(Ex4a);
Highlight equations with 3 unknowns having infinitely many solutions, one solution,
and no solutions in R^3, and the corresponding geometry, as we review
new terminology and the glossary of terms
clicker questions 1.1 and 1.2 continued
Tues May 29
UTAustinXLinearAlgebra.mov
Course intro slides # 1 and 2
Work on the introduction to linear algebra handout motivated from
Evelyn Boyd Granville's favorite
problem (#1-3).
At the same time, begin 1.1 (and some of the words in 1.2)
including geometric perspectives,
by-hand algebraic EBG#3,
Gaussian Elimination and EBG #5 and pivots,
solutions, plotting and geometry, parametrization and GaussianElimination
in Maple for systems with 2 unknowns in R2.
Evelyn Boyd Granville #3:
with(LinearAlgebra): with(plots):
implicitplot({x+y=17, 4*x+2*y=48},x=-10..10, y = 0..40);
EBG3:=Matrix([[1,1,17],[4,2,48]]);
GaussianElimination(EBG3);
ReducedRowEchelonForm(EBG3);
In addition, do #4
Evelyn Boyd Granville #4: using the slope of the lines, versus full
pivots in Gaussian (r2'=-4 r1 + r2):
EBG4:=Matrix([[1,1,a],[4,2,b]]);
GaussianElimination(EBG4);
Course intro slides continued.
How to get to the main calendar page: google Dr. Sarah /
click on webpage / then 2240. Discuss webpages, homework and
Polya's How to Solve it
Vocabulary/terms/ASULearn glossary
Evelyn Boyd Granville #5 with
k as an unknown but constant coefficient.
EBG#3,
Gaussian Elimination and EBG #5
EBG5:=Matrix([[1,k,0],[k,1,0]]);
GaussianElimination(EBG5);
ReducedRowEchelonForm(EBG5);
Prove using geometry of lines
that the number of solutions of a system
with 2 equations and 2 unknowns is 0, 1 or infinite.
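A possible Maple illustration of the three cases (with(plots) is loaded above):
implicitplot({x + y = 1, x + y = 2}, x = -3 .. 3, y = -3 .. 3);       # parallel lines: 0 solutions
implicitplot({x + y = 1, x - y = 0}, x = -3 .. 3, y = -3 .. 3);       # intersecting lines: 1 solution
implicitplot({x + y = 1, 2*x + 2*y = 2}, x = -3 .. 3, y = -3 .. 3);   # the same line twice: infinitely many solutions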
Review Gaussian and Gauss-Jordan for 3
equations and 2 unknowns in R2.
Drawing the line comic.
Solve the system x+y+z=1 and x+y+z=2 (0 solutions - 2 parallel planes)
implicitplot3d({x+y+z=1, x+y+z=2}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4);