2240 class highlights
Fri Jun 24 Final Project presentations
Thur Jun 23 Final Project presentations
Wed Jun 22
Test revisions
Share the final research presentations topic (name, major(s), concentrations/minors, research project idea, and whether you prefer to go 1st, 2nd or have no preference).
Reflection
Rubric for the final project
Tues Jun 21 Test 3. Spend the remaining time on the final project.
Mon Jun 20
Review for test 3. Take questions on the study guide.
final research presentations
rubric for the final project
Evaluations.
Fri Jun 17
Clicker questions---review of eigenvectors
THE $25,000,000,000 EIGENVECTOR by Kurt Bryan and Tanya Leise:
When Google went online in the late 1990's, one thing that set it apart from other search engines was that its search result listings always seemed to deliver the "good stuff" up front. With other search engines you often had to wade through screen after screen of links to irrelevant web pages that just happened to match the search text. Part of the magic behind Google is its PageRank algorithm, which quantitatively rates the importance of each page on the web, allowing Google to rank the pages and thereby present to the user the more important (and typically most relevant and helpful) pages first.
About once a month, Google finds an eigenvector of a matrix that represents the connectivity of the web (of size billions-by-billions) for its PageRank algorithm.
http://languagelog.ldc.upenn.edu/nll/?p=3030
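To make the idea concrete, here is a minimal Maple sketch with a made-up three-page web (the link matrix below is hypothetical and tiny, nothing like Google's actual matrix): for a column-stochastic link matrix, the eigenvector for eigenvalue 1 ranks the pages.
with(LinearAlgebra):
# Hypothetical 3-page web: column j records where page j links,
# scaled so each column sums to 1 (a column-stochastic link matrix).
L := Matrix([[0, 1/2, 1], [1/2, 0, 0], [1/2, 1/2, 0]]);
evals, evects := Eigenvectors(L);
# The eigenvector for eigenvalue 1, rescaled so its entries sum to 1,
# plays the role of the PageRank vector: bigger entry = more important page.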
Big picture discussion
final research presentations
The Chinese, the German Gauss, the French Laplace, and the German polymath Hermann Grassmann (1809-1877); his 1844 The Theory of Linear Extension, a New Branch of Mathematics treated extensive magnitudes---effectively linear space via linear combinations, independence, span, dimension, and projections.
sample project,
full guidelines
rubric for the final project
April was Mathematics Awareness Month on The Future of Prediction
Making a matrix disappear and then reappear
Look at
h, P := Eigenvectors(A);
MatrixInverse(P) . A . P;
which (ta da) has the eigenvalues on the diagonal (when the columns of P form a basis for Rn) - diagonalizability. [We can uncover the mystery and apply this to computer graphics.]
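A concrete version of the trick, as a small Maple sketch; the matrix is the 2x2 symmetric example used later in class, and any diagonalizable matrix would work the same way.
with(LinearAlgebra):
A := Matrix([[2, 1], [1, 2]]);      # same small example used later in class
h, P := Eigenvectors(A);            # h = eigenvalues, P = matrix whose columns are eigenvectors
MatrixInverse(P) . A . P;           # diagonal matrix with the eigenvalues (3 and 1) on the diagonal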
Applications to mathematical physics,
quantum chemistry...
Eigenfunction
Tacoma Narrows
Thur Jun 16
Review: the algebra of eigenvectors and eigenvalues
Review trajectories from the glossary.
Geometry of Eigenvectors examples 1 and 2 and compare with Maple
>Ex2:=Matrix([[0,1],[-1,0]]);
>Ex3:=Matrix([[-1,0],[0,-1]]);
>Ex4:=Matrix([[1/2,1/2],[1/2,1/2]]);
Horizontal shear Matrix([[1,k],[0,1]]) and its eigenvalues via det(A - lambda I) = 0. Once given lambda, what is the eigenvector?
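A short Maple sketch of that computation for the horizontal shear, with k left as a symbol (this mirrors the by-hand det(A - lambda I) = 0 work):
with(LinearAlgebra):
Shear := Matrix([[1, k], [0, 1]]);
CharPoly := Determinant(Shear - lambda*IdentityMatrix(2));   # (1 - lambda)^2
solve(CharPoly = 0, lambda);                                  # lambda = 1, repeated
Eigenvectors(Shear);   # for lambda = 1 the eigenvectors are the multiples of [1, 0] (when k <> 0)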
Clicker questions---
eigenvector decomposition (5.6) part 2
Fill in examples on Terms for Test 3
Dynamical Systems and
Eigenvectors remaining examples
final research presentations
Hamburger earmuffs and the pickle matrix
Wed Jun 15
Clicker questions in Chapter 3 #9.
Test 2 corrections
Review: the algebra of eigenvectors and eigenvalues
Clicker questions in 5.1 #1-3
eigensheep comic
Eigenvector decomposition
Application: Foxes and Rabbits
Also revisit the black hole matrix.
Compare with Dynamical Systems and
Eigenvectors first example
Clicker questions on
eigenvector decomposition (5.6) part 1#1-2
Highlight predator-prey, predator-predator, or cooperative systems (where cooperation leads to sustainability)
Eigenvector comic 2
Clicker questions on
eigenvector decomposition (5.6) part 1#3-4
Review reflection across the y=x line via pictures. A few inputs. Where is the output? Is the vector an eigenvector?
>Ex1:=Matrix([[0,1],[1,0]]);
>Eigenvalues(Ex1);
>Eigenvectors(Ex1);
Geometry of Eigenvectors examples 1 and 2 and compare with Maple
>Ex2:=Matrix([[0,1],[-1,0]]);
Tues Jun 14 Test 2. Resume class:
Eigenvalues and applications (5.1, 5.2 and 5.6)
Begin 5.1: the algebra of eigenvectors and eigenvalues and connect to geometry and Maple.
Eigenvalues of triangular matrices, like the shear matrix, are on the diagonal--via the characteristic equation.
M := Matrix([[2,1],[1,2]]);
Eigenvectors(M);
Eigenvector comic 1
Begin 5.6: Eigenvector decomposition for a diagonalizable matrix A_nxn [where the eigenvectors form a basis for all of Rn].
M := Matrix([[6/10,4/10],[-125/1000,12/10]]);
Eigenvectors(M);
Application: Foxes and Rabbits
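To tie 5.6 to the foxes-and-rabbits matrix above, a rough Maple sketch that iterates x_{k+1} = M x_k; the starting population vector is made up for illustration.
with(LinearAlgebra):
M := Matrix([[6/10, 4/10], [-125/1000, 12/10]]);
Eigenvectors(M);                 # eigenvalues 11/10 and 7/10 with their eigenvectors
x := Vector([100, 1000]):        # hypothetical initial foxes and rabbits
for i from 1 to 20 do x := M . x end do:
evalf(x);                        # after many steps the state points along the eigenvector for 11/10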
Mon Jun 13
Overview of new material for test 2, study guide and take questions.
If space is the final frontier, then what's a subspace?
subspace,
basis, null space and column space
2.8 using the matrix 123,456,789 and finding the Nullspace and ColumnSpace (using 2 methods - reducing the spanning equation with a vector of b1...bn, and separately by examining the pivots of the ORIGINAL matrix).
Add to the terms. Two other examples.
Applications of 2.8
nullspace
Clicker questions in 2.8
Fri Jun 10
Review linear transformations of the plane, including homogeneous coordinates
Review 2.7 7 and 9
Review determinants LaTex Beamer slides
Mention Google searches: application of determinants in physics, application of determinants in economics, application of determinants in chemistry, application of determinants in computer science, Eight queens and determinants, and application of determinants in geology (volumetric strain).
Clicker questions in Chapter 3 #4-8, 10
3.3 p. 180-181: The relationship of row operations to the geometry of determinants - row operations can be seen as vertical shear matrices when written in elementary matrix form, which preserve area, volume, etc.
Catalog description: A study of vectors, matrices and linear transformations, principally in two and three dimensions,
including treatments of systems of linear equations, determinants,
and eigenvalues.
If space is the final frontier, then what's a subspace?
subspace,
Thur Jun 9
Review linear transformations of the plane, including homogeneous coordinates
Begin Yoda (via the file yoda2.mw) with data from
Kecskemeti B. Zoltan (Lucasfilm LTD) as on
Tim's page
Clicker questions in 2.7 #7 and 8
Clicker questions in Chapter 3 #1-3
Chapter 3 in Maple via the MatrixInverse command for 2x2 and 3x3 matrices and then determinant work, including the 2x2 and 3x3 diagonals methods, and Laplace's expansion method (1772, expanding on Vandermonde's method) in general. [The general history dates to the Chinese and Leibniz.]
M:=Matrix([[a,b,c],[d,e,f],[g,h,i]]);
Determinant(M); MatrixInverse(M);
M:=Matrix([[a,b,c,d],[e,f,g,h],[i,j,k,l],[m,n,o,p]]);
Determinant(M); MatrixInverse(M);
glossary of terms
LaTex Beamer slides
Review the diagonal determinant methods for the 123,456,789 matrix and introduce the Laplace expansion. Review that for a 4x4 matrix in Maple, only Laplace's method will work.
The
determinator comic, which has lots of 0s
The connection of row operations to determinants
The determinant of A transpose and A triangular (such as in Gaussian form).
The determinant of A inverse via the determinant of the product of A and A inverse, and via elementary row operations, so that det A nonzero can be added to Theorem 8 in Chapter 2: What Makes a Matrix Invertible.
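These determinant facts can be spot-checked symbolically in Maple; a sketch for a generic 2x2 matrix with symbolic entries a, b, c, d:
with(LinearAlgebra):
A := Matrix([[a, b], [c, d]]);
Determinant(A);                           # a*d - b*c
Determinant(Transpose(A));                # the same: det(A^T) = det(A)
simplify(Determinant(MatrixInverse(A)));  # 1/(a*d - b*c), so det(A^(-1)) = 1/det(A)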
Wed Jun 8
Clicker 2.3 review
Go over 2.3 #11c and 12e on solutions.
Clicker questions in 2.7 #1.
review linear transformations
Computer graphics demo [2.7] Examples 1-2
Clicker questions in 2.7 #2-6
Computer graphics demo [2.7] Examples 3-5
Keeping a car on a racetrack
Clicker questions in 2.7 #7
Tues Jun 7
Go over 2.1 number 23.
Go over Hill cipher and condition
number
Clicker questions in 2.3 and Hill Cipher
and Condition Number
Comic: associativity superpowers
Applications of 2.1-2.3:
1.8 (p. 62, 65, & 67-68), 1.9 (p. 70-75), and 2.7
Guess the transformation.
In the process, discuss that the first column of the matrix representation is the same as the output of the unit x vector, and that invertible matrices take the plane onto the plane (the range is the whole plane), while matrices that are not invertible do not span the entire plane, so they smush the plane (pictures in the plane, etc.).
Mirror mirror comic and
Sheared Sheep comic
general geometric transformations on
R2 [1.8, 1.9]
In the process, review the unit
circle
Mon Jun 6
Clicker questions in 2.2 #1
Test 1 corrections,
day 1 slides.
Review 2.1 #21
Clicker in 2.1 continue with #8
In groups of 2-3 people, assume that A (square) has an inverse.
What else can you say?
Theorem 8 in 2.3 [without linear transformations]: What makes a matrix invertible
Discuss what it means for a square matrix that violates one of the
statements. Discuss what it means for a matrix that is not square (all bets
are off) via counterexamples.
-2.1-2.3 Applications: Hill Cipher, Condition Number and Linear
Transformations (2.3, 1.8, 1.9 and 2.7)
Applications: Introduction to Linear Maps
The black hole matrix: maps R^2 into the plane but not onto (the range
is the 0 vector).
Dilation by 2 matrix
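A minimal Maple sketch of these two maps, assuming the black hole matrix is the 2x2 zero matrix and dilation by 2 is twice the identity; the test vector [3, 4] is made up:
with(LinearAlgebra):
BlackHole := Matrix([[0, 0], [0, 0]]);
Dilate := Matrix([[2, 0], [0, 2]]);
v := Vector([3, 4]);
BlackHole . v;    # the zero vector: the range is just 0, so the map is not onto the plane
Dilate . v;       # [6, 8]: every vector is stretched by 2, and the map is onto and one-to-one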
Linear transformations in the cipher setting:
A   B   C   D   E   F   G   H   I   J   K   L   M
1   2   3   4   5   6   7   8   9   10  11  12  13

N   O   P   Q   R   S   T   U   V   W   X   Y   Z
14  15  16  17  18  19  20  21  22  23  24  25  26
Applications of 2.1-2.3: Linear
transformations in the cipher setting and finish 2.3 via the condition number.
Hill Cipher history
Maple file on Hill Cipher and
Condition Number and
PDF version
review of Hill cipher and condition number
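For reference, a small stand-alone sketch of one Hill cipher step in Maple using the A = 1, ..., Z = 26 table above; the 2x2 key matrix and the two-letter message are made up and are not the example from the class Maple file:
with(LinearAlgebra):
key := Matrix([[3, 2], [5, 7]]);                    # det = 11, which is invertible mod 26
plaintext := Vector([8, 9]);                         # "HI": H = 8, I = 9
ciphertext := map(x -> x mod 26, key . plaintext);   # encode: multiply by the key and reduce mod 26
keyinv := map(x -> x mod 26, MatrixInverse(key));    # the inverse key, entrywise mod 26
map(x -> x mod 26, keyinv . ciphertext);             # decode: recovers [8, 9] = "HI"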
Fri Jun 3
Test 1
glossary of terms
multiply comic
2.2 Algebra: Inverse of a matrix.
Repeated methodology: multiply by the inverse on both sides, reorder by associativity, cancel A by its inverse, then reduce by the identity to simplify (see the sketch below).
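A quick Maple check of that chain of steps (Ax = b, so A^(-1)Ax = A^(-1)b, so Ix = A^(-1)b, so x = A^(-1)b) on a made-up 2x2 system:
with(LinearAlgebra):
A := Matrix([[2, 1], [1, 2]]);
b := Vector([3, 3]);
x := MatrixInverse(A) . b;     # x = A^(-1).b, here [1, 1]
A . x;                          # check: multiplying back gives b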
Applications of multiplication and the inverse (if it exists)
Clicker in 2.1 continue
with #7
Thur Jun 2
Review Test 1 review part 1
Test 1 review part 2 and take questions
on the study guide
Review matrix addition, scalar multiplication and transpose and
matrix multiplication.
Matrix algebra: AB is not BA in general (matrix multiplication is not commutative).
2.2: Multiplicative Inverse for 2x2 matrix:
twobytwo := Matrix([[a, b], [c, d]]);
MatrixInverse(twobytwo);
MatrixInverse(twobytwo) . twobytwo;
simplify(%);
2.2 Algebra: Inverse of a matrix.
Repeated methodology: multiply by the inverse on both sides, reorder by associativity, cancel A by its inverse, then reduce by the identity to simplify.
Wed Jun 1
Finish Test 1 review part 1
Begin Chapter 2:
via Clicker questions in 2.1 1-4
Image 1
Image 2
Image 3
Image 4
Image 5
Image 6
Image 7.
Continue matrix algebra
via Clicker questions in 2.1
5 and 6 in LaTeX.
matrix multiplication
Introduce the transpose of a matrix via Wikipedia, including Arthur Cayley. Applications include least squares estimates, such as in linear regression, and data given as rows (like Yoda).
Tues May 31
dependence comic
Roll Yaw Pitch
Gimbal lock on Apollo 11.
clicker review questions
Maple commands
Test 1 review part 1
Mon May 30
Clicker question in 1.3 and 1.5 #4
discuss what happens when we correctly use GaussianElimination(s13n15extension) - write out the equation of the plane that the vectors span.
s13n15extension:=Matrix([[1,-5,b1],[3,-8,b2],[-1,2,b3]]);
GaussianElimination(s13n15extension);
Choose a vector that violates this equation to span all of R^3 instead
of the plane and plot:
M:=Matrix([[1,-5,0,b1],[3,-8,0,b2],[-1,2,1,b3]]);
GaussianElimination(M);
a:=spacecurve({[t, 3*t, -1*t, t = 0 .. 1]}, color = red, thickness = 2):
b:=spacecurve({[-5*t, -8*t, 2*t, t = 0 .. 1]}, color = blue, thickness = 2):
diagonalparallelogram:=spacecurve({[-4*t, -5*t, -1*t, t = 0 .. 1]}, color = black, thickness = 2):
c:=spacecurve({[0, 0, t, t = 0 .. 1]}, color = magenta, thickness = 2):
display(a,b,c,diagonalparallelogram);
1.5: vector parametrization equations of homogeneous and non-homogeneous equations. Introduce the idea that t*vector1 + vector2 is the collection of vectors that end on the line parallel to vector1 and through the tip of vector2.
Clicker question in 1.3 and 1.5 #5
Clicker question to motivate 1.7
How to express redundancy?
1.7 definition of linearly independent and
connection to efficiency of span
Fill in glossary
In R^2: sets that span R^2 but are not linearly independent, are linearly independent but do not span R^2, and are linearly independent and span R^2.
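One possible Maple illustration of the three cases, with made-up vectors placed as columns; the pivots from GaussianElimination tell the story (a pivot in every row means the columns span R^2, a pivot in every column means they are linearly independent):
with(LinearAlgebra):
SpansNotLI := Matrix([[1, 2, 3], [2, 4, 1]]);   # three vectors in R^2: they span R^2 but cannot be linearly independent
GaussianElimination(SpansNotLI);
LINotSpan := Matrix([[1], [2]]);                # one nonzero vector: linearly independent but does not span R^2
GaussianElimination(LINotSpan);
Both := Matrix([[1, 2], [2, 1]]);               # two vectors: linearly independent and they span R^2
GaussianElimination(Both);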
Clicker questions in 1.7 and the theorem about l.i. equivalences in 1.7
Fri May 27
Collect problem set 1. Review the language of vectors, scalar mult and addition, linear combinations and weights, vector equations and connection to 1.1 and 1.2 systems of equations and augmented matrix, and span.
span1:=Matrix([[1, 4, b1], [2, 5, b2], [3, 6, b3]]);
GaussianElimination(span1);
Comment on the span being b1-2b2+b3=0. Notice that Vector([7,8,9]) also satisfies this equation, and we can turn the plane they are in "head on" in Maple in order to see that no 2 lie on the same line but all are in the same plane:
a1:=spacecurve({[t, 2*t, 3*t, t = 0 .. 1]}, color = red, thickness = 2):
a2:=textplot3d([1, 2, 3, ` vector [1,2,3]`], color = black):
b1:=spacecurve({[4*t,5*t,6*t,t = 0 .. 1]}, color = green, thickness = 2):
b2:=textplot3d([4, 5, 6, ` vector [4,5,6]`], color = black):
c1:=spacecurve({[7*t, 8*t, 9*t, t = 0 .. 1]},color=magenta,thickness = 2):
c2:=textplot3d([7,8,9,`vector[7,8,9]`],color = black):
display(a1,a2,b1,b2,c1,c2);
Replace with [7, 8, 10] which is not in the span.
Clicker questions in 1.3 and 1.5
# 1, 2
What's your span? comic.
Clicker questions in 1.3 and 1.5
# 3-4
Begin 1.4. Ax via using weights from x on the columns of A, versus Ax via dot products of the rows of A with x, and Ax=b the same (using definition 1, linear combinations of the columns) as the augmented matrix [A | b]. The matrix-vector equation and the augmented matrix, and the connection of mixing to span and linear combinations.
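A short Maple sketch of the two views of Ax, with a made-up A and x:
with(LinearAlgebra):
A := Matrix([[1, 2], [3, 4], [5, 6]]);
x := Vector([10, 1]);
A . x;                                    # [12, 34, 56]
10 * Column(A, 1) + 1 * Column(A, 2);     # the same vector: the weights in x combine the columns of A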
Theorem 4 in 1.4
Clicker question in 1.4
Coff:=Matrix([[.3,.4,36],[.2,.3,26],[.2,.2,20],[.3,.1,18]]);
ReducedRowEchelonForm(Coff);
Coffraction:=Matrix([[3/10,4/10,36],[2/10,3/10,26],[2/10,2/10,20],[3/10,1/10,18]]);
ReducedRowEchelonForm(Coffraction);
Decimals (don't use them in Maple) versus fractions.
Geometry of the columns as a plane in R^4, of the rows as 4 lines in R^2 intersecting in the point (40,60).
Thur May 26
Review the algebra and geometry of equations with 3 unknowns in R^3.
Clicker questions 1.1 and 1.2 #3 onwards
History of linear equations and the term "linear algebra" images, including the Babylonians' 2x2 linear equations, the Chinese 3x3 column elimination method over 2000 years ago, Gauss' general method arising from geodesy and least squares methods for celestial computations, and Wilhelm Jordan's contributions.
Gauss quotation. Gauss was also involved in
other linear algebra, including the
history of vectors, another important "linear" object.
Glossary 2: More Terms for Test 1
vectors, scalar mult and addition,
Foxtrot vector addition comic by
Bill Amend. November 14, 1999.
1.3 linear combinations and weights, vector equations, and the connection to 1.1 and 1.2 systems of equations and the augmented matrix. Linear combination language (addition and scalar multiplication of vectors).
c1*vector1 + c2*vector2_on_a_different_line is a plane via:
span1:=Matrix([[1, 4, b1], [2, 5, b2], [3, 6, b3]]);
GaussianElimination(span1);
Comment on the span being b1-2b2+b3=0. Notice that Vector([7,8,9])
also satisfies this equation
a1:=spacecurve({[t, 2*t, 3*t, t = 0 .. 1]}, color = red, thickness = 2):
a2:=textplot3d([1, 2, 3, ` vector [1,2,3]`], color = black):
b1:=spacecurve({[4*t,5*t,6*t,t = 0 .. 1]}, color = green, thickness = 2):
b2:=textplot3d([4, 5, 6, ` vector [4,5,6]`], color = black):
c1:=spacecurve({[7*t, 8*t, 9*t, t = 0 .. 1]},color=magenta,thickness = 2):
c2:=textplot3d([7,8,9,`vector[7,8,9]`],color = black):
display(a1,a2,b1,b2,c1,c2);
Replace with [7, 8, 10] which is not in the span.
Clicker questions in 1.3 and 1.5
# 1, 2
Wed May 25
Turn in hw. Register the i-clickers.
Engagement with the i-clickers
Gaussian and Gauss-Jordan for 3 equations and 2 unknowns in R2.
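A sketch of such a system in Maple (the numbers are made up); the bottom row of the reduced form shows whether the third line passes through the same point as the first two:
with(LinearAlgebra):
ThreeEqs := Matrix([[1, 1, 3], [1, -1, 1], [2, 1, 5]]);   # x+y=3, x-y=1, 2x+y=5
GaussianElimination(ThreeEqs);
ReducedRowEchelonForm(ThreeEqs);   # consistent: the three lines meet at the single point (2, 1)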
Clicker on 3eqs 2 vars
Clicker questions 1.1 and 1.2 #1.
Mention engagement, solutions and a glossary on ASULearn.
Gaussian and Gauss-Jordan or reduced row echelon form in general: section 1.2, focusing on algebraic and geometric perspectives and solving systems of equations with 3 unknowns by hand elimination. Follow up with Maple commands and visualization: ReducedRowEchelonForm and GaussianElimination as well as implicitplot3d in Maple (like on the handout):
Parametrize x+y+z=1.
implicitplot3d({x+y+z=1}, x = -4 .. 4, y = -4 .. 4, z = - 4 .. 4);
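One way to carry out that parametrization in Maple (choosing y = s and z = t as the free variables is an arbitrary choice):
solve({x + y + z = 1}, {x});
# gives x = 1 - y - z, so with y = s and z = t:
# (x, y, z) = (1, 0, 0) + s*(-1, 1, 0) + t*(-1, 0, 1)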
implicitplot3d({x+y+z=1, x+y+z=2}, x = -4 .. 4, y = -4 .. 4, z = - 4 .. 4);
with(plots): with(LinearAlgebra):
Ex1:=Matrix([[1,-2,1,2],[1,1,-2,3],[-2,1,1,1]]);
implicitplot3d({x-2*y+z=2, x+y-2*z=3, (-2)*x+y+z=1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4);
Ex2:=Matrix([[1,2,3,3],[2,-1,-4,1],[1,1,-1,0]]);
implicitplot3d({x+2*y+3*z=3, 2*x-y-4*z=1, x+y-z=0}, x=-4..4, y=-4..4, z=-4..4);
Ex3:=Matrix([[1,2,3,0],[1,2,4,4],[2,4,7,4]]);
implicitplot3d({x+2*y+3*z = 0, x+2*y+4*z = 4, 2*x+4*y+7*z = 4}, x = -13 .. -5, y = -1/4 .. 1/4, z = 3 .. 5, color = yellow);
Ex4:=Matrix([[1,3,4,k],[2,8,9,0],[10,10,10,5],[5,5,5,5]]);
GaussianElimination(Ex4);
Highlight equations with 3 unknowns with infinite solutions, one solution, and no solutions in R3, and the corresponding geometry, as we review new terminology and the glossary of terms.
Tues May 24
UTAustinXLinearAlgebra.mov. Manga comic
Course intro slides # 1 and 2
Work on the introduction to linear algebra handout motivated by Evelyn Boyd Granville's favorite problem (#1-3).
At the same time, begin terms in 1.1 (and some of the words in 1.2) including geometric perspectives, by-hand algebraic EBG #3, Gaussian Elimination and EBG #5 and pivots, solutions, plotting and geometry, parametrization, and GaussianElimination in Maple for systems with 2 unknowns in R2.
Evelyn Boyd Granville #3:
with(LinearAlgebra): with(plots):
implicitplot({x+y=17, 4*x+2*y=48},x=-10..10, y = 0..40);
EBG3:=Matrix([[1,1,17],[4,2,48]]);
GaussianElimination(EBG3);
ReducedRowEchelonForm(EBG3);
In addition, do #4.
Evelyn Boyd Granville #4: using the slope of the lines, versus full pivots in Gaussian elimination (r2' = -4 r1 + r2):
EBG4:=Matrix([[1,1,a],[4,2,b]]);
GaussianElimination(EBG4);
Course intro slides last few slides
Evelyn Boyd Granville #5 with
k as an unknown but constant coefficient.
EBG#3,
Gaussian Elimination and EBG #5
EBG5:=Matrix([[1,k,0],[k,1,0]]);
GaussianElimination(EBG5);
ReducedRowEchelonForm(EBG5);
Prove using the geometry of lines that the number of solutions of a system with 2 equations and 2 unknowns is 0, 1 or infinite.
How to get to the main calendar page: google Dr. Sarah /
click on webpage / then 2240. Online HW.
MyMathLab
Review Gaussian and Gauss-Jordan for 3 equations and 2 unknowns in R2.
Drawing the line comic.
Solve the system x+y+z=1 and x+y+z=2 (0 solutions - 2 parallel planes):
implicitplot3d({x+y+z=1, x+y+z=2}, x = -4 .. 4, y = -4 .. 4, z = - 4 .. 4);