# Math 51 — Linear Algebra and Differential Calculus of Several Variables

You can find the main course website for Winter 2010 here.
Uncollected assignments and exams have been placed in the assignment drop box outside my office to be picked up at your leisure.

Grade distributions: midterm 1, midterm 2, final exam.

### Office hours

• Mondays — 9:00–10:30am
• Fridays — 9:00–10:30am

### Section notes

New! Fun definitions with remote-controlled robots can be found here.

You can find the discussion section notes I used previously here. You may find them useful for reviewing definitions and theorems, or even testing what you know.

### Handouts

• Coordinate systems, eigenvectors and eigenvalues — February 28
• Directional derivatives and the gradient — March 2
• The Hessian, Taylor's Theorem, Extrema, Lagrange Multipliers, and Quadratic Forms — March 10

### Assignment submission

In case you don't manage to submit your assignments to me in section, there will be envelopes in the assignment drop box outside my office on Thursdays until 5:00pm.
### What happened?

(Today is September 20.)
• January 5
1. span, and how to think about it in terms of remote-controlled robots
2. parametrizations of lines
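As a reminder of what a parametrization looks like (the specific numbers here are my own generic example, not necessarily the one from section):

```latex
% A line through the point p with direction vector v:
\ell = \{\, p + t\,v : t \in \mathbb{R} \,\}
% e.g. the line through (1,2) with direction (3,1):
(x, y) = (1 + 3t,\; 2 + t), \quad t \in \mathbb{R}
```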
• January 7
1. linear independence, and how to think about it in terms of robots too
2. more on parametrization of lines and planes (2.12, 2.13)
3. how to check if a vector is in a given span (2.2)
4. how to test if a set of vectors is linearly independent or not (3.1)
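The test in item 4 can be carried out mechanically. Here is a small sketch in Python (not part of the course materials) that row-reduces over the rationals and counts pivots; a set of vectors is linearly independent exactly when the matrix having them as rows has full rank:

```python
from fractions import Fraction

def rank(rows):
    """Return the rank of a matrix (given as a list of rows),
    row-reducing with exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    piv = 0  # number of pivots found so far / next pivot row
    for col in range(len(m[0])):
        # look for a nonzero entry in this column, at or below row `piv`
        hit = next((r for r in range(piv, len(m)) if m[r][col] != 0), None)
        if hit is None:
            continue  # free column: no pivot here
        m[piv], m[hit] = m[hit], m[piv]
        # clear the rest of the column, as in row-reduced echelon form
        for r in range(len(m)):
            if r != piv and m[r][col] != 0:
                f = m[r][col] / m[piv][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[piv])]
        piv += 1
    return piv

def independent(vectors):
    """Linearly independent iff the matrix with these rows has full rank."""
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 1], [0, 1, 1]]))             # True
print(independent([[1, 0, 1], [0, 1, 1], [1, 1, 2]]))  # False: row3 = row1 + row2
```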
• January 12
1. proofs! — putting your mathematical ideas into words is really hard, but it pays off and you learn more
2. geometric interpretations of the dot product
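The central identity behind item 2 (the standard formula; the examples used in section aren't reproduced here):

```latex
a \cdot b = \|a\|\,\|b\|\cos\theta
% so a . b is positive, zero, or negative according as the
% angle between a and b is acute, right, or obtuse
```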
• January 14
1. three interpretations of the matrix-vector product: two technical definitions and the "intuitive" one involving the return of remote-controlled robots, which links it all together!
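For reference, the two technical definitions side by side (my reconstruction of the standard ones):

```latex
% Column definition: Ax is a linear combination of the columns of A
Ax = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n,
  \qquad A = [\,a_1 \;\; a_2 \;\cdots\; a_n\,]
% Row definition: the i-th entry of Ax is the dot product
% of the i-th row r_i of A with x
(Ax)_i = r_i \cdot x
```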
• January 19
1. reviewing the three definitions/interpretations of matrix-vector product
2. using the dot-product definition of the matrix-vector product to set up an equation for determining the set of vectors orthogonal to a given set
3. solving such an equation using augmented matrices and row-reduced echelon form; free variables can take any value, whereas pivot variables are solved in terms of the free ones (and are expressed in terms of them in your solution)
4. a possibly more intuitive definition of null-space using remote-controlled robots
5. I ended by explaining how you can use the robot analogy to see how
• Given one solution to $$Ax = b$$ and the nullspace of $$A$$, this gives you all solutions to $$Ax = b$$ (draw a picture)
• Given two solutions to $$Ax = b$$, their difference is a solution to $$Ax = 0$$ and thus in the nullspace of $$A$$ (draw another picture)
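A small worked example in the spirit of items 2 and 3 (the vector $$(1,2,3)$$ is my own choice, not necessarily the one from section):

```latex
% Vectors orthogonal to (1,2,3) satisfy x + 2y + 3z = 0.
% The coefficient matrix [1 2 3] is already in rref:
% x is the pivot variable, y and z are free, so
x = -2y - 3z
% and the solution set is
\{\, y\,(-2,1,0) + z\,(-3,0,1) : y, z \in \mathbb{R} \,\}
```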
• January 21
1. clarification of the difference between nullspace and column space
2. the axioms defining what a subspace is, plus examples of subspaces and non-subspaces in $$\mathbb{R}^2$$
3. verifying the subspace axioms for the solution set to $$2x - y - z = 0$$ in $$\mathbb{R}^3$$; then accomplishing the same result by expressing the solution set as a nullspace
4. true/false questions 9.5 to 9.10 from the textbook
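The nullspace rewriting in item 3 amounts to the following observation:

```latex
\{\, (x,y,z) \in \mathbb{R}^3 : 2x - y - z = 0 \,\}
  = N\!\left( \begin{bmatrix} 2 & -1 & -1 \end{bmatrix} \right)
% and every nullspace satisfies the subspace axioms automatically
```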
• January 26
1. using the matrix appearing in question 12.10 of the book, I related the following concepts together:
• null space (same for $$A$$ and $$\operatorname{rref}(A)$$, because linear dependence relations are preserved by row operations)
• column space (different for $$A$$ and $$\operatorname{rref}(A)$$, but can still use the same choice of columns for a basis because linear relations are preserved by row operations)
• the rank/nullity theorem, stating that the sum of the dimensions of the two above spaces equals the number of columns of $$A$$
• existence of solutions to $$Ax = b$$ (the set of $$b$$ for which there is a solution is the column space of $$A$$, and the dimension of this space is thus the number of pivots)
• uniqueness of solutions to $$Ax = b$$ (if at least one solution exists, the dimension of the solution set is the number of free columns of $$\operatorname{rref}(A)$$)
2. examples were given of linear transformations from $$\mathbb{R}^2$$ to $$\mathbb{R}^2$$ (chapter 14), along with how to represent them as matrices with respect to the standard basis.
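The rank/nullity statement above, with a generic illustration of my own:

```latex
% Rank/nullity theorem for an m x n matrix A:
\dim C(A) + \dim N(A) = n
% e.g. if a 3 x 4 matrix has 2 pivots, then
% rank = 2 and nullity = 4 - 2 = 2
```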
• January 28
1. rotations, projections and reflections in the special cases of $$\mathbb{R}^2$$ and $$\mathbb{R}^3$$; the main things to remember are that
• linear transformations are determined by their values on the standard basis vectors
• when rotating around an axis, all entries of the matrix will be $$0$$, $$1$$, $$\cos(\theta)$$, $$\sin(\theta)$$ or $$-\sin(\theta)$$; you can determine which by rotating a physical object slightly and tracking the coordinates
• some linear transformations are more easily expressed by adding together others (for example, a reflection is the same as subtracting the original vector from twice the projected vector)
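The two formulas these bullets describe, written out (standard facts):

```latex
% Rotation of R^2 by angle theta; the columns are the
% images of the standard basis vectors e_1 and e_2:
R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\
                           \sin\theta & \cos\theta \end{bmatrix}
% Reflection across a line, in terms of the projection P onto it:
\mathrm{Refl}(x) = 2Px - x, \quad \text{i.e.} \quad \mathrm{Refl} = 2P - I
```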
• February 16, 18
1. understanding domain, codomain and range
2. the vertical line test, used for determining whether a given subset of Euclidean space is the graph of a function
3. linear algebra review — how to compute orthogonal projections onto arbitrarily specified lines and planes in $$\mathbb{R}^3$$
4. shortcut methods for checking the definiteness of a $$2 \times 2$$ matrix just by looking at its trace and determinant (these conditions are easy for diagonal matrices; change of basis preserves these conditions)
5. how to think of open sets; how to check the defining conditions
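The shortcut in item 4, spelled out (the reasoning: the determinant is the product of the eigenvalues and the trace is their sum):

```latex
% For a symmetric 2 x 2 matrix A with eigenvalues l_1, l_2:
%   det A = l_1 l_2 and tr A = l_1 + l_2, so
\det A > 0,\ \operatorname{tr} A > 0 \iff A \ \text{positive definite}
\det A > 0,\ \operatorname{tr} A < 0 \iff A \ \text{negative definite}
\det A < 0 \iff A \ \text{indefinite}
```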
• February 23
1. linear algebra review — $$A$$ symmetric implies $$I + A^2$$ is invertible
2. computing limits using the squeeze theorem
3. showing overall limits don't exist by approaching from various directions and exhibiting disagreeing limits
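A sketch of the argument in item 1 (using $$A^2 = A^{T}A$$ when $$A$$ is symmetric):

```latex
% If A^T = A and (I + A^2)x = 0, then
0 = x^{T}(I + A^{T}A)\,x = \|x\|^2 + \|Ax\|^2
% which forces x = 0; so N(I + A^2) = \{0\}
% and the square matrix I + A^2 is invertible
```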
• February 25
1. evaluating limits via a linear change of coordinates: when taking the limit of an expression in $$x,y$$ as $$(x,y)$$ approaches $$(a,b)$$, substitute $$x'=x-a$$, $$y'=y-b$$ to rewrite it as the limit of an expression in $$x',y'$$ as $$(x',y')$$ approaches $$(0,0)$$; this can simplify the algebra and/or make it clearer what steps to take
2. evaluating limits via a polar change of coordinates, writing $$x = r \cos(\theta)$$, $$y = r \sin(\theta)$$; then $$(x,y)$$ approaching $$(0,0)$$ is equivalent to $$r$$ approaching $$0$$
3. computing partial derivatives using the definition in terms of limits
4. using partial derivatives for linear approximations
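A typical polar-coordinates computation in the style of item 2 (my own example, not necessarily the one worked in section):

```latex
\lim_{(x,y)\to(0,0)} \frac{x^2 y}{x^2 + y^2}
  = \lim_{r \to 0} \frac{r^3 \cos^2\theta \sin\theta}{r^2}
  = \lim_{r \to 0} r \cos^2\theta \sin\theta = 0
% since |r \cos^2\theta \sin\theta| <= r regardless of theta
```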

http://math.stanford.edu/~jlee/math51/
Jonathan Lee 