Vectors and the Sum Table Product

An Algebra Factoid




I like things to make sense, but I'm not that smart. So I want things to be more obvious, not less. Assertions should make sense, and I should be able to see why. Notations should not only say something true but make those true things obvious, rather than hide the underlying sense in some arbitrary form of concealment. In this spirit here is a notation that makes some non-obvious definitions become more obvious.

Sum Table Notation

Let this notation
\(\large \sum \cdot\ \) \(\Large \alpha\)
indicate the sum of all the cells \(\alpha\), each of which is the product of its row label times its column label.

A column-by-row arrangement makes obvious what goes into all the products forming the sum, as in:

\((a+b)(x+y) =\)

  \(\large \sum \cdot\)    \(x\)     \(y\)
  \(a\)          \(ax\)    \(ay\)
  \(b\)          \(bx\)    \(by\)
Isn't it nice? Label the columns and rows with the factors; the sum is the columns multiplied by (distributed over) the rows, or equivalently, vice versa.

\(a(x+y)+ b(x+y) = ax+ay+bx+by = x(a+b)+y(a+b)\)

The value here is increased Obviousness. Information which is immaterial, namely whether columns are distributed over rows or rows over columns, does not need to be specified, nor would it help if it were, and therefore here it is unspecified; this is a benefit of Sum Table Notation. So far, so obvious, but I promise this will suddenly become Useful. Obviousness is actually a dramatic advantage, like H Notation, because it helps us to think, which is to say, to see that something is obvious. Respect Tautology, and Simplify.
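If you like to check obvious things by machine, the table above can be sketched in a few lines of Python (the numeric values and variable names are mine, purely for illustration):

```python
# Sum Table in miniature: (a+b)(x+y) as a table of
# row-label-times-column-label cells, summed.
# The numbers are arbitrary stand-ins for the symbols a, b, x, y.
a, b, x, y = 2, 3, 5, 7

rows, cols = (a, b), (x, y)
table = [[r * c for c in cols] for r in rows]  # [[ax, ay], [bx, by]]

cell_sum = sum(sum(row) for row in table)

# The sum of the cells agrees with the product, and with either
# order of distribution -- columns over rows or rows over columns.
assert cell_sum == (a + b) * (x + y)
assert cell_sum == a*(x + y) + b*(x + y)
assert cell_sum == x*(a + b) + y*(a + b)
print(cell_sum)  # 60
```

Whichever way you distribute, you are just summing the same cells in a different order.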

"Products" of Vectors

Do you hate dot and cross products of vectors? I do. Do you hate determinants? I do. We are going to celebrate, together, in a minute.

This "geometric algebra" interpretation of dot and cross products makes them at once obvious, loveable, and forgettable. Forgettable because they are not really a thing, but each becomes an obvious part of a thing that you already know, so you won't even have to think about them any more. Loveable because they do cool stuff for free. Obvious because, well, you will see.

But first (this was Hamilton's great Quaternion idea): let's extend the imaginary number concept \(i=\sqrt{-1}\) to define three imaginary numbers. Just as \(i\) is an imaginary NUMBER, \(i,j,k\) is an imaginary SPACE. (The Geometric Algebra idea is the same, but it factors \(i,j,k\) into a shared imaginary part \(i\) and three orthogonal "directed line segments" \(\sigma_1, \sigma_2, \sigma_3\), so that \(i(\sigma_1+\sigma_2+\sigma_3) = i\sigma_1+i\sigma_2+i\sigma_3\) in GA.) Let's stick with quaternions for a minute.

Now, because Hamilton's three imaginary numbers are imaginary, we have:

\(i^2≜j^2≜k^2≜-1\).
So far so good. Now we get the cool part: we define their "products".

\(ij≜k,\ \ jk≜i,\ \ ki≜j.\)

The intuition here is to capture the idea of a quarter turn of a screw: "multiply \(i\) by \(j\)" means "by turning from \(i\) to \(j\), you move in the direction \(k\)." A reverse turn backs out the screw and produces a minus sign. Put a screw on each of the three faces that touch one corner of a cube; then \(i,j,k\) define all the directions, and all the turns that drive a screw in each of those directions. You can think of your right-hand forefinger as \(i\), middle finger as \(j\), and thumb as \(k\): put them at right angles, then turn your forefinger toward your middle finger, and the idea is that that direction of turn moves in the direction of your thumb. It's a way of thinking of rotations as multiplication.

Was that mind-blowing? I promise, we have already done the hard part.

Now we bring out some other rules, the basic tools of tri-imaginary multiplication:

Postmultiply both sides of \(ij=k\) by \(k\), and we have \(ijk=kk=-1\).

Squaring both sides of \(i=jk\), then premultiplying by \(j\) and postmultiplying by \(i\), we have

\(j\,i^2\,i = j(jk)^2 i = jjkjki = (jj)(kj)(ki) = (-1)(kj)(j) = -k(jj) = k\)

while the left-hand side reduces to

\(j\,i^2\,i = j(-1)i = -ji\),  so

\(-ji = k\)

So \(ji=-k\), and similarly \(kj=-i,\ ik=-j\).
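As a mechanical sanity check, here's a small Python sketch; the representation of a signed unit as a `(sign, symbol)` pair is my own convention, not anything standard. It simply encodes the defining products and lets you verify identities like \(ij=k\), \(ji=-k\), and \(ijk=-1\):

```python
# The quaternion units as a lookup table. A signed unit is a pair
# (sign, symbol), where symbol '1' means a real (scalar) result.
TABLE = {
    ('1', '1'): (1, '1'),
    ('1', 'i'): (1, 'i'), ('i', '1'): (1, 'i'),
    ('1', 'j'): (1, 'j'), ('j', '1'): (1, 'j'),
    ('1', 'k'): (1, 'k'), ('k', '1'): (1, 'k'),
    ('i', 'i'): (-1, '1'), ('j', 'j'): (-1, '1'), ('k', 'k'): (-1, '1'),
    ('i', 'j'): (1, 'k'), ('j', 'k'): (1, 'i'), ('k', 'i'): (1, 'j'),
    ('j', 'i'): (-1, 'k'), ('k', 'j'): (-1, 'i'), ('i', 'k'): (-1, 'j'),
}

def mul(a, b):
    """Multiply two signed units, e.g. mul((1,'i'), (1,'j')) -> (1,'k')."""
    sa, ua = a
    sb, ub = b
    s, u = TABLE[(ua, ub)]
    return (sa * sb * s, u)

i, j, k = (1, 'i'), (1, 'j'), (1, 'k')
print(mul(i, j))          # (1, 'k'):  ij = k
print(mul(j, i))          # (-1, 'k'): ji = -k
print(mul(mul(i, j), k))  # (-1, '1'): ijk = -1
```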

Next we can scale each one to get arbitrary vectors in this imaginary space. Define:

\(p≜\ (i\ p_1+j\ p_2+k\ p_3)\),    and
\(q≜\ (i\ q_1+j\ q_2+k\ q_3)\)
Then distributing \(p\) over \(q\) or vice versa, we get:
\(pq\ =\)

\(\large \sum \cdot\ \)   \(i\,q_1\)          \(j\,q_2\)          \(k\,q_3\)
\(i\,p_1\)      \(i^2\,p_1 q_1\)    \(ij\,p_1 q_2\)     \(ik\,p_1 q_3\)
\(j\,p_2\)      \(ji\,p_2 q_1\)     \(j^2\,p_2 q_2\)    \(jk\,p_2 q_3\)
\(k\,p_3\)      \(ki\,p_3 q_1\)     \(kj\,p_3 q_2\)     \(k^2\,p_3 q_3\)

\(=\)

\(\large \sum \cdot\ \)   \(i\,q_1\)          \(j\,q_2\)          \(k\,q_3\)
\(i\,p_1\)      \(-p_1 q_1\)        \(k\,p_1 q_2\)      \(-j\,p_1 q_3\)
\(j\,p_2\)      \(-k\,p_2 q_1\)     \(-p_2 q_2\)        \(i\,p_2 q_3\)
\(k\,p_3\)      \(j\,p_3 q_1\)      \(-i\,p_3 q_2\)     \(-p_3 q_3\)
It seems like a mess, but you can go through it one cell at a time, confirming that every cell in the sum table is the product of its row label times its column label. Yes? Yes! Now you know everything in this table, and how it got there, and why, because \(p\), the sum of the row labels, times \(q\), the sum of the column labels, is the sum of all the cells in the table. It is all exactly as you expect. Check and make sure!
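For the skeptical, here's a Python sketch that does exactly that check (the helper names `qmul`, `qadd`, and `scale` are mine): it builds all nine cells as quaternion products of row label times column label and confirms their sum is the full product \(pq\).

```python
# Quaternions as 4-tuples (w, x, y, z) = w + xi + yj + zk.

def qmul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qadd(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(u, s):
    return tuple(s * c for c in u)

# Pure-imaginary vectors p = 2i+3j+5k and q = 7i+11j+13k (arbitrary).
p = (0, 2, 3, 5)
q = (0, 7, 11, 13)
units = [(0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]  # i, j, k

# Each cell is (row label)(column label): (unit_r p_r)(unit_c q_c).
total = (0, 0, 0, 0)
for r, pr in enumerate(p[1:]):
    for c, qc in enumerate(q[1:]):
        cell = qmul(scale(units[r], pr), scale(units[c], qc))
        total = qadd(total, cell)

print(total == qmul(p, q))  # True: the table's cells sum to pq
```

Nothing deep happens in the loop; it is just distributivity, cell by cell.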

So yes, that's the whole idea, in reality.

I want to call this the "Actual" or "Natural" or "Algebraic" product, because when you multiply \(pq\) that's what you actually get: all those cells in the whole table, added up. Does anyone have a better name for this kind of product? I haven't heard one. All the rows, times all the columns. Because that's what a product actually is.

Next: fun!

Notice that every cell in the Sum Table for \(pq\) has an \(i\), \(j\), or \(k\) in it except the diagonal cells, where the imaginaries all got cancelled because \(i^2=j^2=k^2=-1\). (That's called "contraction": going from what seems like two vectors to a real number.) We give those a special name: real numbers are scalars (which you use to scale up or down the different components of a vector), and scalars summed together are still a scalar, so add them all up and call it the scalar product (though it should be "the scalar part of the actual product"). They also invented another name, which we might call the (-)dot product. ("\(-\)" because of the squared imaginaries; but you can remove the \(-\) and call that the dot product, as they do when they define \(p\cdot q\) using \(pq^*\). Close enough.) Conveniently it's also the Pythagorean sum: not just \(a^2+b^2=r^2\) for the diagonal of a rectangle in a plane, but \(a^2+b^2+c^2=r^2\) for the diagonal across a box in 3D. That's good to remember (and you can keep going for diagonals of boxes of any number of dimensions).
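To see it in numbers (the function names here are mine, for illustration only): the diagonal cells sum to minus the familiar dot product, and \(p\) against itself gives minus the squared Pythagorean length.

```python
# Diagonal cells of the Sum Table: i^2 p1 q1 + j^2 p2 q2 + k^2 p3 q3,
# i.e. minus the ordinary dot product. p and q are plain 3-vectors.

def scalar_part(p, q):
    """Real part of the quaternion product of pure vectors p and q."""
    return -(p[0]*q[0] + p[1]*q[1] + p[2]*q[2])

def dot(p, q):
    return sum(a * b for a, b in zip(p, q))

p, q = (2, 3, 5), (7, 11, 13)
print(scalar_part(p, q))    # -112, which is -dot(p, q)
print(-scalar_part(p, p))   # 38 = 2^2 + 3^2 + 5^2: Pythagoras in 3D
```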

My point is, the dot product is not a separate thing; it's just the diagonal of this table, the cells without any imaginaries in them. That's what the so-called dot product a.k.a. scalar product a.k.a. inner product is. Inner because it is the innermost diagonal in the sum table. Then there are the "outer" parts, which are all the other cells in the table, the ones not on the main diagonal; guess what the "outer product" means?!

Yes. The cells with the imaginary-vector parts \(i\), \(j\), or \(k\) in them are all the cells off the diagonal, so we call their sum the outer product, or the vector product, or the cross product. Confusing that a sum of a bunch of things is called a cross product (because having to memorize its formula makes everyone cross, and if they are trying to have a good attitude about it, then it still makes their eyes cross). Confusing that they call it a vector product (because the \(i\)'s, \(j\)'s, and \(k\)'s are vectors in our 3D imaginary space), when it should be called the "vector part of the actual product". Similarly, the "outer product" is just the outer (off-diagonal) part of the actual product. Well, but that would only make sense.

Oh, here's another one. They like to say the cross product is \(p\times q ≜ i\begin{vmatrix} p_2 & p_3 \\ q_2 & q_3\end{vmatrix} - j\begin{vmatrix} p_1 & p_3 \\ q_1 & q_3\end{vmatrix} + k\begin{vmatrix} p_1 & p_2 \\ q_1 & q_2\end{vmatrix} \) and further define the "determinant" \( \begin{vmatrix} w & x \\ y & z \end{vmatrix} ≜ wz-xy\), which is pure notation-for-notation's-sake obfuscation, and has ended many a mathematical career. No one really explains what "determinant" means, geometrically or otherwise; we are forced to memorize this completely arbitrary set of indexes and imaginary numbers, and ugh. It's horrible, unfriendly, unnecessary, and such a pedagogical mistake.
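And indeed, the off-diagonal cells, grouped by which imaginary unit survives, reproduce that determinant formula with none of its mystery; here's a Python sketch (the function names are mine) comparing the two:

```python
# Off-diagonal cells of the Sum Table for pq, grouped by surviving
# unit, versus the textbook determinant formula for the cross product.

def vector_part(p, q):
    """i from the jk/kj cells, j from ki/ik, k from ij/ji."""
    p1, p2, p3 = p
    q1, q2, q3 = q
    return (p2*q3 - p3*q2,   # i: jk p2 q3 + kj p3 q2
            p3*q1 - p1*q3,   # j: ki p3 q1 + ik p1 q3
            p1*q2 - p2*q1)   # k: ij p1 q2 + ji p2 q1

def cross(p, q):
    """p x q = i|p2 p3; q2 q3| - j|p1 p3; q1 q3| + k|p1 p2; q1 q2|."""
    p1, p2, p3 = p
    q1, q2, q3 = q
    det = lambda w, x, y, z: w*z - x*y
    return (det(p2, p3, q2, q3), -det(p1, p3, q1, q3), det(p1, p2, q1, q2))

p, q = (2, 3, 5), (7, 11, 13)
print(vector_part(p, q))                 # (-16, 9, 1)
print(vector_part(p, q) == cross(p, q))  # True
```

Same six cells either way; the determinant is just a compressed way of writing down which cells pair up.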

When you realize the whole rigmarole is nothing more than the mere fact that the imaginaries all cancel on the diagonal -- leaving scalars -- and don't, off the diagonal -- leaving vector parts -- and that the scalar and vector parts of the Actual Product of two vectors can then be treated separately, then these unnecessary definitions go away, the obfuscation goes away, the intrinsic ideas become obvious and trivial, and we are done, ready to go on to the next thing.

Alternatively, without the Sum Table and the simple observation of cancelled imaginaries on the diagonal, the student has no idea how to make sense of this determinant term. After a whole career in math with excellent geometrical and algebraic intuitions about everything, now I have to know why it's an \(i\) and an index 2 and an index 3 going together, and why the others have those other patterns that they do, and I just have to memorize these pointless, meaningless symbols and terms and formulas, and just hope I pass the class -- having certainly learned my lesson to avoid things like this in the future.

That's what determinants and cross products do to mortal, finite-brained students. It is irresponsible -- you can make a case it is reprehensible -- to have dumped those unnecessary burdens on us all: the historical names, their memorization burden, their mystery reasons. So let's do better: just think of the Sum Table and the cancelled imaginaries on the diagonal, and that separates the scalar parts from the vector parts.

Okay, rant complete. When I figured this out, I had to get it off my chest. Thank you for your patience. I hope this was helpful to you and that you enjoyed yourself.

Feedback is welcome.
Copyright © 2000 - 2023 Thomas C. Veatch. All rights reserved.
Created: September 25, 2023