It's important to note that throughout this document, I've been representing
matrices and vectors using row vectors. Some people prefer to represent them as
column vectors, which changes the order of multiplication. Here's the difference:
Row vectors (used throughout this document):
\[(A_x, A_y, A_z, 1) \cdot \begin{bmatrix}x_x & x_y & x_z & 0 \\ y_x & y_y & y_z & 0 \\ z_x & z_y & z_z & 0 \\ T_x & T_y & T_z & 1\end{bmatrix} =(A_x x_x + A_y y_x + A_z z_x + T_x,\ \ \ A_x x_y + A_y y_y + A_z z_y + T_y,\ \ \ A_x x_z + A_y y_z + A_z z_z + T_z,\ \ \ 1)\]
Column vectors (commonly used elsewhere):
\[\begin{bmatrix}x_x & y_x & z_x & T_x\\ x_y & y_y & z_y & T_y \\ x_z & y_z & z_z & T_z\\ 0 & 0 & 0 & 1\end{bmatrix} \cdot \begin{bmatrix} A_x \\ A_y \\ A_z \\ 1\end{bmatrix}=\begin{bmatrix}A_x x_x + A_y y_x + A_z z_x + T_x \\ A_x x_y + A_y y_y + A_z z_y + T_y \\ A_x x_z + A_y y_z + A_z z_z + T_z \\ 1\end{bmatrix}\]
Notice that A is now written vertically, and that each column of the matrix is now a vector.
Also, the order of multiplication is flipped. Applications such as Maya and Houdini, as well as
DirectX, use row vectors. OpenGL and Unity3D use column vectors, and you'll
frequently see matrices and vectors written in column-vector form in mathematical
documents as well as on Wikipedia.
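To make the two conventions concrete, here's a small plain-JavaScript sketch (the helper names `transformRow` and `transformColumn` are mine, not from any library) showing that multiplying a row vector on the left of a matrix gives the same result as multiplying a column vector on the right of that matrix's transpose:

```javascript
// A 4x4 transform in row-vector form: the basis vectors x, y, z occupy
// the first three rows and the translation T occupies the last row.
const rowMatrix = [
  [1, 0, 0, 0],   // x axis
  [0, 1, 0, 0],   // y axis
  [0, 0, 1, 0],   // z axis
  [5, 6, 7, 1],   // translation T
];

// Row-vector convention: result = A * M (the point A is on the left).
function transformRow(a, m) {
  const out = [0, 0, 0, 0];
  for (let col = 0; col < 4; col++) {
    for (let i = 0; i < 4; i++) out[col] += a[i] * m[i][col];
  }
  return out;
}

// Column-vector convention: result = M * A (the point A is on the right).
function transformColumn(m, a) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    for (let i = 0; i < 4; i++) out[row] += m[row][i] * a[i];
  }
  return out;
}

// Transpose turns the row-vector matrix into its column-vector equivalent.
const transpose = m => m[0].map((_, j) => m.map(row => row[j]));

const A = [1, 2, 3, 1];
console.log(transformRow(A, rowMatrix));               // [6, 8, 10, 1]
console.log(transformColumn(transpose(rowMatrix), A)); // [6, 8, 10, 1]
```

Both calls translate the point (1, 2, 3) by (5, 6, 7); the only difference is which side the vector sits on and whether the matrix has been transposed.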
Row vectors vs. column vectors can get especially confusing when you take into account how matrices
are implemented. Matrix implementations generally store a matrix as a single flat array of numbers,
and the layout of that array will be either row-major or column-major. Here's an example of both:
memory layout = ( 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15 )
Row-major:
\[M = \begin{bmatrix}0 & 1 & 2 & 3 \\ 4 & 5 & 6 & 7\\ 8 & 9 & 10 & 11 \\ 12 & 13 & 14 & 15\end{bmatrix}\]
Column-major:
\[M = \begin{bmatrix}0 & 4 & 8 & 12\\ 1 & 5 & 9 & 13 \\ 2 & 6 & 10 & 14 \\ 3 & 7 & 11 & 15\end{bmatrix}\]
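Here's a quick plain-JavaScript sketch of the indexing (the helper names are mine): the same flat array yields element (row, col) from a different index under each layout, which is exactly why reading one layout as the other transposes the matrix.

```javascript
// The same 16 numbers from the memory layout above.
const memory = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15];

// Row-major: element (row, col) lives at index row * 4 + col.
const rowMajor = (m, row, col) => m[row * 4 + col];

// Column-major: element (row, col) lives at index col * 4 + row.
const colMajor = (m, row, col) => m[col * 4 + row];

console.log(rowMajor(memory, 1, 2)); // 6  (second row, third column)
console.log(colMajor(memory, 1, 2)); // 9  (the transposed position)

// In general: rowMajor(m, r, c) === colMajor(m, c, r) for every r, c.
```

Compare with the two matrices above: 6 sits at row 1, column 2 of the row-major matrix, while 9 sits there in the column-major one.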
It can be confusing because row-major and column-major describe only the memory layout, not whether
the matrix itself is meant for row vectors or column vectors (though when people say row-major they
often also imply row vectors, and column-major implies column vectors). I ran into this
confusion while implementing these demos. The demos are
implemented with three.js, which uses WebGL (the browser implementation of OpenGL)
as its rendering engine. In the three.js documentation for the
Matrix4 constructor
it says the matrix is initialized "with the supplied row-major values".
My first attempt to populate my matrix was as follows:
new THREE.Matrix4(xx, xy, xz, 0, yx, yy, yz, 0, zx, zy, zz, 0, Tx, Ty, Tz, 1);
This would make sense if the matrix was in row vector format:
\[\begin{bmatrix}x_x & x_y & x_z & 0 \\ y_x & y_y & y_z & 0 \\ z_x & z_y & z_z & 0 \\ T_x & T_y & T_z & 1\end{bmatrix}\]
But what slipped my mind was that OpenGL represents its matrices in column-vector form, so the matrix
was actually transposed. The correct way to input my matrix into three.js is as follows (which is
actually column-major order for the way I'm representing my matrices):
new THREE.Matrix4(xx, yx, zx, Tx, xy, yy, zy, Ty, xz, yz, zz, Tz, 0, 0, 0, 1);
This is the correct row-major input when the matrix is in column-vector form:
\[\begin{bmatrix}x_x & y_x & z_x & T_x \\ x_y & y_y & z_y & T_y \\ x_z & y_z & z_z & T_z \\ 0 & 0 & 0 & 1\end{bmatrix}\]
When entering matrices manually, be aware of the difference between column-major and row-major input, as well as between column vectors and row vectors. Many texts and implementations assume you know which convention they are using, so you often have to figure it out yourself. For more information, take a look at these links: