Matrix addition is always commutative, because the elements are added elementwise. For instance:
| 3 2 |....| 5 47 |....| 3+2 2+47 |
| 1 4 | + | 3 33 | = | 1+3 4+33 |
| 3 3 |....| 12 1 |....| 3+12 3+1 |
Now, if you were to reverse the order of the addition, each entry would just be written with its addends swapped: the top right element would be 47+2 instead of 2+47, the bottom left element 12+3 instead of 3+12, and so on... ...but these are the same, because addition of scalars (the kind of numbers we're used to) is commutative: a+b = b+a.
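If you want to see this numerically, here's a minimal sketch using NumPy (my choice of library; any matrix library would do), with the matrices from the example above:

```python
import numpy as np

# The two 3-by-2 matrices from the example above.
A = np.array([[3, 2],
              [1, 4],
              [3, 3]])
B = np.array([[ 5, 47],
              [ 3, 33],
              [12,  1]])

# Adding in either order produces the same matrix, element for element.
print(np.array_equal(A + B, B + A))  # True
```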
However, this is not true for matrix multiplication. For one thing, if A is a matrix with 1 row and n columns, and B is a matrix with n rows and 1 column, then AB will have only one row and one column, whereas BA is an n-by-n matrix.
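You can check the shapes directly; a small sketch (assuming NumPy again, with n = 3 picked arbitrarily):

```python
import numpy as np

A = np.array([[1, 2, 3]])      # 1 row, n columns (n = 3)
B = np.array([[4], [5], [6]])  # n rows, 1 column

print((A @ B).shape)  # (1, 1) -- a single entry
print((B @ A).shape)  # (3, 3) -- an n-by-n matrix
```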
However, even if the matrices are both square matrices of the same dimension, matrix multiplication is not generally commutative. I don't know of any simple rule that tells you whether two particular matrices will commute, but here are some examples of cases where they do:
* Every invertible matrix commutes with its inverse: A * A^-1 = A^-1 * A = I
* Every matrix commutes with itself: AA = ... ...well, AA, obviously.
* The identity matrix commutes with all matrices of the same dimension as itself. So, for instance, if I is the n-by-n identity matrix, and A is any other n-by-n matrix, then IA = AI.
* The same goes for the zero matrix. 0A = A0 = 0
* Diagonal matrices [that is, square matrices where every element off the diagonal that runs from the top left corner to the bottom right corner is zero] of the same dimension commute. For instance (and see the quick check after this list):
| a 0 0 |....| d 0 0 |....| d 0 0 |....| a 0 0 |
| 0 b 0 | * | 0 e 0 | = | 0 e 0 | * | 0 b 0 |
| 0 0 c |....| 0 0 f |....| 0 0 f |....| 0 0 c |
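Here's a quick numerical check of a couple of those cases (a sketch with NumPy; the specific matrices are just ones I made up):

```python
import numpy as np

# An invertible matrix commutes with its inverse; both products give I.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, A_inv @ A))  # True

# Diagonal matrices of the same dimension commute.
D1 = np.diag([1.0, 2.0, 3.0])  # a, b, c on the diagonal
D2 = np.diag([4.0, 5.0, 6.0])  # d, e, f on the diagonal
print(np.array_equal(D1 @ D2, D2 @ D1))  # True

# ...whereas two square matrices picked at random usually don't commute:
P = np.array([[1, 2], [3, 4]])
Q = np.array([[0, 1], [1, 0]])
print(np.array_equal(P @ Q, Q @ P))  # False
```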
Both matrix addition and matrix multiplication are, however, associative.
Matrix addition is associative because the addition is done elementwise, and addition of scalars is associative.
Showing that matrix multiplication is associative is more complicated, but quite possible. I suggest you try proving it yourself as a fun challenge, or you can see a proof here: http://mathrefresher.blogspot.com/2007/04/properties-of-matrix-multiplication.html
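If you just want to convince yourself before attempting the proof, here's a numerical spot-check (a sketch with NumPy and some arbitrary random matrices; of course, a few examples aren't a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.integers(0, 10, size=(3, 3)) for _ in range(3))

# Addition is associative: (A + B) + C == A + (B + C)
print(np.array_equal((A + B) + C, A + (B + C)))  # True

# Multiplication is associative: (AB)C == A(BC)
print(np.array_equal((A @ B) @ C, A @ (B @ C)))  # True
```

(Integer matrices are used here so the comparison is exact; with floating-point entries you'd compare with np.allclose instead.)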
A zero matrix is a matrix in which every element is 0 (it doesn't have to be square, though the square ones are the most common). It's often written as a boldface 0 with a subscript n, with n denoting that it is the n-by-n zero matrix. For instance,
.....| 0 0 0 |
0₃ = | 0 0 0 |
.....| 0 0 0 |
is the 3-by-3 zero matrix.
It fills much the same function as the number 0 does in scalar multiplication: 0A = A0 = 0 [assuming, that is, that A has the right dimensions to be multiplied by this particular zero matrix; you can't multiply a 5-by-5 matrix by a 3-by-3 zero matrix, for instance].
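A quick sketch of that behavior (NumPy again; the particular A is arbitrary):

```python
import numpy as np

Z = np.zeros((3, 3))            # the 3-by-3 zero matrix
A = np.arange(9).reshape(3, 3)  # some 3-by-3 matrix

print(np.array_equal(Z @ A, Z))  # True: 0A = 0
print(np.array_equal(A @ Z, Z))  # True: A0 = 0
```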
The identity matrix is a square matrix with ones on the diagonal and zeros everywhere else. It's often denoted by an I followed by a subscript n, similar to the zero matrix. For instance,
.....| 1 0 0 |
I₃ = | 0 1 0 |
.....| 0 0 1 |
is the 3-by-3 identity matrix.
It fills much the same function as the number 1 does in scalar multiplication: IA = AI = A for all matrices A [assuming, again, that A has the right dimensions to be multiplied by this particular identity matrix].
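And the same kind of sketch for the identity matrix (NumPy's np.eye builds it):

```python
import numpy as np

I = np.eye(3)                   # the 3-by-3 identity matrix
A = np.arange(9).reshape(3, 3)  # some 3-by-3 matrix

print(np.array_equal(I @ A, A))  # True: IA = A
print(np.array_equal(A @ I, A))  # True: AI = A
```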
I hope that helps!