In an *n*-dimensional space we may expand any vector $\mathbf{v}$ as a linear combination of basis vectors

$$\mathbf{v} = \sum_{i=1}^{n} v_i \hat{\mathbf{e}}_i \tag{80}$$

For a general vector space, the coefficients may be complex numbers; thus, for two vectors $\mathbf{a}$ and $\mathbf{b}$, we write

$$\mathbf{a} = \sum_{i=1}^{n} a_i \hat{\mathbf{e}}_i \tag{81}$$

and similarly

$$\mathbf{b} = \sum_{i=1}^{n} b_i \hat{\mathbf{e}}_i \tag{82}$$

The *scalar product* of two vectors $\mathbf{a}$ and $\mathbf{b}$ is a complex number denoted by

$$(\mathbf{a}, \mathbf{b}) \tag{83}$$

where we have used the standard linear-algebra notation. If we also require that

$$(\mathbf{a}, \mathbf{b}) = (\mathbf{b}, \mathbf{a})^{*} \tag{84}$$

then it follows that

$$(\mathbf{a}, \mathbf{a}) = (\mathbf{a}, \mathbf{a})^{*}, \tag{85}$$

i.e., $(\mathbf{a}, \mathbf{a})$ is real.

We also require that

$$(\mathbf{a}, \mathbf{a}) \geq 0, \tag{86}$$

with equality holding only when $\mathbf{a}$ is the null vector.

If the scalar product vanishes (and if neither vector in the product is the null vector) then the two vectors are orthogonal.

Generally the basis is chosen to be orthonormal, such that

$$(\hat{\mathbf{e}}_i, \hat{\mathbf{e}}_j) = \delta_{ij} \tag{87}$$

In this case, we can write the scalar product of two arbitrary vectors as

$$\begin{aligned}
(\mathbf{a}, \mathbf{b}) &= \Big( \sum_i a_i \hat{\mathbf{e}}_i, \sum_j b_j \hat{\mathbf{e}}_j \Big) \\
&= \sum_{ij} a_i^{*} b_j \, (\hat{\mathbf{e}}_i, \hat{\mathbf{e}}_j) \\
&= \sum_i a_i^{*} b_i
\end{aligned} \tag{88}$$

This can also be written in vector notation as

$$(\mathbf{a}, \mathbf{b}) = \mathbf{a}^{\dagger}\mathbf{b} =
(a_1^{*} \; a_2^{*} \; \cdots \; a_n^{*})
\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} \tag{89}$$
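In a finite orthonormal basis the scalar product is just the conjugate-transpose ("dagger") product of component arrays. A minimal NumPy sketch of Eqs. (84), (86), (88), and (89), using arbitrary illustrative vectors (NumPy itself is of course not part of these notes):

```python
import numpy as np

# Two vectors with complex components in an orthonormal basis
a = np.array([1 + 2j, 3 - 1j, 0 + 1j])
b = np.array([2 - 1j, 1 + 0j, 4 + 3j])

# Scalar product (a, b) = sum_i a_i^* b_i  -- eq. (88)
scalar = np.sum(np.conj(a) * b)

# Equivalent "dagger" form a^dagger b  -- eq. (89)
scalar_dagger = np.vdot(a, b)   # np.vdot conjugates its first argument
assert np.isclose(scalar, scalar_dagger)

# Property (84): (a, b) = (b, a)^*
assert np.isclose(scalar, np.conj(np.vdot(b, a)))

# Property (86): (a, a) is real and non-negative
norm_sq = np.vdot(a, a)
assert np.isclose(norm_sq.imag, 0.0) and norm_sq.real >= 0
```

Note that `np.vdot` conjugates its first argument, which is exactly the $\mathbf{a}^{\dagger}\mathbf{b}$ convention used here.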

It is useful at this point to introduce Dirac's bra-ket notation. We define a ``bra'' as

$$\langle a| = (a_1^{*} \; a_2^{*} \; \cdots \; a_n^{*}) \tag{90}$$

and a ``ket'' as

$$|b\rangle = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} \tag{91}$$

A bra to the left of a ket implies a scalar product, so

$$\langle a|b\rangle = \sum_i a_i^{*} b_i = (\mathbf{a}, \mathbf{b}) \tag{92}$$

Sometimes in superficial treatments of Dirac notation, the symbol $\langle a|b\rangle$ is defined alternatively as

$$\langle a|b\rangle = \int a^{*}(x)\, b(x)\, dx \tag{93}$$

This is equivalent to the above definition if we make the connections $a_i \rightarrow a(x)$ and $b_i \rightarrow b(x)$. This means that our basis vectors are labeled by every possible value of $x$,

$$\hat{\mathbf{e}}_i \rightarrow |x\rangle, \tag{94}$$

so that the sum over the discrete index $i$ becomes an integral over the continuous variable $x$.

Now we turn our attention to matrix representations of operators. An operator $\hat{A}$ can be characterized by its effect on the basis vectors. The action of $\hat{A}$ on a basis vector $\hat{\mathbf{e}}_j$ yields some new vector, which can be expanded in terms of the basis vectors so long as we have a complete basis set:

$$\hat{A}\,\hat{\mathbf{e}}_j = \sum_{i=1}^{n} \hat{\mathbf{e}}_i\, A_{ij} \tag{95}$$

If we know the effect of $\hat{A}$ on the basis vectors, then we know the effect of $\hat{A}$ on any arbitrary vector because of the linearity of $\hat{A}$:

$$\begin{aligned}
\hat{A}\mathbf{b} &= \hat{A}\Big( \sum_j b_j \hat{\mathbf{e}}_j \Big)
= \sum_j b_j\, \hat{A}\hat{\mathbf{e}}_j
= \sum_j b_j \sum_i \hat{\mathbf{e}}_i A_{ij} \\
&= \sum_i \hat{\mathbf{e}}_i \Big( \sum_j A_{ij} b_j \Big)
\end{aligned} \tag{96}$$

or

$$b_i' = \sum_j A_{ij} b_j, \tag{97}$$

where $b_i'$ are the components of $\mathbf{b}' = \hat{A}\mathbf{b}$.

This may be written in matrix notation as

$$\begin{pmatrix} b_1' \\ b_2' \\ \vdots \\ b_n' \end{pmatrix} =
\begin{pmatrix}
A_{11} & A_{12} & \cdots & A_{1n} \\
A_{21} & A_{22} & \cdots & A_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
A_{n1} & A_{n2} & \cdots & A_{nn}
\end{pmatrix}
\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} \tag{98}$$
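In a fixed orthonormal basis, applying the operator is therefore just matrix-vector multiplication. A small sketch of Eqs. (97) and (98), with an arbitrary illustrative matrix and vector:

```python
import numpy as np

# Matrix representation A_ij of some operator in an orthonormal basis
A = np.array([[1 + 0j, 2 - 1j],
              [0 + 1j, 3 + 0j]])
b = np.array([1 + 1j, 2 + 0j])

# b'_i = sum_j A_ij b_j  -- eqs. (97) and (98)
b_prime = A @ b

# The same components written as the explicit sum of eq. (97)
b_prime_sum = np.array([sum(A[i, j] * b[j] for j in range(2))
                        for i in range(2)])
assert np.allclose(b_prime, b_prime_sum)
```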

We can obtain the coefficients $A_{ij}$ by taking the scalar product of $\hat{\mathbf{e}}_i$ with $\hat{A}\hat{\mathbf{e}}_j$:

$$\begin{aligned}
(\hat{\mathbf{e}}_i, \hat{A}\hat{\mathbf{e}}_j) &= \Big( \hat{\mathbf{e}}_i, \sum_k \hat{\mathbf{e}}_k A_{kj} \Big) \\
&= \sum_k A_{kj}\, (\hat{\mathbf{e}}_i, \hat{\mathbf{e}}_k) \\
&= A_{ij}
\end{aligned} \tag{99}$$

since $(\hat{\mathbf{e}}_i, \hat{\mathbf{e}}_k) = \delta_{ik}$ due to the orthonormality of the basis. In bra-ket notation, we may write

$$A_{ij} = \langle i|\hat{A}|j\rangle \tag{100}$$

where $|i\rangle$ and $|j\rangle$ are shorthand for the basis vectors $\hat{\mathbf{e}}_i$ and $\hat{\mathbf{e}}_j$.
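Equation (100) can be checked numerically: sandwiching the matrix between standard basis vectors recovers the element $A_{ij}$. A sketch with an arbitrary illustrative matrix (the helper `ket` is ours, not part of the notes):

```python
import numpy as np

A = np.array([[1 + 1j, 2 + 0j, 0 - 3j],
              [4 + 0j, 5 - 2j, 6 + 0j],
              [0 + 7j, 8 + 0j, 9 + 1j]])
n = A.shape[0]

def ket(j):
    """|j>: the j-th standard (orthonormal) basis column vector."""
    e = np.zeros(n, dtype=complex)
    e[j] = 1.0
    return e

# A_ij = <i|A|j>  -- eq. (100); vdot conjugates <i|, harmless for real e_i
recovered = np.array([[np.vdot(ket(i), A @ ket(j)) for j in range(n)]
                      for i in range(n)])
assert np.allclose(recovered, A)
```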
It is easy to show that for a linear operator $\hat{A}$, the inner product $(\mathbf{a}, \hat{A}\mathbf{b})$ for two general vectors (not necessarily basis vectors) $\mathbf{a}$ and $\mathbf{b}$ is given by

$$(\mathbf{a}, \hat{A}\mathbf{b}) = \sum_{ij} a_i^{*} A_{ij} b_j \tag{101}$$

or in matrix notation

$$(\mathbf{a}, \hat{A}\mathbf{b}) =
(a_1^{*} \; a_2^{*} \; \cdots \; a_n^{*})
\begin{pmatrix}
A_{11} & A_{12} & \cdots & A_{1n} \\
A_{21} & A_{22} & \cdots & A_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
A_{n1} & A_{n2} & \cdots & A_{nn}
\end{pmatrix}
\begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} \tag{102}$$

By analogy to equation (93), we may generally write this inner product in the form

$$\langle a|\hat{A}|b\rangle = \int a^{*}(x)\, \hat{A}\, b(x)\, dx \tag{103}$$

Previously, we noted that $(\mathbf{a}, \mathbf{b}) = (\mathbf{b}, \mathbf{a})^{*}$, or $\langle a|b\rangle = \langle b|a\rangle^{*}$.
Thus we can see also that

$$(\mathbf{a}, \hat{A}\mathbf{b}) = (\hat{A}\mathbf{b}, \mathbf{a})^{*} \tag{104}$$

We now define the *adjoint* of an operator $\hat{A}$, denoted $\hat{A}^{\dagger}$, as that linear operator for which

$$(\mathbf{a}, \hat{A}\mathbf{b}) = (\hat{A}^{\dagger}\mathbf{a}, \mathbf{b}) \tag{105}$$

That is, we can make an operator act backwards into ``bra'' space if we take its adjoint:

$$\int a^{*}(x)\, \hat{A}\, b(x)\, dx = \int \left[ \hat{A}^{\dagger} a(x) \right]^{*} b(x)\, dx \tag{106}$$

or, in bra-ket notation,

$$\langle a|\hat{A}b\rangle = \langle \hat{A}^{\dagger}a|b\rangle \tag{107}$$

If we pick $\mathbf{a} = \hat{\mathbf{e}}_i$ and $\mathbf{b} = \hat{\mathbf{e}}_j$ (i.e., if we pick two basis vectors), then we obtain

$$\begin{aligned}
(\hat{A}^{\dagger})_{ij} = (\hat{\mathbf{e}}_i, \hat{A}^{\dagger}\hat{\mathbf{e}}_j)
&= (\hat{A}^{\dagger}\hat{\mathbf{e}}_j, \hat{\mathbf{e}}_i)^{*} \\
&= (\hat{\mathbf{e}}_j, \hat{A}\hat{\mathbf{e}}_i)^{*} \\
&= A_{ji}^{*}
\end{aligned} \tag{108}$$

But this is precisely the condition relating the elements of a matrix and its adjoint! Thus the adjoint of the matrix representation of $\hat{A}$ is the same as the matrix representation of $\hat{A}^{\dagger}$.
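The correspondence can be checked numerically: the adjoint is the conjugate transpose of the matrix, and it satisfies the defining relation (105). A sketch with arbitrary random values (the seed is only for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
a = rng.normal(size=n) + 1j * rng.normal(size=n)
b = rng.normal(size=n) + 1j * rng.normal(size=n)

# Adjoint = conjugate transpose: (A^dagger)_ij = A_ji^*  -- eq. (108)
A_dag = A.conj().T

# Defining property (105): (A^dagger a, b) = (a, A b)
lhs = np.vdot(A_dag @ a, b)   # vdot conjugates its first argument
rhs = np.vdot(a, A @ b)
assert np.isclose(lhs, rhs)
```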

This correspondence between operators and their matrix representations
goes quite far, although of course the specific matrix representation
depends on the choice of basis. For instance, we know from linear
algebra that if a matrix and its adjoint are the same, then the matrix
is called Hermitian. The same is true of the operators; if

$$\hat{A} = \hat{A}^{\dagger}, \tag{109}$$

then $\hat{A}$ is a Hermitian operator, and all of the special properties of Hermitian operators apply to $\hat{A}$ or its matrix representation.
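Among those special properties are real eigenvalues and an orthonormal set of eigenvectors. A quick NumPy check with an arbitrary illustrative Hermitian matrix:

```python
import numpy as np

# Build a Hermitian matrix: H = M + M^dagger satisfies H = H^dagger
M = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 - 4j]])
H = M + M.conj().T
assert np.allclose(H, H.conj().T)

# eigh is NumPy's eigensolver for Hermitian matrices
eigvals, eigvecs = np.linalg.eigh(H)

# Eigenvalues of a Hermitian operator are real (eigh returns them as floats)
assert np.allclose(np.imag(eigvals), 0.0)

# eigh returns an orthonormal set of eigenvectors: V^dagger V = I
assert np.allclose(eigvecs.conj().T @ eigvecs, np.eye(2))
```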