**Def**. (Group Representation) Let $G$ be a group, and let $V$ be a finite dimensional vector space. Then a group representation of $G$ is a homomorphism

$$\rho : G \longrightarrow GL(V),$$

where $GL(V)$ denotes the group of invertible linear transformations of $V$. The group structure on $GL(V)$ is given by function composition.

Note: for most of our applications, $V = \mathbb{R}^2$ or, perhaps, $V = \mathbb{R}^n$ or $\mathbb{C}^n$. In the first case we will have

$$GL(V) \cong GL_2(\mathbb{R}) = \{\, 2 \times 2 \text{ invertible matrices with real entries} \,\}.$$

This topic requires us to better (and more quickly!) understand homomorphisms, vector spaces, invertible matrices / vector space isomorphisms, and some other topics, too. To get started on these, let’s consider the following exercises.

**Problem 1**. Let $G$ and $H$ be groups (each with their own group operations), and suppose $\varphi : G \to H$ is a group homomorphism.

(a) Prove that **the kernel** of $\varphi$, which is defined as

$$\ker(\varphi) = \{\, g \in G : \varphi(g) = e_H \,\}$$

(where $e_H$ is the identity element in $H$), is a subgroup of $G$. (Recall that we do not have to check that $G$'s multiplication is associative for the proposed group $\ker(\varphi)$; instead, we need to check that $e_G \in \ker(\varphi)$, that this set is closed under products, and that it is closed under taking inverses.)

(b) Let $H = GL_n(\mathbb{R})$. Explain why the function $\rho : G \to GL_n(\mathbb{R})$ given by

$$\rho(g) = I_n$$

(for all $g \in G$, where $I_n$ is the $n \times n$ identity matrix) is a representation of $G$. Moreover, explain why this is called the **trivial representation**.
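As a quick sanity check on part (b), the homomorphism property $\rho(gh) = \rho(g)\rho(h)$ can be verified mechanically for a small concrete group. The sketch below (a hypothetical illustration; the choice of $G = \mathbb{Z}/4\mathbb{Z}$ and $n = 3$ is ours, not part of the problem) does exactly that:

```python
import numpy as np

# Trivial representation rho(g) = I_n, modeled for the group Z/4Z
# (a hypothetical choice of G) with n = 3.
n = 3
G = [0, 1, 2, 3]  # elements of Z/4Z; the group operation is addition mod 4

def rho(g):
    return np.eye(n)  # every group element maps to the identity matrix

# Homomorphism check: rho(g + h mod 4) == rho(g) @ rho(h) for all g, h.
# It holds trivially, since I_n @ I_n = I_n.
for g in G:
    for h in G:
        assert np.array_equal(rho((g + h) % 4), rho(g) @ rho(h))
```

The check passes for any group whatsoever, which is one way to see why this representation is called "trivial": it discards all of $G$'s structure.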

(c) Prove that **the image** of $\varphi$, which is defined as

$$\mathrm{im}(\varphi) = \{\, \varphi(g) : g \in G \,\},$$

is a subgroup of $H$.
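To make the kernel and image concrete, here is a small computed example (hypothetical, chosen by us for illustration): the sign homomorphism $\varphi : S_3 \to \{+1, -1\}$, whose kernel turns out to be the subgroup $A_3$ of even permutations and whose image is all of $\{+1, -1\}$.

```python
from itertools import permutations

# Illustration of Problem 1 (hypothetical example): the sign homomorphism
# phi : S_3 -> {+1, -1}, phi(sigma) = sign(sigma).
def sign(perm):
    # The sign is (-1)^(number of inversions).
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

S3 = list(permutations(range(3)))

# Kernel: permutations sent to the identity of {+1, -1}, which is +1.
kernel = [p for p in S3 if sign(p) == 1]
# Image: the set of all values phi takes.
image = {sign(p) for p in S3}

print(kernel)  # the three even permutations, i.e. the subgroup A_3
print(image)   # {1, -1}
```

One can check by hand that `kernel` is closed under composition and inverses, in line with part (a), and that `image` is a subgroup of the multiplicative group $\{+1, -1\}$, in line with part (c).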

**Problem 2**. Consider two matrices $A, P \in GL_2(\mathbb{R})$. Prove that the **trace** $\mathrm{tr}(A) = a_{11} + a_{22}$ is invariant under conjugation; that is, prove that

$$\mathrm{tr}\!\left(P A P^{-1}\right) = \mathrm{tr}(A).$$

Note: given the (relatively) small size of each matrix involved, this will likely be easiest to prove using an explicit formula for the inverse of a $2 \times 2$ matrix.
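Before writing the proof, it can be reassuring to test the claim numerically. The snippet below (a sanity check, not a proof) conjugates random $2 \times 2$ matrices and compares traces:

```python
import numpy as np

# Numerical sanity check of Problem 2: for random invertible 2x2 matrices
# A and P, tr(P A P^{-1}) should equal tr(A) up to floating-point error.
rng = np.random.default_rng(0)

for _ in range(100):
    A = rng.standard_normal((2, 2))
    P = rng.standard_normal((2, 2))
    if abs(np.linalg.det(P)) < 1e-8:   # skip the (rare) near-singular draws
        continue
    conjugated = P @ A @ np.linalg.inv(P)
    assert np.isclose(np.trace(conjugated), np.trace(A))
```

Of course, 100 random examples prove nothing; the algebraic proof via the explicit $2 \times 2$ inverse formula is still required.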

It is not too difficult to show that the property of traces of $2 \times 2$ matrices from Problem 2 also holds for traces of $n \times n$ matrices; let us take this as a given for the time being.

Conjugating a given matrix, $A$, by a second matrix, $P$, to form the strange-looking product

$$P A P^{-1}$$

is, from the point of view of Linear Algebra, a very natural process. The new matrix that results from this so-called conjugation is said to be **similar to** the original matrix $A$, and it encodes much of the same information that the matrix $A$ does.

In Linear Algebra, we learn to view such a conjugation process as a “change of basis” process. This process works as follows: Initially, the invertible transformation $T : \mathbb{R}^n \to \mathbb{R}^n$ is represented by the matrix $A$ in terms of how $T$ transforms the standard basis vectors $e_1, \ldots, e_n$. In particular, we have that

$$A = \begin{pmatrix} | & & | \\ T(e_1) & \cdots & T(e_n) \\ | & & | \end{pmatrix}.$$

Next, the invertible matrix $P$ is used to create a new (usually non-standard) basis for $\mathbb{R}^n$ given by $v_1 = P e_1, \ldots, v_n = P e_n$. We then build a matrix $B$ that encodes how the original transformation $T$ acts on $\mathbb{R}^n$ by writing down how it transforms each new basis vector:

$$B = P^{-1} A P = \begin{pmatrix} | & & | \\ [T(v_1)]_{\mathcal{B}} & \cdots & [T(v_n)]_{\mathcal{B}} \\ | & & | \end{pmatrix},$$

where $[T(v_j)]_{\mathcal{B}}$ denotes the coordinates of $T(v_j)$ with respect to the new basis $\mathcal{B} = \{v_1, \ldots, v_n\}$.

This is, as it may seem, a fairly significant topic in Linear Algebra, and there are details to be clarified (what, exactly, is a basis? how, exactly, are the columns expressed as columns of numbers?), but we can nonetheless focus on the central conclusion: *conjugation by invertible matrices is a natural process*.
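The change-of-basis description above can be checked on a small example. In the sketch below (the matrices $A$ and $P$ are hypothetical choices made for illustration), the columns of $P$ form the new basis, and re-expressing $T(v_j)$ in that basis recovers the $j$-th column of $P^{-1} A P$:

```python
import numpy as np

# Change-of-basis sketch (hypothetical matrices chosen for illustration).
# T is represented by A in the standard basis; the columns of P form the
# new basis, and B = P^{-1} A P represents T in that new basis.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # new basis: v1 = (1, 0), v2 = (1, 1)

B = np.linalg.inv(P) @ A @ P

# Consistency check: applying T to v_j (the j-th column of P) and
# re-expressing the result in the new basis gives the j-th column of B.
for j in range(2):
    v = P[:, j]
    coords = np.linalg.solve(P, A @ v)   # coordinates of T(v_j) in basis P
    assert np.allclose(coords, B[:, j])

print(B)
```

In this particular example $B$ comes out diagonal, because the chosen basis vectors happen to be eigenvectors of $A$; a well-chosen conjugation can simplify a matrix dramatically.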

In fact, since the trace of a matrix is invariant under this process, this reveals to us that the trace is a more natural and important concept than one may first think (after all, it's defined via a cute-looking formula, nothing particularly deep or scary seeming). If we want to regard the matrices $A$ and $P A P^{-1}$ as “essentially the same,” or as “encoding the same basic information,” then the trace becomes our friend: something that does not detect the “artificial differences” we are more interested in overlooking.

**Problem 3**. Consider the set $GL_n(\mathbb{R})$ of invertible $n \times n$ matrices equipped with the relation $\sim$, where two matrices are related by

$$A \sim B \iff B = P A P^{-1} \text{ for some } P \in GL_n(\mathbb{R}).$$

Prove that $\sim$ is an equivalence relation. Note that this “breaks up” the original set $GL_n(\mathbb{R})$ into different equivalence classes, each class containing matrices that represent the same isomorphism or symmetry of $\mathbb{R}^n$.

We now return to our abstract group, $G$. If we have a representation $\rho : G \to GL_n(\mathbb{R})$, then we are able to use this representation to think of $G$'s elements as matrices (isomorphically) transforming some vector space (in our case, $\mathbb{R}^n$ or $\mathbb{C}^n$).

Because $\rho$ is a group homomorphism, and because conjugating matrices in $GL_n(\mathbb{R})$ is a natural or fundamental process, this means that conjugating the original elements of $G$ should be a natural process, too. In particular, given $g, h \in G$, the **conjugate of $g$ by $h$** is the group element

$$h g h^{-1}.$$

**Def.** A **character** of a group $G$ is a function $\chi : G \to \mathbb{R}$ given by a representation $\rho : G \to GL_n(\mathbb{R})$ according to the following rule:

$$\chi(g) = \mathrm{tr}(\rho(g)).$$

For every element $g \in G$, the homomorphism $\rho$ converts it into some matrix $\rho(g)$, and then we obtain a real number by computing the trace of that matrix. Note that by Problem 2 (and its extension to $n \times n$ matrices), a character takes the same value on an element $g$ as it does on a conjugated element $h g h^{-1}$. We rephrase this in Problems 4 and 5 below.

**Problem 4**. Let $G$ be a group. Define a relation $\sim$ on $G$ by defining

$$g \sim g' \iff g' = h g h^{-1} \text{ for some } h \in G.$$

Prove that $\sim$ is an equivalence relation.

Note: the equivalence classes of $\sim$ are called the **conjugacy classes** of $G$.

**Problem 5**. Explain why / prove that the following statement is true:

**Characters are constant on $G$'s conjugacy classes**.
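The statement of Problem 5 can be observed concretely. In the sketch below (a hypothetical worked example, not a proof), $S_3$ acts on $\mathbb{R}^3$ by permutation matrices, the character $\chi(\sigma) = \mathrm{tr}(\rho(\sigma))$ counts the fixed points of $\sigma$, and we check that conjugation never changes the trace:

```python
import numpy as np
from itertools import permutations

# Illustrative check of Problem 5 (hypothetical example): the permutation
# representation of S_3 by 3x3 permutation matrices.
def rho(perm):
    n = len(perm)
    M = np.zeros((n, n))
    for i, j in enumerate(perm):
        M[j, i] = 1.0          # rho(perm) sends e_i to e_{perm(i)}
    return M

def chi(perm):
    # The trace of a permutation matrix counts the fixed points of perm.
    return np.trace(rho(perm))

S3 = list(permutations(range(3)))

# Conjugating sigma by tau, computed on the matrix side: the trace of
# rho(tau) rho(sigma) rho(tau)^{-1} always agrees with chi(sigma).
for sigma in S3:
    for tau in S3:
        conj = rho(tau) @ rho(sigma) @ np.linalg.inv(rho(tau))
        assert np.isclose(np.trace(conj), chi(sigma))
```

Here $\chi$ takes the value $3$ on the identity, $1$ on each transposition, and $0$ on each $3$-cycle: exactly one value per conjugacy class, as Problem 5 predicts.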

We still need to discuss many more aspects of representations and characters, but for now the table is somewhat set for some big ideas. In particular, we should be in a position to better understand and answer the following questions:

- What is a character table?
- What is a reducible representation?
- What is an irreducible representation?
- Exactly how do the entries of a character table encode the group under consideration?