Other views of relations
Relation Algebra
The idea of relation algebra is to take relations themselves as objects and to reason about certain operations defined on them. If we think of binary relations in the set-theoretic way as sets of ordered pairs of items from some set X, it is natural to define a number of simple concepts:
- Intersection: The intersection R ∩ S of two relations R and S holds between any two objects x and y if and only if both Rxy and Sxy.
- Union: R ∪ S is the relation that relates x and y if and only if Rxy OR Sxy.
- Complement: The complement R̄ is the relation of not being related by R: R̄xy if and only if Rxy does not hold. As a set of pairs, it has to be defined relative to the universe X.
- Converse: R˘ reverses the direction of a relation. That is, for any x and y, R˘xy if and only if Ryx. Obviously, every relation has a converse. Note well the difference between converse and complement.
- Composition: R º S relates x and y if and only if SOMEz (Rxz AND Szy). This allows relations to be strung together so as to get from one thing to another by going through a chain of intermediates standing in various relationships to each other.
- Zero: The empty relation 0 never holds between any things. Note that for any relation R, the compound R ∩ R̄ is bound to be 0.
- One: The full relation 1 holds everywhere: ALLx ALLy 1xy. Clearly R ∪ R̄ is the same as 1, for any relation R.
- Identity: I is the relation =. That is, {⟨x, x⟩: x ∈ X} or the relation everything bears to itself and to nothing else. The key property of I is that for any relation R, both I º R and R º I are identical with R.
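On a finite universe, the operations above can be tried out directly by representing a relation as a Python set of ordered pairs. The following is a minimal sketch; the universe X, the example relations, and the function names are my own choices, not part of the notes.

```python
# Binary relations on a finite universe X, as sets of ordered pairs.
X = {1, 2, 3}
ONE = {(x, y) for x in X for y in X}   # the full relation 1
ZERO = set()                            # the empty relation 0
I = {(x, x) for x in X}                 # the identity relation

def complement(R):
    # defined relative to the universe X
    return ONE - R

def converse(R):
    # R˘xy iff Ryx
    return {(y, x) for (x, y) in R}

def compose(R, S):
    # (R o S)xy iff some z has Rxz and Szy
    return {(x, y) for (x, z) in R for (w, y) in S if z == w}

R = {(1, 2), (2, 3)}
assert R & complement(R) == ZERO            # R ∩ R̄ = 0
assert R | complement(R) == ONE             # R ∪ R̄ = 1
assert compose(I, R) == R == compose(R, I)  # I is an identity for o
```

Intersection and union need no special code: Python's `&` and `|` on sets are exactly the set-theoretic operations the definitions call for.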
Many other concepts are easily defined in terms of these. To say that R is included in S, for example, usually written R ≤ S, is simply to say R ∩ S = R or equivalently R ∪ S = S. It is also useful to define a "power" notation to symbolise the result of iterating R a given number of times: R^{0} = I and for any k greater than or equal to zero, R^{k+1} = R^{k} º R.
Although the intended model of all this is the set of binary relations on a set, we can also consider abstract relation algebras, which are completely arbitrary structures on which are defined operations as above satisfying appropriate conditions or axioms which need not be detailed here. If you are interested, the references at the end of these notes [5, 8] will give you an entry point into the literature.
Relation algebra provides a beautifully concise notation for representing many properties of systems of relations, and by extension quite a lot of mathematics. Consider for example how to say that relation R is transitive. It could be written as in the last section, or using the vocabulary of relation algebra we may simply write it as R^{2} ≤ R. Again, to say that R is an equivalence relation is to say R = (R º R˘) ∪ I.
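Both algebraic characterisations can be checked mechanically on small examples. The sketch below, with helpers repeated so it is self-contained, tests R² ≤ R for transitivity and R = (R º R˘) ∪ I for equivalence; the example relations are my own.

```python
X = {1, 2, 3}
I = {(x, x) for x in X}

def compose(R, S):
    return {(x, y) for (x, z) in R for (w, y) in S if z == w}

def converse(R):
    return {(y, x) for (x, y) in R}

def is_transitive(R):
    # R is transitive iff R^2 <= R, i.e. R o R is included in R
    return compose(R, R) <= R

def is_equivalence(R):
    # R is an equivalence relation iff R = (R o R˘) ∪ I
    return R == compose(R, converse(R)) | I

assert is_transitive({(1, 2), (2, 3), (1, 3)})
assert not is_transitive({(1, 2), (2, 3)})         # missing (1, 3)
assert is_equivalence(I | {(1, 2), (2, 1)})        # two classes: {1,2}, {3}
assert not is_equivalence({(1, 2), (2, 3), (1, 3)})
```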
We shall not pursue this algebraic direction further in these notes, as it would take us too far from purely logical concerns. For present purposes it is sufficient to note that there is such an alternative approach to logic, which relates it to mainstream mathematics in ways that may not otherwise have been obvious.
Graph Theory
One notion which is not expressible in standard relation algebra, and not definable in first order logic, but which can be added in order to facilitate reasoning especially about processes, is the transitive closure of a relation. This is the result of iterating R indefinitely. We may think of it as the infinite union R ∪ R^{2} ∪ R^{3} ∪ R^{4} ... or as the smallest transitive relation which extends (i.e. includes) R. Even more useful is the reflexive transitive closure, usually written R*, which is the result of iterating zero or more times: I ∪ R ∪ R^{2} ∪ R^{3} ... or equivalently the intersection of all the quasi-orders that include R. Again equivalently, it is the smallest reflexive relation closed under the operation of composition with R. This notion of reachability by following the relation R is a central concern of another way of thinking about binary relations: graph theory [Bondy].
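On a finite universe the reflexive transitive closure can be computed by iterating composition with R until no new pairs appear, following the "smallest reflexive relation closed under composition with R" characterisation. A sketch, with illustrative names and an assumed example relation:

```python
X = {1, 2, 3, 4}
I = {(x, x) for x in X}

def compose(R, S):
    return {(x, y) for (x, z) in R for (w, y) in S if z == w}

def rt_closure(R):
    # R*: start from I ∪ R and keep composing with R until a fixed point
    # is reached (guaranteed to terminate on a finite universe)
    star = I | R
    while True:
        bigger = star | compose(star, R)
        if bigger == star:
            return star
        star = bigger

R = {(1, 2), (2, 3), (3, 4)}
assert (1, 4) in rt_closure(R)       # 4 is reachable from 1 via 2 and 3
assert (1, 1) in rt_closure(R)       # reflexive by construction
assert (4, 1) not in rt_closure(R)   # no arcs run backwards
```

Termination is why this works only for finite relations: each pass either adds a pair or stops, and there are finitely many pairs to add.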
[Figure: an undirected graph with 8 vertices and 10 edges]
A graph in this sense, usually referred to as a directed graph, is a set of nodes or vertices and a set of arcs or links between them. Each arc has a "head" and a "tail", these being the vertices it joins. A natural way to construe this is to think of an arc as the ordered pair of its head and its tail. A directed graph is then simply a binary relation, and any binary relation is a directed graph. In the case of a symmetric relation, the graph is undirected, since there is no preferred direction to the link between one vertex and another. The arcs of an undirected graph are usually called edges.
One nice thing about graphs is that we can draw them (or at least the small ones); see the figure above. This gives us a pictorial way of thinking about relations, allowing us to bring our visual imagination to bear on them.
When we think of a relation as a graph, we concentrate on concepts such as:
- which parts of the graph are connected components (joined together by the arcs) and which are disconnected;
- the degree of the graph, or how many arcs (at most) are incident on any one vertex;
- which vertices are reachable from which others by following the arcs;
- the lengths of shortest paths from one vertex to another;
- the existence and size of cycles (paths from a vertex to itself) or cliques (sets of vertices all linked to each other).
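Reachability and shortest paths, in particular, are easy to compute once a relation is viewed as a graph. The following is a minimal breadth-first-search sketch; the example arcs and function names are assumptions for illustration.

```python
from collections import deque

# an example relation, viewed as the arcs of a directed graph
arcs = {(1, 2), (2, 3), (1, 3), (3, 4), (5, 6)}

def neighbours(v):
    return {y for (x, y) in arcs if x == v}

def shortest_path_length(source, target):
    # breadth-first search: length of a shortest directed path from
    # source to target, or None if target is unreachable
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        v, d = queue.popleft()
        if v == target:
            return d
        for w in neighbours(v):
            if w not in seen:
                seen.add(w)
                queue.append((w, d + 1))
    return None

assert shortest_path_length(1, 4) == 2     # 1 -> 3 -> 4
assert shortest_path_length(1, 6) is None  # 6 lies in another component
```

Note that `shortest_path_length(x, y) is not None` holds exactly when ⟨x, y⟩ belongs to the reflexive transitive closure of the relation, tying the graph-theoretic picture back to R*.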
Graph theory, again, is not the subject of our present study. We merely note it at this point to emphasise that there are many ways to think about relations, and that several branches of mathematics (logic, algebra, graph theory) come together here.