Permutation Groups, Analysis of Boolean Functions, Finite Geometry, Coding Theory and Algebraic Graph Theory
Important mathematical concepts get reinvented many times. In my recent work with Yuval Filmus we explored objects that are called (in random order) Boolean degree 1 functions, Cameron-Liebler line classes, equitable partitions, completely regular strength 0 codes with covering radius 1, intriguing sets, perfect colorings, sets with dual width 1, tactical decompositions or tight sets — all depending on the context and who you ask. While the article with Yuval explains to some extent how these notions connect, a research article does not seem to be the right format to explain all concepts in sufficient detail. This post tries to amend this. It also prepares a future post which will elaborate on my research with Yuval in more detail.
Algebraic Graph Theory
Let $G$ be a graph on the vertex set $V$. The adjacency matrix $A$ of $G$ is the matrix indexed by the vertices of $G$ such that $A_{xy} = 1$ if $x$ and $y$ are adjacent and $A_{xy} = 0$ otherwise. In the following we stick to the assumption that $G$ is a regular graph, so every vertex has the same number $d$ of neighbors. This is equivalent to saying that the all-ones vector $\mathbf{1}$ is an eigenvector of $A$ (with eigenvalue $d$). The matrix $A$ is symmetric, so we find many other eigenspaces over the reals, and this is what this post is about.
We will mostly work with the following three examples throughout this document.
Example 1. Let $J(n,k)$ be the graph with the $k$-element subsets of an $n$-element set as vertices. Two vertices are adjacent if they meet in $k-1$ elements. This is the Johnson graph.
Example 2. Let $q$ be a prime power. Let $J_q(n,k)$ be the graph with the $k$-spaces of an $n$-dimensional vector space over $\mathbb{F}_q$ as its vertices. Two vertices are adjacent if they meet in a subspace of dimension $k-1$. This is the Grassmann graph or $q$-Johnson graph.
Example 3. Let $T_n$ be the graph with the elements of the symmetric group $S_n$ as vertices. We see $S_n$ as a permutation group acting naturally on $\{1, \ldots, n\}$. Two vertices $x, y$ are adjacent if $xy^{-1}$ has exactly $n-2$ fixed points. This graph is known as the transposition graph.
The graph on the same set of vertices, where two vertices $x, y$ are adjacent if $xy^{-1}$ has no fixed points, is called the derangement graph.
All eigenspaces of the first two examples were described in Philippe Delsarte’s PhD thesis in 1973. The third example is a Cayley graph on $S_n$, which means that it is well known that one can obtain its eigenvalues from the character table (so the representation theory) of $S_n$.
An equitable partition is a partition of the vertex set $V$ into parts $P_1, \ldots, P_m$ such that there exist constants $b_{ij}$ such that every vertex in $P_i$ has exactly $b_{ij}$ neighbors in $P_j$. Another word for equitable partition is perfect coloring. The central property of equitable partitions is the following: the eigenvalues of the $m \times m$ matrix $B = (b_{ij})$ are also eigenvalues of the adjacency matrix $A$. The existence of equitable partitions is a question that is investigated on its own for many interesting graphs, but I believe that the notion was invented as an easy way of determining eigenvalues of a graph.
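These definitions are easy to experiment with on a computer. Here is a small sketch in plain Python (all helper names are mine, not from any library) that builds a Johnson graph and checks whether a given partition is equitable, returning the quotient matrix $B$:

```python
from itertools import combinations

def johnson_graph(n, k):
    """Johnson graph J(n,k): vertices are k-subsets of {0,...,n-1},
    two subsets adjacent when they meet in k-1 elements."""
    verts = [frozenset(c) for c in combinations(range(n), k)]
    adj = {v: [w for w in verts if len(v & w) == k - 1] for v in verts}
    return verts, adj

def quotient_matrix(adj, parts):
    """Return B with B[i][j] = number of neighbors in parts[j] of any
    vertex of parts[i], or None if the partition is not equitable."""
    B = []
    for P in parts:
        row = None
        for v in P:
            counts = [sum(1 for w in adj[v] if w in Q) for Q in parts]
            if row is None:
                row = counts
            elif counts != row:
                return None  # two vertices of P disagree: not equitable
        B.append(row)
    return B

verts, adj = johnson_graph(6, 3)
P1 = {v for v in verts if 0 in v}      # k-sets containing a fixed element
P2 = {v for v in verts if 0 not in v}  # the remaining k-sets
B = quotient_matrix(adj, [P1, P2])
print(B)  # [[6, 3], [3, 6]]
```

For $J(6,3)$ the quotient matrix $[[6,3],[3,6]]$ has eigenvalues $9$ (the degree) and $3$, both of which are indeed eigenvalues of the graph.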
Warning: The term equitable partition is also used by Janos Pach for a completely different concept which I will not go into. For example this paper talks about this different usage of the word.
Example 4. Let $P_1$ be the set of all $k$-sets containing 1 and $P_2$ the set of all the remaining $k$-sets. Then $|P_1| = \binom{n-1}{k-1}$ and $|P_2| = \binom{n-1}{k}$. It is an easy exercise to verify that this is an equitable partition of $J(n,k)$ with $b_{11} = (k-1)(n-k)$, $b_{12} = n-k$, $b_{21} = k$, and $b_{22} = k(n-k) - k$. The eigenvalues of $B$ are $k(n-k)$ (this is the degree of the graph) and $(k-1)(n-k) - k$ — the second largest eigenvalue of $J(n,k)$.
Example 5. Let $P_1$ be the family of all $k$-sets containing 1 and 2, $P_2$ the family of all $k$-sets containing exactly one of 1 and 2, and $P_3$ the remaining $k$-sets. Again, we can easily verify that this is an equitable partition. This time $b_{11} = (k-2)(n-k)$, $b_{12} = 2(n-k)$, $b_{13} = 0$, $b_{21} = k-1$, $b_{22} = k(n-k) - n + 2$, $b_{23} = n-k-1$, $b_{31} = 0$, $b_{32} = 2k$, $b_{33} = k(n-k) - 2k$. The eigenvalues of $B$ are then $k(n-k)$, $(k-1)(n-k-1) - 1$ and $(k-2)(n-k-2) - 2$.
You see the pattern? So let’s jump ahead.
Example 6. Let $P_j$ be the family of all $k$-sets meeting the fixed $k$-set $\{1, \ldots, k\}$ in exactly $k-j$ elements. Then $P_0, \ldots, P_k$ is an equitable partition of $J(n,k)$. This partition directly yields all the eigenvalues of the Johnson graph, which are $(k-j)(n-k-j) - j$ for $j \in \{0, 1, \ldots, k\}$.
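As a sanity check, the eigenvalue formula can be verified by computing the quotient matrix of this partition for a small case and checking that $\det(B - \lambda I) = 0$ for each claimed eigenvalue $\lambda$. A sketch in plain Python for $J(6,3)$ (the helper names are mine):

```python
from itertools import combinations

n, k = 6, 3
verts = [frozenset(c) for c in combinations(range(n), k)]
adj = {v: [w for w in verts if len(v & w) == k - 1] for v in verts}

# Distance partition around the fixed vertex {0,...,k-1}:
# part j collects the k-sets meeting it in exactly k-j elements.
x = frozenset(range(k))
parts = [{v for v in verts if len(v & x) == k - j} for j in range(k + 1)]
B = []
for P in parts:
    rows = {tuple(sum(1 for w in adj[v] if w in Q) for Q in parts) for v in P}
    assert len(rows) == 1, "not equitable"
    B.append(list(rows.pop()))

def det(M):
    """Determinant by Laplace expansion (fine for tiny integer matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

eigs = [(k - j) * (n - k - j) - j for j in range(k + 1)]  # 9, 3, -1, -3
for lam in eigs:
    BlamI = [[B[i][j] - (lam if i == j else 0) for j in range(len(B))]
             for i in range(len(B))]
    assert det(BlamI) == 0  # each claimed value is an eigenvalue of B
print(eigs)  # [9, 3, -1, -3]
```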
You might have noticed that the last example just partitioned the vertices of $J(n,k)$ by their distance to a fixed vertex. This is a distance partition. This works because $J(n,k)$ is a distance-regular graph.
Example 7. Let $P_1$ be the set of $k$-spaces of $\mathbb{F}_q^n$ containing a fixed 1-space $p$ and let $P_2$ be the set of $k$-spaces not containing $p$. Then it is easy to see that we again have an equitable partition. The graph $J_q(n,k)$ is also distance-regular, so we can also choose the distance partition.
Example 8. Let $P_1$ be the set of permutations of $S_n$ which map $a$ to $b$ and let $P_2$ be the set of remaining permutations. Again this defines an equitable partition of the transposition graph $T_n$. Here $b_{11} = \binom{n-1}{2}$, $b_{12} = n-1$, $b_{21} = 1$, and $b_{22} = \binom{n}{2} - 1$.
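The constants in this example are easy to confirm by brute force. A sketch in plain Python for $n = 4$ (all names are mine; indices run from 0, so the partition fixes $a = 0$, $b = 1$):

```python
from itertools import permutations

n = 4
verts = list(permutations(range(n)))

def fixed_points(g):
    return sum(1 for i in range(n) if g[i] == i)

def compose(g, h):  # (g*h)(i) = g(h(i))
    return tuple(g[h[i]] for i in range(n))

def inverse(g):
    inv = [0] * n
    for i, gi in enumerate(g):
        inv[gi] = i
    return tuple(inv)

# x ~ y iff x * y^{-1} has exactly n-2 fixed points, i.e. is a transposition
adj = {x: [y for y in verts if y != x and
           fixed_points(compose(x, inverse(y))) == n - 2] for x in verts}

a, b = 0, 1
P1 = {g for g in verts if g[a] == b}  # permutations mapping a to b
P2 = {g for g in verts if g[a] != b}
B = []
for P in (P1, P2):
    rows = {tuple(sum(1 for w in adj[v] if w in Q) for Q in (P1, P2)) for v in P}
    assert len(rows) == 1, "not equitable"
    B.append(list(rows.pop()))
print(B)  # [[3, 3], [1, 5]] for n = 4
```

For $n = 4$ this matches $b_{11} = \binom{3}{2} = 3$, $b_{12} = 3$, $b_{21} = 1$, $b_{22} = \binom{4}{2} - 1 = 5$.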
In this case we cannot use the distance partition to obtain all the eigenvalues as the graph is not distance-regular. If we instead fix a vertex $x$ and partition the remaining vertices $y$ by the conjugacy class of $xy^{-1}$, then again we obtain an equitable partition.
I will not define association schemes here (see Chris Godsil’s nice notes on this for more details), but this seems to be the appropriate time to remark that when you can fix any vertex $x$ and partition the remaining vertices $y$ according to their relation to $x$ (by distance or conjugacy class or something similar) and this is an equitable partition, then this usually comes from an association scheme. This is an algebraic structure investigated in many ways by algebraists and used in many ways by combinatorialists.
Ordering the Eigenspaces
We saw that for $J(n,k)$ we have a natural ordering of the eigenspaces: we already know that the all-ones vector is an eigenvector, so this spans our trivial eigenspace $V_0$. If we take the equitable partition induced by partitioning into everything containing a fixed element $p$ and everything not containing $p$, then we obtain a second eigenvalue with its corresponding eigenspace $V_1$. Let $x_p$ denote the indicator vector of $p$, that is $(x_p)_S$ is 1 if $p \in S$ and 0 otherwise. Indeed, $V_0 + V_1$ is spanned by the vectors $x_p$. Next, $V_0 + V_1 + V_2$ is spanned by the indicator vectors of pairs of elements $\{p, p'\}$, $V_0 + \cdots + V_3$ by triples, and so on.
The graph $J_q(n,k)$ behaves in a very similar way, just that $p$ is a 1-space, the pair is replaced by a 2-space and so on: the sums $V_0 + \cdots + V_t$ are spanned by the indicator vectors of $t$-spaces. This is a general property of many highly symmetric graphs such as distance-regular graphs.
This no longer works for $S_n$ for various reasons, but it is still natural to denote the non-trivial eigenspace corresponding to the equitable partition in Example 8 by $V_1$ (or by some union of eigenspaces if it yields a partition into more parts). And there is a standard way of obtaining a partial ordering of the eigenspaces, starting with $V_0$ and $V_1$, but that will not concern us too much.
The Strength and Covering Radius of a Set
Let $Y$ be a subset of the vertices of our graph with some natural partial ordering of the eigenspaces $V_0, V_1, \ldots, V_d$. For simplicity assume that the graph is distance-regular and has a proper ordering of eigenspaces (a so-called $Q$-polynomial association scheme). We write $f$ for the characteristic vector of $Y$, so $f_x = 1$ if $x \in Y$ and $f_x = 0$ otherwise. Clearly, we can write $f$ as a sum of eigenvectors $f = f_0 + f_1 + \cdots + f_d$ with $f_i \in V_i$, maybe some of these equal to 0. The strength of $Y$ is the largest $s$ such that $f_i = 0$ for all $i$ with $1 \leq i \leq s$ (with $s = 0$ if $f_1 \neq 0$). The set $Y$ is completely regular if its distance partition is an equitable partition. The covering radius of $Y$ is the number of non-trivial eigenvalues in this equitable partition.
We limit ourselves to distance-regular graphs, which makes things easier. For example for $S_n$ all these definitions would be slightly more complicated (that is why I am not presenting them).
Our Example 4 has strength 0 and covering radius 1. Our Example 5 has strength 0 and covering radius 2.
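For a small graph the strength can be read off without an eigensolver: multiplying $f$ by $\prod (A - \lambda I)$ over all eigenvalues except $\lambda_i$ kills every component except $f_i$, so the product vanishes exactly when $f_i = 0$. A sketch in plain Python (names mine) for Example 4 in $J(5,2)$, whose eigenvalues are $6, 1, -2$; elements are 0-indexed, so the set consists of the 2-sets containing 0:

```python
from itertools import combinations

n, k = 5, 2
verts = [frozenset(c) for c in combinations(range(n), k)]
N = len(verts)
A = [[1 if len(v & w) == k - 1 else 0 for w in verts] for v in verts]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(N)) for i in range(N)]

def shifted(lam):  # the matrix A - lam * I
    return [[A[i][j] - (lam if i == j else 0) for j in range(N)] for i in range(N)]

f = [1 if 0 in v else 0 for v in verts]  # k-sets containing the element 0

# component f_1 (eigenvalue 1): kill the components for eigenvalues 6 and -2
g1 = mat_vec(shifted(6), mat_vec(shifted(-2), f))
# component f_2 (eigenvalue -2): kill the components for eigenvalues 6 and 1
g2 = mat_vec(shifted(6), mat_vec(shifted(1), f))

print(any(g1), any(g2))  # True False: f_1 != 0 but f_2 = 0, so strength 0
```

Since $f_1 \neq 0$ the strength is 0, and since $f_2 = 0$ the vector lies in $V_0 + V_1$ — a Boolean degree 1 function in the language used below.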
It is well known that a perfect code is an example of a completely regular set. In fact, Delsarte defined the notion of completely regular codes as a generalization of perfect codes. Perfect codes are very rare; for example, there are only three types of non-trivial perfect codes over finite fields, namely the Hamming codes and the two Golay codes. See Natalia Silberstein’s Master thesis for a nice argument for this.
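This is easy to observe for the $[7,4]$ Hamming code: it is perfect with minimum distance 3, and its distance partition in the hypercube graph on $\{0,1\}^7$ is equitable with covering radius 1. A sketch in plain Python (names mine):

```python
from itertools import product

# Parity-check matrix of the [7,4] Hamming code: column i is i+1 in binary
H = [[(i >> r) & 1 for i in range(1, 8)] for r in range(3)]

def syndrome(v):
    return tuple(sum(h * x for h, x in zip(row, v)) % 2 for row in H)

verts = list(product((0, 1), repeat=7))
code = {v for v in verts if syndrome(v) == (0, 0, 0)}

def neighbours(v):  # flip one coordinate: hypercube adjacency
    return [tuple(x ^ (1 if i == j else 0) for j, x in enumerate(v))
            for i in range(7)]

parts = [code, {v for v in verts if v not in code}]
B = []
for P in parts:
    rows = {tuple(sum(1 for w in neighbours(v) if w in Q) for Q in parts)
            for v in P}
    assert len(rows) == 1, "not equitable"
    B.append(list(rows.pop()))
print(len(code), B)  # 16 [[0, 7], [1, 6]]
```

Every codeword has all 7 neighbours outside the code, and every non-codeword has exactly one codeword neighbour — the quotient matrix $[[0,7],[1,6]]$ with eigenvalues $7$ and $-1$.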
Analysis of Boolean Functions
Up to now, we mostly thought of our set $Y$ as a vector $f$ with entries 0 and 1. Alternatively, we can think of $f$ as a function from the vertex set to $\{0, 1\}$. That is a Boolean function. Traditionally, the term analysis of Boolean functions is only used if one works in the hypercube, so with functions on $\{0,1\}^n$, but in recent years there has been some interest in transferring results to other highly symmetric association schemes (our introduction here provides a few references).
Warning: We work over the reals. In some areas of the analysis of Boolean functions — in particular bent functions — one works over the binary field.
Again, Degree and Eigenspaces
Recall that in $J(n,k)$ the Boolean function $x_p$ is defined by $x_p(S) = 1$ if and only if $p \in S$. Then we can write any function $f$ on the vertices as a polynomial in the variables $x_1, \ldots, x_n$. In other words, we have $f = \sum_T c_T \prod_{p \in T} x_p$. Now we can ask for all the Boolean functions of degree 1. So we have $f = c_0 + c_1 x_1 + \cdots + c_n x_n$ and the image of $f$ is in $\{0, 1\}$. But recall that above we learned that the $x_p$ span $V_0 + V_1$. Hence, Boolean degree 1 functions are just 0-1 vectors in the subspace $V_0 + V_1$.
Whenever we have some form of lattice where something can take a role similar to the elements of $\{1, \ldots, n\}$, we can define Boolean degree 1 functions (or, more generally, degree $s$ functions) in a similar way. For $J_q(n,k)$ the role of the elements is played by the 1-spaces. For $S_n$ the variables $x_{ab}$ indicate whether a permutation maps $a$ to $b$.
When using linear algebra methods, one often investigates the decomposition of $f$ into eigenvectors, that is, $f = f_0 + f_1 + \cdots + f_d$ as above. In the analysis of Boolean functions it is common to call this the Fourier expansion of $f$.
Different points of view are always helpful for approaching a research problem. In the following, two examples for this.
Consider the hypercube, that is the graph on all subsets of $\{1, \ldots, n\}$, where two subsets are adjacent if they differ in exactly one element. What are the Boolean degree 1 functions? Just the constants 0 and 1 and the dictators $x_p$ and $1 - x_p$. Seen as a degree 1 polynomial this is an easy exercise. Seen more generically as a 0-1 vector in $V_0 + V_1$, this is slightly harder to see.
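On the hypercube the multilinear expansion can be computed by Möbius inversion over subsets. A sketch in plain Python (names mine) that recovers the degree of a dictator and of the majority function on 3 bits:

```python
from itertools import combinations

def multilinear_coeffs(f, n):
    """Coefficients c_S of the unique multilinear polynomial
    sum_S c_S * prod_{i in S} x_i agreeing with f on {0,1}^n.
    Moebius inversion: c_S = sum_{T subseteq S} (-1)^{|S|-|T|} f(1_T)."""
    coeffs = {}
    for r in range(n + 1):
        for S in combinations(range(n), r):
            c = 0
            for s in range(r + 1):
                for T in combinations(S, s):
                    point = [1 if i in T else 0 for i in range(n)]
                    c += (-1) ** (r - s) * f(point)
            if c != 0:
                coeffs[S] = c
    return coeffs

def degree(f, n):
    return max((len(S) for S in multilinear_coeffs(f, n)), default=0)

dictator = lambda x: x[0]                     # Boolean degree 1
majority = lambda x: 1 if sum(x) >= 2 else 0  # majority of 3 bits
print(degree(dictator, 3), degree(majority, 3))  # 1 3
```

The majority function picks up the full-degree term $-2 x_0 x_1 x_2$, so it has degree 3, while the dictator is exactly $x_0$.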
Now what non-trivial properties does a Boolean degree 1 function $f$ on $J_q(n,k)$ have? A very straightforward linear algebra argument yields a divisibility condition on the size of $f$. When just writing $f$ as a degree 1 polynomial I do not know how to see this easily — except of course by emulating the linear algebra argument.
Warning: Boolean degree 1 functions and equitable partitions are not the same thing. The characteristic vector of an equitable partition can span more than one non-trivial eigenspace. Also, if the characteristic vector of a set lies in more than one non-trivial eigenspace, then there is no guarantee that it yields an equitable partition.
Permutation Groups and Cameron-Liebler Sets
Next let us explain the term Cameron-Liebler line class. Consider again $S_n$ acting canonically on $\{1, \ldots, n\}$. Any subgroup $H$ of $S_n$ partitions the $k$-sets of $\{1, \ldots, n\}$ into orbits. In particular, we have some number $A$ of orbits on 1-sets and some number $B$ of orbits on 2-sets. It is easy to see that $A \leq B$. But when do we have equality? It turns out that there are only two cases: either $H$ is transitive on 2-sets with $A = B = 1$, or $H$ is the stabiliser of a 1-set and $A = B = 2$. You might realize that the second example is the equitable partition from Example 4. That describes a Boolean degree 1 function. Hence, asking for equality asks for Boolean degree 1 functions which are induced by subgroups of $S_n$.
In 1982 Cameron and Liebler asked the same question for $J_q(n, 2)$. Here $A$ is the number of orbits of a subgroup of the projective general linear group on 1-spaces and $B$ the number of orbits on 2-spaces. Again, when do we have equality in $A \leq B$? There are a few obvious examples and Cameron and Liebler conjectured that these are all examples (more on this in the announced future post). Again, any example corresponds to a Boolean degree 1 function. There are surprisingly many non-obvious examples of Boolean degree 1 functions, and in the research in finite geometry around this topic the term Cameron-Liebler line class (or some variation of it) is well established.
In general, taking the orbits of a subgroup of the automorphism (or collineation) group of a graph is a natural way of obtaining equitable partitions. The keyword here is tactical decomposition (of (sub)group orbits). The process is very straightforward: suppose that you have a vertex-transitive graph $G$ with automorphism group $\mathrm{Aut}(G)$. Take any subgroup $H$ of $\mathrm{Aut}(G)$. Then the orbits of $H$ on the vertices define an equitable partition. This is a natural way of looking for equitable partitions with prescribed properties. Indeed, many of the known examples of non-trivial Boolean degree 1 functions in $J_q(n,k)$ were found by combining unions of orbits for a suitable subgroup $H$. This is called finding a tactical decomposition. There are many nice texts on tactical decompositions, for instance this one by Anton and Dieter Betten.
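The orbit recipe is easy to try out. A sketch in plain Python (names mine): take the cyclic subgroup of $S_5$ generated by the 5-cycle $(0\,1\,2\,3\,4)$; its two orbits on 2-subsets split $J(5,2)$ into five "consecutive" and five "non-consecutive" pairs, and the resulting partition is indeed equitable:

```python
from itertools import combinations

n, k = 5, 2
verts = [frozenset(c) for c in combinations(range(n), k)]
adj = {v: [w for w in verts if len(v & w) == k - 1] for v in verts}

def apply(g, v):  # action of a permutation (as a tuple) on a k-set
    return frozenset(g[i] for i in v)

# The cyclic subgroup H = <(0 1 2 3 4)> of S_5
cycle = (1, 2, 3, 4, 0)
H = [tuple(range(n))]
while True:
    g = tuple(cycle[i] for i in H[-1])
    if g == H[0]:
        break
    H.append(g)

# Orbits of H on the vertices
orbits, seen = [], set()
for v in verts:
    if v not in seen:
        orb = {apply(g, v) for g in H}
        orbits.append(orb)
        seen |= orb

B = []
for P in orbits:
    rows = {tuple(sum(1 for w in adj[v] if w in Q) for Q in orbits) for v in P}
    assert len(rows) == 1, "orbit partitions are always equitable"
    B.append(list(rows.pop()))
print(len(orbits), B)  # 2 [[2, 4], [4, 2]]
```

The quotient matrix $[[2,4],[4,2]]$ has eigenvalues $6$ (the degree of $J(5,2)$) and $-2$, so this orbit partition picks out a different eigenvalue than Example 4 did.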
For the sake of completeness, I want to mention that sometimes all the objects above are defined by using design-orthogonality. To my knowledge this concept goes (again!) back to Delsarte. This definition is very specialized. For example for $J(n,k)$ with $n$ a multiple of $k$, it goes as follows: let $g$ be the characteristic function of a partition of $\{1, \ldots, n\}$ into $k$-sets. A (Boolean) function $f$ has degree 1 if and only if it meets each image $g^h$ in a constant number of elements (for $h$ being an element of $S_n$ acting canonically).
Let $Y$ be the family of all vertices of $J(n,k)$ which contain 1. If $k$ divides $n$, then each partition of $\{1, \ldots, n\}$ into $k$-sets contains exactly one element of $Y$.
Finite Geometry: Tight Sets, Intriguing Sets, m-Ovoids …
Cameron-Liebler line class is not the only term for the same object in finite geometry. The collinearity graph $G$ of a finite classical polar space or a generalized quadrangle is strongly regular, which implies that the adjacency matrix has only three eigenspaces, so we can decompose $\mathbb{R}^V$ as $V_0 \perp V_1 \perp V_2$. A coclique of $G$ which satisfies the ratio bound (also called Hoffman’s bound) with equality (this is my obligatory reference to my WordPress username) lies in $V_0 \perp V_1$ (notice that it is debatable what the natural ordering of the eigenspaces is). This is an equitable partition, and an equitable partition in $V_0 \perp V_1$ is usually called an m-ovoid. Similarly, an equitable partition in $V_0 \perp V_2$ is called an x-tight set. The general umbrella term for both is intriguing set. De Bruyn and Suzuki define an intriguing set more generally as an equitable partition with two parts of any regular graph, so any 0-1 vector in the sum of $V_0$ and one further eigenspace.
Some Concluding Words
Originally, I planned on just writing a blog post about my paper with Yuval. Then I met Dirk Frettlöh during Kolkom 2018 recently. He gave a talk on equitable partitions of some small graphs. There I claimed to him that my article mentions some of the words which are used in specific settings to describe equitable partitions. Sadly, I realized still during the conference (looking at our article again) that we did not even mention the word equitable partition. This blog post is aimed at amending this situation. Also I am sure that many of the things above were rediscovered and investigated independently a few more times.
It is (at least for me) exhausting to cover all these terms and translate them correctly. I apologize for any imprecision. Please let me know about mistakes in the comments. This is a blog post, so I can easily fix things.
Acknowledgements: I thank John Bamberg for suggesting that I add the term “tactical decomposition”. Also thanks to Karen Meagher for pointing out that Example 3 is the transposition graph.