Six Spectral Bounds

I spent the last few days trying, in vain, to use several spectral arguments to bound the answers to certain intersection problems. For instance: what is the largest set of vectors in $ {\{ 0, 1 \}^4}$ pairwise at Hamming distance at most $ {2}$ (a problem solved by Kleitman and recently investigated by Huang, Klurman and Pohoata)? Here the answer is $ {5}$, and $ {0000}$, $ {0001}$, $ {0010}$, $ {0100}$, $ {1000}$ would be such an example. Or what is the largest set of vectors in $ {\{ 0, 1 \}^7}$ pairwise at distance $ {2}$ or $ {6}$? Here the answer is $ {8}$ and an example is $ {0000001}$, $ {0000010}$, $ {0000100}$, $ {0001000}$, $ {0010000}$, $ {0100000}$, $ {1000000}$, $ {1111111}$. This is a problem which I recently investigated together with Hajime Tanaka, extending work by Frankl and others.

Usually, this playing around does not lead to anything. But this time … it is actually the same. However, I did do one useful thing, which is the following: generously counting, I know six different easy spectral arguments which can be used to investigate these questions. This blog post presents these methods for the two problems mentioned above.

1. The Hypercube $ {\{ 0, 1 \}^4}$

Here is a picture of the $ {4}$-dimensional hypercube (missing a few edges, but I always feel that it is nice to have some picture in a post):

[Figure: the 4-dimensional hypercube]

We want to show that $ {0000}$, $ {0001}$, $ {0010}$, $ {0100}$, $ {1000}$ is an example of a largest family $ {Y}$ of vertices in this graph such that the vertices in $ {Y}$ are pairwise at distance at most $ {2}$. I guess that one can easily see this by playing around, as there are only $ {16}$ vertices, but we want to use (1) more generic methods which (2) use the spectrum of the adjacency matrix of the hypercube in some way.

Before we start, we define the following $ {2^4\times 2^4}$-matrices, all of them indexed by the elements of $ {\{ 0, 1 \}^4}$:

  • $ {A_0}$ is the identity matrix.
  • $ {A_1}$ has a $ {1}$ at position $ {(x, y)}$ if $ {x}$ and $ {y}$ have Hamming distance $ {1}$, and a $ {0}$ otherwise.
  • $ {A_2}$ has a $ {1}$ at position $ {(x, y)}$ if $ {x}$ and $ {y}$ have Hamming distance $ {2}$, and a $ {0}$ otherwise.
  • Similarly, we define $ {A_3}$ and $ {A_4}$: $ {A_j}$ has a $ {1}$ at position $ {(x, y)}$ if $ {x}$ and $ {y}$ have Hamming distance $ {j}$, and a $ {0}$ otherwise.

All of these matrices commute, so we know from our linear algebra lecture that we can diagonalize them simultaneously. It turns out that the matrices have five common eigenspaces $ {V_0, V_1, \ldots, V_4}$ (of course individual matrices can have fewer eigenspaces, for instance the only eigenspace of $ {A_0}$ is just $ {V_0+V_1+V_2+V_3+V_4}$, but all eigenspaces can be written as sums of $ {V_i}$s). We end up with the following table, where the rows are indexed by the $ {V_i}$s and the columns are indexed by the five $ {A_i}$s:

$ \displaystyle P = \begin{pmatrix}1 & 4 & 6 & 4 & 1\\ 1 & 2 & 0 & -2 & -1\\ 1 & 0 & -2 & 0 & 1\\ 1 & -2 & 0 & 2 & -1\\ 1 & -4 & 6 & -4 & 1\end{pmatrix} $

From here we see that $ {A_1}$ has the eigenvalues $ {4}$, $ {2}$, $ {0}$, $ {-2}$, and $ {-4}$. In particular, an eigenvector $ {v}$ of $ {A_1}$ in $ {V_1}$ satisfies $ {A_1 v = 2v}$, while we have $ {A_2 v = 0}$ and $ {A_4 v = -v}$.

We will also need the orthogonal projection $ {E_i}$ onto the eigenspace $ {V_i}$. Luckily for us, the hypercubes have very special eigenspaces and we have that $ {E_i = \frac{1}{2^4} \sum_{j=0}^4 P_{ji} A_j}$. I also have to mention that the dimension of $ {V_i}$ (or the rank of $ {E_i}$) is equal to $ {\binom{4}{i}}$.
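
If you want to verify the table $ {P}$, the projection formula, and the dimensions by computer, here is a small sketch in Python (assuming numpy is available; this is only a sanity check and not part of any of the arguments below):

```python
import itertools
from math import comb

import numpy as np

n = 4
verts = list(itertools.product((0, 1), repeat=n))

def dist(x, y):
    """Hamming distance of two 0/1 tuples."""
    return sum(a != b for a, b in zip(x, y))

# Distance matrices A_0, ..., A_4 of the 4-dimensional hypercube.
A = [np.array([[dist(x, y) == j for y in verts] for x in verts], dtype=float)
     for j in range(n + 1)]

# Eigenvalue table P from the post: row i, column j is the eigenvalue of A_j on V_i.
P = np.array([[1,  4,  6,  4,  1],
              [1,  2,  0, -2, -1],
              [1,  0, -2,  0,  1],
              [1, -2,  0,  2, -1],
              [1, -4,  6, -4,  1]], dtype=float)

# Projections E_i = 2^{-4} sum_j P_{ji} A_j onto the common eigenspaces V_i.
E = [sum(P[j, i] * A[j] for j in range(n + 1)) / 2**n for i in range(n + 1)]

for i in range(n + 1):
    assert np.allclose(E[i] @ E[i], E[i])                # E_i is idempotent
    assert np.isclose(np.trace(E[i]), comb(n, i))        # rank(E_i) = dim(V_i) = binom(4, i)
    for j in range(n + 1):
        assert np.allclose(A[j] @ E[i], P[i, j] * E[i])  # A_j acts on V_i as P_{ij}
print("all checks passed")
```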

The above is a short summary of some basic properties of so-called association schemes.

For the rest of this section let $ {Y}$ be a set of vertices of $ {\{0,1\}^4}$ such that all vertices are pairwise at (Hamming) distance $ {1}$ or $ {2}$.

1.1. Proof 1: The Inertia Bound

Recall Cauchy’s interlacing theorem:

Theorem 1 (Cauchy’s interlacing theorem) Let $ {A}$ be an $ {n \times n}$ symmetric matrix with eigenvalues $ {\lambda_1 \geq \lambda_2 \geq \ldots \geq \lambda_n}$. Let $ {B}$ be an $ {m \times m}$ principal submatrix of $ {A}$ with eigenvalues $ {\mu_1 \geq \mu_2 \geq \ldots \geq \mu_m}$. Then we have $ {\lambda_i \geq \mu_i \geq \lambda_{n-m+i}}$ for all $ {i}$.

We consider the matrix $ {A_{IB} := A_3-A_4}$. (Nothing is special about $ {A_3 - A_4}$. There are many other choices for $ {A_{IB}}$ and all the other matrices in this post; I just chose something that works for each particular argument.) Then the matrix $ {P}$ tells us that the eigenvalues of $ {A_{IB}}$ are $ {3}$, $ {-1}$, $ {-1}$, $ {3}$, $ {-5}$ (ordered by the order of the $ {V_i}$s). The dimension of $ {V_0}$ is $ {1}$ and the dimension of $ {V_3}$ is $ {4}$, so the eigenvalue $ {3}$ has multiplicity $ {5}$. We conclude that $ {A_{IB}}$ has only $ {5}$ non-negative eigenvalues.
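
For bookkeeping, here is the full spectrum of $ {A_{IB}}$ together with the multiplicities coming from the dimensions of the $ {V_i}$s (they sum to $ {16}$, as they should):

$ \displaystyle 3 \text{ with multiplicity } 1+4=5, \qquad -1 \text{ with multiplicity } 4+6=10, \qquad -5 \text{ with multiplicity } 1. $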

Next we look at the principal submatrix $ {B}$ of $ {A_{IB}}$ which is indexed by $ {Y}$. We chose $ {A_{IB}}$ such that the submatrix indexed by $ {Y}$ is the all-zeroes matrix (the distances $ {3}$ and $ {4}$ do not occur in $ {Y}$). Hence, the $ {|Y|}$ eigenvalues of $ {B}$ are all $ {0}$. By Cauchy’s interlacing theorem (applied with $ {A = A_{IB}}$ and $ {m = |Y|}$), we have $ {\lambda_{|Y|} \geq \mu_{|Y|} = 0}$, so $ {A_{IB}}$ has at least $ {|Y|}$ non-negative eigenvalues. Hence, $ {|Y| \leq 5}$.

This bound is called the inertia bound or Cvetković’s bound, see my previous post.
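
In general, the bound says the following: if $ {G}$ is a graph and $ {W}$ is a symmetric matrix with zero diagonal whose nonzero entries only occur in positions corresponding to edges of $ {G}$, and if $ {W}$ has $ {n_+}$ positive, $ {n_0}$ zero, and $ {n_-}$ negative eigenvalues, then every independent set of $ {G}$ has size at most

$ \displaystyle n_0 + \min(n_+, n_-). $

In the argument above, $ {Y}$ is an independent set of the graph whose edges are the pairs at distance $ {3}$ or $ {4}$, $ {W = A_{IB}}$, and $ {n_0 + \min(n_+, n_-) = 0 + 5 = 5}$.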

1.2. Proof 2: Another Multiplicity Bound

Again, we use a linear combination of matrices, but this time we use the $ {E_i}$s. Let $ {A_{GB} = 2^4(E_3 - E_4)}$. Notice that $ {A_{GB}}$ has rank $ {5}$ as $ {E_3}$ has rank $ {4}$ ($ {=\dim(V_3)}$) and $ {E_4}$ has rank $ {1}$ ($ {=\dim(V_4)}$). We can express $ {A_{GB}}$ as a linear combination of the $ {A_i}$s: $ {A_{GB} = 3A_0 - A_1 - A_2 + 3A_3 - 5A_4}$. Let $ {B}$ be the principal submatrix of $ {A_{GB}}$ indexed by $ {Y}$. Then $ {B = 4I - J}$: the diagonal entries are $ {3}$, and as only the distances $ {1}$ and $ {2}$ occur in $ {Y}$, all off-diagonal entries are $ {-1}$. This matrix has full rank unless $ {Y}$ has size $ {4}$; in that case, $ {B}$ has rank $ {|Y|-1}$. As $ {\text{rank}(B) \leq \text{rank}(A_{GB}) = 5}$ (and a set $ {Y}$ with $ {|Y| \geq 6}$ would give $ {\text{rank}(B) = |Y| > 5}$), we conclude that $ {|Y| \leq 5}$.
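
The rank claim for $ {B = 4I - J}$ can be read off its spectrum:

$ \displaystyle 4 - |Y| \text{ on the all-ones vector}, \qquad 4 \text{ with multiplicity } |Y|-1. $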

1.3. Proof 3: The Ratio Bound

To honor the name of my blog, here is a proof using the ratio bound (also called Hoffman’s bound). We will skip some details here as they are a bit off-topic.
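
For reference, the version of the bound used here (and again in Proof 4) is the following: if $ {G}$ is a $ {k}$-regular graph on $ {v}$ vertices whose adjacency matrix has smallest eigenvalue $ {\lambda_{\min} < 0}$, and $ {Y}$ is an independent set of $ {G}$, then

$ \displaystyle |Y| \leq \frac{v}{1 - k/\lambda_{\min}}. $

The same bound holds if the adjacency matrix is replaced by a symmetric matrix with non-negative weights on the edges of $ {G}$, zeroes elsewhere, and constant row sums $ {k}$; this is the freedom we will use in Proof 4.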

Consider the matrix $ {A_{RB} := A_3+A_4}$. As all distances within $ {Y}$ are $ {1}$ or $ {2}$, our set $ {Y}$ is an independent set of the graph with adjacency matrix $ {A_{RB}}$. The eigenvalues of $ {A_{RB}}$ are $ {5}$, $ {-3}$, $ {1}$, $ {1}$, and $ {-3}$. The ratio bound says that

$ \displaystyle |Y| \leq \frac{16}{1 - 5/(-3)} = 6. $

Furthermore, a more elaborate argument shows that equality occurs only if each element of $ {Y}$ is at distance $ {1}$ from exactly $ {2}$ vertices of $ {Y}$ and at distance $ {2}$ from the remaining $ {3}$ vertices of $ {Y}$. From this it is an easy puzzle to see that $ {|Y|=6}$ does not occur, so $ {|Y| \leq 5}$.

2. The Hypercube $ {\{ 0, 1 \}^7}$

Frankl investigated the maximal number of pairwise non-orthogonal vectors in $ {\{ -1, 1 \}^n}$. Of course these are just vectors in $ {\{ 0, 1 \}^n}$ which have pairwise Hamming distances different from $ {n/2}$, as two vectors $ {x, y \in \{ -1, 1 \}^n}$ at Hamming distance $ {d}$ satisfy $ {\langle x, y \rangle = n - 2d}$. Frankl noticed that one can reduce the problem to a different question on $ {\{ 0, 1 \}^{n-1}}$. For $ {n=8}$, we ask for a set $ {Y}$ in $ {\{ 0, 1 \}^7}$ with vertices pairwise at distance $ {2}$ or $ {6}$. It is easy to find such a set of size $ {8}$:

$ 1000000, 0100000, 0010000, 0001000, 0000100, 0000010, 0000001, 1111111.$

We will show that this is the best possible. Here is the analog of our matrix $ {P}$ for $ {n=7}$:

$ \displaystyle \tilde{P} = \begin{pmatrix}1 & 7 & 21 & 35 & 35 & 21 & 7 & 1\\ 1 & 5 & 9 & 5 & -5 & -9 & -5 & -1\\ 1 & 3 & 1 & -5 & -5 & 1 & 3 & 1\\ 1 & 1 & -3 & -3 & 3 & 3 & -1 & -1\\ 1 & -1 & -3 & 3 & 3 & -3 & -1 & 1\\ 1 & -3 & 1 & 5 & -5 & -1 & 3 & -1\\ 1 & -5 & 9 & -5 & -5 & 9 & -5 & 1\\ 1 & -7 & 21 & -35 & 35 & -21 & 7 & -1\end{pmatrix}. $

Again, we will need the orthogonal projections $ {E_i}$ onto the eigenspaces $ {V_i}$. The relationship is as before, so this time we have $ {E_i = \frac{1}{2^7} \sum_{j=0}^7 \tilde{P}_{ji} A_j}$. This time the dimension of $ {V_i}$ (or the rank of $ {E_i}$) is equal to $ {\binom{7}{i}}$.
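
In case you want to reproduce the table (this is a standard fact about the binary Hamming scheme): the entry $ {\tilde{P}_{ij}}$ is the value of the binary Krawtchouk polynomial of degree $ {j}$ at $ {i}$, that is,

$ \displaystyle \tilde{P}_{ij} = K_j(i) = \sum_{h=0}^{j} (-1)^h \binom{i}{h} \binom{7-i}{j-h}, $

and the matrix $ {P}$ from the first section is obtained in the same way with $ {7}$ replaced by $ {4}$.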

2.1. Proof 4: The Ratio Bound Again

First, let us mention that we can use the ratio bound again. Our $ {Y}$ is an independent set of the graph with adjacency matrix $ {A_{RB2} := A_1+A_3+A_4+A_5+A_7}$. The eigenvalues of $ {A_{RB2}}$ are $ {\tilde{P}_{i1}+\tilde{P}_{i3}+\tilde{P}_{i4}+\tilde{P}_{i5}+\tilde{P}_{i7}}$. If we apply the ratio bound directly, then we only get the terrible bound $ {|Y| \leq 29}$. But we are allowed to use weighted adjacency matrices! So let us take $ {A_{RB2}' := A_1+A_3+\frac{8}{5} A_4 + A_5+A_7}$. This is a matrix with eigenvalues $ {\tilde{P}_{i1}+\tilde{P}_{i3}+ \frac{8}{5} \tilde{P}_{i4}+\tilde{P}_{i5}+\tilde{P}_{i7}}$. The largest eigenvalue is $ {120}$, the smallest is $ {-8}$, and we have $ {128}$ vertices, so the ratio bound yields

$ \displaystyle |Y| \leq \frac{128}{1 - 120/(-8)} = 8. $
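
Here is a small Python sketch (again assuming numpy; purely a numerical sanity check, not part of the argument) which builds the weighted matrix and confirms the two extreme eigenvalues and the resulting bound:

```python
import itertools

import numpy as np

n = 7
verts = list(itertools.product((0, 1), repeat=n))

def dist(x, y):
    """Hamming distance of two 0/1 tuples."""
    return sum(a != b for a, b in zip(x, y))

def A(j):
    """Distance-j matrix of the 7-dimensional hypercube."""
    return np.array([[dist(x, y) == j for y in verts] for x in verts], dtype=float)

# Weighted adjacency matrix A_1 + A_3 + (8/5) A_4 + A_5 + A_7 from Proof 4.
W = A(1) + A(3) + 8 / 5 * A(4) + A(5) + A(7)

eigs = np.linalg.eigvalsh(W)
k, lam_min = eigs.max(), eigs.min()
print(k, lam_min)                      # should print 120.0 and -8.0 (up to rounding)
print(len(verts) / (1 - k / lam_min))  # the ratio bound, should print 8.0
```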

2.2. Proof 5: Yet Another Multiplicity Bound

Now we consider the following matrix: $ {A_{pMB} := 2^7(5E_0 + E_1)/4}$. We can rewrite this using the $ {A_i}$s as

$ \displaystyle A_{pMB} = 3 A_0 + \frac{5}{2} A_1 + 2A_2 + \frac{3}{2} A_3 + A_4 + \frac{1}{2} A_5 + 0 A_6 - \frac{1}{2} A_7. $
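
These coefficients drop straight out of the projection formula, using $ {\tilde{P}_{j0} = 1}$ and $ {\tilde{P}_{j1} = 7-2j}$:

$ \displaystyle A_{pMB} = \frac{1}{4} \sum_{j=0}^{7} \left( 5\tilde{P}_{j0} + \tilde{P}_{j1} \right) A_j = \sum_{j=0}^{7} \frac{5 + (7-2j)}{4} A_j = \sum_{j=0}^{7} \left( 3 - \frac{j}{2} \right) A_j. $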

Consider the submatrix $ {B}$ of $ {A_{pMB}}$ indexed by $ {Y}$. As the distances $ {1}$, $ {3}$, $ {4}$, $ {5}$, $ {7}$ do not occur in $ {Y}$, $ {B}$ is also a submatrix of

$ \displaystyle 3 A_0 + 2 A_2 + 0 A_6 $

which is congruent to the identity matrix modulo $ {2}$. Hence, $ {B}$ is congruent to an identity matrix modulo $ {2}$ as well, so $ {\det(B)}$ is odd and $ {B}$ has rank $ {|Y|}$. But we know (as in Proof 2) that $ {A_{pMB}}$ has rank at most $ {\text{rank}(E_0)+\text{rank}(E_1) = \dim(V_0) + \dim(V_1) = 8}$. Hence, $ {|Y| \leq 8}$.

2.3. Proof 6: One Last Multiplicity Bound

This proof does not really count as it will only show that $ {|Y| \leq 9}$. I included it because it works nicely in other cases.

If we limit ourselves to elements of even weight (which we may assume after translating $ {Y}$, as all pairwise distances in $ {Y}$ are even), then one can show that the $ {2}$-rank of $ {A_{MB3} := A_2 + A_6}$ is $ {8}$ (if we do not restrict ourselves to even-weight vectors, then the $ {2}$-rank is $ {16}$). I verified this by computer, but I am sure that there are more clever ways of doing so. Now let $ {B}$ be the submatrix of $ {A_{MB3}}$ indexed by $ {Y}$. Then

$ \displaystyle B = J - I. $

It is easy to see that the $ {p}$-rank of $ {B}$ is $ {|Y|-1}$ if $ {p}$ divides $ {|Y|-1}$ and $ {|Y|}$ otherwise. As the $ {2}$-rank of $ {B}$ is at most the $ {2}$-rank of $ {A_{MB3}}$, we conclude that $ {|Y| \leq 9}$.
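
As I only verified the $ {2}$-rank claim by computer, here is a sketch of such a check in Python (plain Python; the rows of the matrix are encoded as integer bitmasks and we do Gaussian elimination over GF(2)). The two printed numbers should reproduce the claimed ranks $ {8}$ and $ {16}$:

```python
import itertools

n = 7
verts = list(itertools.product((0, 1), repeat=n))

def dist(x, y):
    """Hamming distance of two 0/1 tuples."""
    return sum(a != b for a, b in zip(x, y))

def gf2_rank(rows):
    """Rank over GF(2); each row of the matrix is encoded as an integer bitmask."""
    rank = 0
    rows = list(rows)
    while rows:
        pivot = rows.pop()
        if pivot == 0:
            continue
        rank += 1
        lsb = pivot & -pivot
        rows = [r ^ pivot if r & lsb else r for r in rows]
    return rank

def rank_A2_plus_A6(vertex_set):
    """2-rank of A_2 + A_6 restricted to the given set of vertices."""
    return gf2_rank(
        sum(1 << j for j, y in enumerate(vertex_set) if dist(x, y) in (2, 6))
        for x in vertex_set
    )

even = [v for v in verts if sum(v) % 2 == 0]
print(rank_A2_plus_A6(even))   # the post claims 8
print(rank_A2_plus_A6(verts))  # the post claims 16
```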

3. Concluding Remarks

Let me conclude with some remarks. First of all I want to mention that this took me longer to write than I thought. There are several reasons for this, but one of the main ones is that I actually wanted to make sure that none of the stated arguments could be used for the stated problems with general $ {n}$. Except for Proof 6 (here I do not know the $ {p}$-ranks in general), I am sure of this.

Proof 1 generalizes, see the paper by Huang, Klurman and Pohoata. Proof 2 generalizes as well, see this paper by Frankl and this paper by Hajime Tanaka and me, but not to all $ {n}$: it fails for the first time at $ {n=23}$. I also used Proof 6 before in joint work with Peter Sin and Qing Xiang. But that might deserve its own post, as it uses an additional representation theory trick and lives in a more complicated graph related to polar spaces.

 
