Let \(V(G)=A\sqcup B\) be the bipartition of \(G\text{.}\) Since \(G\) is \(d\)-regular and \(d\geq1\text{,}\) counting edges from each side of the bipartition gives \(d|A|=e(G)=d|B|\text{,}\) so \(|A|=|B|=n/2\text{.}\) Choose a proper \(q\)-colouring \(f:V(G)\to [q]\) uniformly at random and, for each vertex \(v\text{,}\) let \(X_v:=f(v)\text{.}\) Let \(X=(X_v:v\in V(G))\text{.}\) Then
\begin{equation*}
\mathbb{H}(X)=\log_2(\operatorname{col}_q(G)).
\end{equation*}
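Indeed, \(X\) is uniform on the set of proper \(q\)-colourings, of which there are \(\operatorname{col}_q(G)\text{,}\) so
\begin{equation*}
\mathbb{H}(X)=-\sum_{f}\frac{1}{\operatorname{col}_q(G)}\log_2\frac{1}{\operatorname{col}_q(G)}=\log_2(\operatorname{col}_q(G)),
\end{equation*}
where the sum runs over all proper \(q\)-colourings \(f\text{.}\)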
For a set of vertices \(U\subseteq V(G)\text{,}\) write \(X_U\) for the random vector \((X_u:u\in U)\text{.}\) By the chain rule,
\begin{equation*}
\mathbb{H}(X)=\mathbb{H}(X_A)+\mathbb{H}(X_B\mid X_A).
\end{equation*}
We first bound \(\mathbb{H}(X_A)\text{.}\) Apply Shearer’s Lemma to the random variables \((X_a:a\in A)\) and to the collection of subsets
\begin{equation*}
\{N(v):v\in B\}.
\end{equation*}
Each vertex of \(A\) appears in exactly \(d\) of these neighbourhoods, counted with multiplicity, since distinct vertices of \(B\) may share a neighbourhood. Hence
\begin{equation*}
\mathbb{H}(X_A)\leq \frac{1}{d}\sum_{v\in B}\mathbb{H}(X_{N(v)}).
\end{equation*}
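To illustrate, in the \(4\)-cycle with parts \(A=\{a_1,a_2\}\) and \(B=\{b_1,b_2\}\) we have \(N(b_1)=N(b_2)=\{a_1,a_2\}\) and \(d=2\text{,}\) so the bound reads
\begin{equation*}
\mathbb{H}(X_A)\leq\frac{1}{2}\left(\mathbb{H}(X_{a_1},X_{a_2})+\mathbb{H}(X_{a_1},X_{a_2})\right)=\mathbb{H}(X_A),
\end{equation*}
which holds with equality.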
Next, we bound \(\mathbb{H}(X_B\mid X_A)\text{.}\) By subadditivity of conditional entropy and the fact that conditioning on more information cannot increase entropy (note that \(N(v)\subseteq A\) for every \(v\in B\)),
\begin{equation*}
\mathbb{H}(X_B\mid X_A)\leq \sum_{v\in B}\mathbb{H}(X_v\mid X_A)
\leq \sum_{v\in B}\mathbb{H}(X_v\mid X_{N(v)}).
\end{equation*}
Therefore
\begin{equation}
\mathbb{H}(X)\leq
\sum_{v\in B}
\left[
\frac{1}{d}\mathbb{H}(X_{N(v)})
+
\mathbb{H}(X_v\mid X_{N(v)})
\right].\tag{7.3.1}
\end{equation}
We would like to combine the two terms within the sum. For this, it is useful to introduce the random set
\begin{equation*}
S_v:=\{X_u:u\in N(v)\}
\end{equation*}
of colours that appear on \(N(v)\text{.}\) Since \(S_v\) is determined by \(X_{N(v)}\text{,}\) the pair \((X_{N(v)},S_v)\) contains exactly the same information as \(X_{N(v)}\text{.}\) Therefore, by the chain rule,
\begin{equation*}
\mathbb{H}(X_{N(v)})
=
\mathbb{H}(X_{N(v)},S_v)
=
\mathbb{H}(X_{N(v)}\mid S_v)+\mathbb{H}(S_v).
\end{equation*}
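The second equality is the chain rule; the first holds because \(S_v\) is a function of \(X_{N(v)}\text{,}\) so
\begin{equation*}
\mathbb{H}(X_{N(v)},S_v)=\mathbb{H}(X_{N(v)})+\mathbb{H}(S_v\mid X_{N(v)})=\mathbb{H}(X_{N(v)})+0.
\end{equation*}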
Also, since conditioning on more information cannot increase entropy,
\begin{equation*}
\mathbb{H}(X_v\mid X_{N(v)})\leq \mathbb{H}(X_v\mid S_v).
\end{equation*}
Substituting these two bounds into (7.3.1), we obtain
\begin{equation*}
\mathbb{H}(X)\leq
\sum_{v\in B}
\left[
\frac{1}{d}\left(\mathbb{H}(X_{N(v)}\mid S_v)+\mathbb{H}(S_v)\right)
+
\mathbb{H}(X_v\mid S_v)
\right].
\end{equation*}
Since \(S_v\) is a subset of \([q]\text{,}\) it takes at most \(2^q\) values, and hence
\begin{equation*}
\mathbb{H}(S_v)\leq \log_2(2^q)=q.
\end{equation*}
By the definition of conditional entropy, we have
\begin{equation*}
\mathbb{H}(X_{N(v)}\mid S_v)
=
\sum_{S\subseteq[q]}\mathbb{P}(S_v=S)\,
\mathbb{H}(X_{N(v)}\mid S_v=S)
\end{equation*}
and
\begin{equation*}
\mathbb{H}(X_v\mid S_v)
=
\sum_{S\subseteq[q]}\mathbb{P}(S_v=S)\,
\mathbb{H}(X_v\mid S_v=S).
\end{equation*}
Therefore
\begin{equation*}
\begin{aligned}
\mathbb{H}(X)
&\leq
\sum_{v\in B}
\left[
\frac{q}{d}
+
\sum_{S\subseteq[q]}
\left(
\frac{1}{d}\mathbb{H}(X_{N(v)}\mid S_v=S)
+
\mathbb{H}(X_v\mid S_v=S)
\right)
\mathbb{P}(S_v=S)
\right].
\end{aligned}
\end{equation*}
Now fix \(v\in B\) and \(S\subseteq[q]\text{.}\) If \(\mathbb{P}(S_v=S)=0\text{,}\) then the corresponding term contributes nothing, so suppose \(\mathbb{P}(S_v=S)>0\text{.}\) Let \(k=|S|\text{.}\)
Given that \(S_v=S\text{,}\) every vertex in \(N(v)\) receives a colour from \(S\text{,}\) so there are at most \(k^d\) possible values of \(X_{N(v)}\text{.}\)
Also, since \(f\) is a proper colouring, the colour \(X_v\) must lie in \([q]\setminus S\text{,}\) so there are at most \(q-k\) possible values of \(X_v\text{;}\) in particular, \(k<q\text{,}\) since \(v\) must receive some colour, so \(\log_2(q-k)\) is well defined.
Thus, by Lemma 7.1.4,
\begin{equation*}
\mathbb{H}(X_{N(v)}\mid S_v=S)\leq \log_2(k^d)
\end{equation*}
and
\begin{equation*}
\mathbb{H}(X_v\mid S_v=S)\leq \log_2(q-k).
\end{equation*}
It follows that
\begin{equation*}
\frac{1}{d}\mathbb{H}(X_{N(v)}\mid S_v=S)
+
\mathbb{H}(X_v\mid S_v=S)
\leq
\frac{1}{d}\log_2(k^d)+\log_2(q-k)
=
\log_2 k+\log_2(q-k)
=
\log_2(k(q-k)).
\end{equation*}
The quantity \(k(q-k)\) is maximized when \(k\) is as close as possible to \(q/2\text{.}\) Hence, for every possible value of \(S\text{,}\)
\begin{equation*}
k(q-k)\leq \left\lfloor q/2\right\rfloor\lceil q/2\rceil.
\end{equation*}
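For example, if \(q=5\) then \(k(5-k)\) takes the values \(4,6,6,4\) as \(k\) ranges over \(1,2,3,4\text{,}\) and the maximum is indeed \(6=\lfloor 5/2\rfloor\lceil 5/2\rceil\text{.}\)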
Therefore
\begin{equation*}
\begin{aligned}
\mathbb{H}(X)
&\leq
\sum_{v\in B}
\left[
\frac{q}{d}
+
\sum_{S\subseteq[q]}
\log_2\left(\lfloor q/2\rfloor\lceil q/2\rceil\right)
\mathbb{P}(S_v=S)
\right]\\
&=
\sum_{v\in B}
\left[
\frac{q}{d}
+
\log_2\left(\lfloor q/2\rfloor\lceil q/2\rceil\right)
\right]\\
&=
\frac{n}{2}
\left[
\frac{q}{d}
+
\log_2\left(\lfloor q/2\rfloor\lceil q/2\rceil\right)
\right].
\end{aligned}
\end{equation*}
Since \(\mathbb{H}(X)=\log_2(\operatorname{col}_q(G))\text{,}\) exponentiating gives
\begin{equation*}
\operatorname{col}_q(G)
\leq
2^{qn/(2d)}
\left(\lfloor q/2\rfloor\lceil q/2\rceil\right)^{n/2}.
\end{equation*}
This proves the stated bound.
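As a quick sanity check, separate from the proof, the bound can be tested numerically on a small instance. The sketch below brute-forces \(\operatorname{col}_3(C_4)\) for the \(4\)-cycle (bipartite and \(2\)-regular, so \(n=4\) and \(d=2\)) and compares it with \(2^{qn/(2d)}\left(\lfloor q/2\rfloor\lceil q/2\rceil\right)^{n/2}\text{;}\) the helper `count_colourings` is ours, not from the text.

```python
from itertools import product

def count_colourings(edges, n, q):
    """Brute-force count of proper q-colourings of a graph on vertices 0..n-1."""
    return sum(
        all(f[u] != f[v] for u, v in edges)  # f is proper iff no edge is monochromatic
        for f in product(range(q), repeat=n)
    )

# The 4-cycle C_4: bipartition {0, 2} and {1, 3}, 2-regular.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n, d, q = 4, 2, 3

actual = count_colourings(edges, n, q)
bound = 2 ** (q * n / (2 * d)) * ((q // 2) * ((q + 1) // 2)) ** (n / 2)

print(actual, bound)  # 18 32.0 -- the bound holds, with room to spare
```

The exact count \(18\) agrees with the chromatic polynomial of the cycle, \((q-1)^n+(q-1)(-1)^n=16+2\text{.}\)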