Let \(G\) be a bipartite graph with bipartition \((A,B)\text{.}\) Let \(M\) be a perfect matching in \(G\) chosen uniformly at random from the set of all perfect matchings of \(G\text{;}\) for this to make sense, we assume that \(G\) has at least one perfect matching. Since \(M\) is uniformly distributed, Lemma 7.1.4 tells us that the entropy of \(M\) equals the logarithm of the cardinality of its range, which is the set of all perfect matchings of \(G\text{.}\) Therefore,
\begin{equation*}
\mathbb{H}(M) = \log_2(\permat(G))\text{.}
\end{equation*}
So, proving an upper bound on \(\permat(G)\) is equivalent to proving an upper bound on \(\mathbb{H}(M)\text{;}\) we focus on the latter.
For each \(v\in A\text{,}\) let \(X_v\) be equal to the vertex \(u\in B\) such that \(vu\in M\text{.}\) Note that the joint random variable
\begin{equation*}
X=(X_v: v\in A)
\end{equation*}
is equivalent to \(M\text{;}\) that is, each of \(X\) and \(M\) determines the other. Therefore, \(\mathbb{H}(X)=\mathbb{H}(M)=\log_2(\permat(G))\text{,}\) and we can focus on bounding the entropy of \(X\text{.}\) For each \(u\in B\text{,}\) we similarly define \(Y_u\) to be the vertex of \(A\) such that \(Y_uu\in M\text{.}\)
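As a quick illustration of this correspondence, consider \(G=K_{2,2}\) with \(A=\{a_1,a_2\}\) and \(B=\{b_1,b_2\}\text{.}\) Here \(\permat(G)=2\text{,}\) and the two perfect matchings correspond to the two outcomes
\begin{equation*}
X=(X_{a_1},X_{a_2})=(b_1,b_2)\quad\text{and}\quad X=(b_2,b_1)\text{,}
\end{equation*}
each occurring with probability \(1/2\text{,}\) so \(\mathbb{H}(X)=\log_2(2)=1\text{.}\) We will return to this example to check the estimates below.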
For any total order
\(\prec\) on
\(A\text{,}\) we have, by
Lemma 7.1.9,
\begin{equation}
\mathbb{H}(X) = \sum_{v\in A}\mathbb{H}(X_v\mid X_w: w\prec v)\text{.}\tag{7.3.1}
\end{equation}
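In the \(K_{2,2}\) example with the order \(a_1\prec a_2\text{,}\) this reads
\begin{equation*}
\mathbb{H}(X)=\mathbb{H}(X_{a_1})+\mathbb{H}(X_{a_2}\mid X_{a_1})=1+0=1\text{,}
\end{equation*}
since \(X_{a_1}\) is uniform on \(\{b_1,b_2\}\) and \(X_{a_2}\) is determined by \(X_{a_1}\text{.}\)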
Given a vertex \(v\in A\) and a total order \(\prec\) on \(A\text{,}\) we define the following random variable
\begin{equation*}
R_{v}^\prec:=\left|N(v)\setminus \{u\in B: Y_u\prec v\}\right|\text{.}
\end{equation*}
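For example, if \(v\) is the \(\prec\)-smallest element of \(A\text{,}\) then no vertex \(u\in B\) satisfies \(Y_u\prec v\) and so \(R_v^\prec=|N(v)|=d(v)\text{;}\) if \(v\) is the \(\prec\)-largest element, then every vertex of \(B\) except \(X_v\) is matched to a vertex preceding \(v\text{,}\) and so
\begin{equation*}
R_v^\prec=\left|N(v)\cap\{X_v\}\right|=1\text{.}
\end{equation*}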
Note that
\(R_{v}^\prec\) is determined by
\((X_w: w\prec v)\text{.}\) So, by
Lemmas 7.1.12 and 7.1.11, together with the definition of conditional entropy,
\begin{equation*}
\mathbb{H}(X_v\mid X_w: w\prec v)= \mathbb{H}(X_v\mid (X_w: w\prec v), R_{v}^\prec)\leq \mathbb{H}(X_v\mid R_{v}^\prec)=\sum_{k=1}^{d(v)}\mathbb{P}(R_v^\prec=k)\mathbb{H}(X_v\mid R_v^\prec=k)\text{.}
\end{equation*}
When \(R_v^\prec=k\text{,}\) the vertex \(X_v\) lies in the set \(N(v)\setminus \{u\in B: Y_u\prec v\}\) of size \(k\text{,}\) since \(Y_{X_v}=v\not\prec v\text{;}\) in particular, this set is never empty and so \(R_v^\prec\geq 1\text{.}\) Thus there are at most \(k\) possible values for \(X_v\) and so, by Lemma 7.1.4, we have
\(\mathbb{H}(X_v\mid R_v^\prec=k)\leq \log_2(k)\text{.}\) Therefore,
\begin{equation*}
\mathbb{H}(X_v\mid X_w: w\prec v)\leq \sum_{k=1}^{d(v)}\mathbb{P}(R_v^\prec=k)\log_2(k)\text{.}
\end{equation*}
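In the \(K_{2,2}\) example with \(a_1\prec a_2\text{,}\) we have \(R_{a_1}^\prec=2\) and \(R_{a_2}^\prec=1\) with probability one, so these bounds give
\begin{equation*}
\mathbb{H}(X_{a_1})\leq \log_2(2)=1\qquad\text{and}\qquad \mathbb{H}(X_{a_2}\mid X_{a_1})\leq \log_2(1)=0\text{,}
\end{equation*}
in agreement with the exact values computed earlier.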
Plugging this back into equation (7.3.1), which holds for every total order \(\prec\) on \(A\text{,}\) and averaging over all \(n!\) such orders, where \(n=|A|\text{,}\) yields
\begin{align*}
\mathbb{H}(X)&\leq\frac{1}{n!}\sum_{\prec}\sum_{v\in A}\sum_{k=1}^{d(v)}\mathbb{P}(R_v^\prec=k)\log_2(k)=\frac{1}{n!}\sum_{v\in A}\sum_{k=1}^{d(v)}\log_2(k)\sum_{\prec}\mathbb{P}(R_v^\prec=k)\\
&=\frac{1}{n!}\sum_{v\in A}\sum_{k=1}^{d(v)}\log_2(k)\sum_{\prec}\sum_{M}\frac{1_{\{R_v^\prec=k\}}}{\permat(G)}\\
&=\frac{1}{\permat(G)}\sum_{v\in A}\sum_{k=1}^{d(v)}\log_2(k)\sum_{M}\sum_{\prec}\frac{1_{\{R_v^\prec=k\}}}{n!}\\
&=\frac{1}{\permat(G)}\sum_{v\in A}\sum_{k=1}^{d(v)}\log_2(k)\sum_{M}\frac{1}{d(v)}\\
&=\sum_{v\in A}\sum_{k=1}^{d(v)}\frac{\log_2(k)}{d(v)}
=\sum_{v\in A}\frac{\log_2(d(v)!)}{d(v)}\text{.}
\end{align*}
Here the key step is the equality \(\sum_{\prec}\frac{1_{\{R_v^\prec=k\}}}{n!}=\frac{1}{d(v)}\text{,}\) valid for every fixed perfect matching \(M\text{,}\) every \(v\in A\) and every \(k\in\{1,\dots,d(v)\}\text{.}\) To see this, note that the \(d(v)\) vertices \(Y_u\) with \(u\in N(v)\) are distinct, and one of them is \(v\) itself since \(Y_{X_v}=v\text{.}\) Under a uniformly random total order \(\prec\) on \(A\text{,}\) the position of \(v\) within this set of \(d(v)\) vertices is equally likely to be any of the \(d(v)\) possibilities, and \(R_v^\prec\) is exactly the number of these vertices that do not precede \(v\text{.}\) Hence \(R_v^\prec\) takes each value in \(\{1,\dots,d(v)\}\) for exactly a \(\frac{1}{d(v)}\) fraction of the \(n!\) orders.
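In the \(K_{2,2}\) example, both vertices of \(A\) have degree \(2\text{,}\) so the right-hand side equals \(2\cdot\frac{\log_2(2!)}{2}=1\text{,}\) matching \(\mathbb{H}(X)=1\) exactly.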
Since \(\mathbb{H}(X)=\log_2(\permat(G))\text{,}\) it follows that
\begin{equation*}
\permat(G)\leq \prod_{v\in A}(d(v)!)^{1/d(v)}\text{.}
\end{equation*}
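This bound is attained with equality when \(G\) is a disjoint union of complete bipartite graphs: if \(d\) divides \(n\) and \(G\) consists of \(n/d\) vertex-disjoint copies of \(K_{d,d}\text{,}\) then every \(v\in A\) has \(d(v)=d\) and each copy contributes a factor of \(d!\text{,}\) so
\begin{equation*}
\permat(G)=(d!)^{n/d}=\prod_{v\in A}(d(v)!)^{1/d(v)}\text{.}
\end{equation*}
The case \(n=d=2\) is the \(K_{2,2}\) example above.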