In the static flooding problem, we are given a graph $G=(V,E)$ in which one node $v\in V$ wants to disseminate a message to all other nodes of $G$. We consider this problem in the distributed dynamic setting.
It is known that flooding requires $\Omega(n)$ rounds to complete in the worst case on static graphs (consider the path graph). A tighter parameterization is by the static graph's diameter $D$, since $D\leq n-1$ always holds; in terms of $D$, flooding takes $\Theta(D)$ rounds.
However, in the dynamic graph setting we can construct a dynamic graph $\mathcal{G}$ whose diameter is constant in every round, yet flooding still requires $\Omega(n)$ rounds to complete. So we can no longer parameterize the round complexity in terms of $D$.
We will aim to give a smoothed lower bound for flooding, where the proof presented here is taken from [Din+15].
Section 1: The spooling graph.
We want a dynamic graph where even though the diameter is constant, flooding requires $\Omega(n)$ rounds to complete. Such a graph is called the spooling graph.
For $1\leq r\leq n-1$, the $r$-spooling graph consists of a left star on nodes $\{1, ..., r\}$ centered at $r$, a right star on nodes $\{r+1, ..., n\}$ centered at $r+1$, and the bridge edge $\{r, r+1\}$ connecting the two stars. The (dynamic) spooling graph is the dynamic graph whose round-$r$ snapshot is the $r$-spooling graph. Clearly this graph has constant diameter (at most $3$) in every round.
Consider the case where node $1$ starts with the message to be flooded. In round $r$, the only informed node adjacent to an uninformed node is the bridge endpoint $r$, so only node $r+1$ learns the message. Hence we need $\Omega(n)$ rounds for flooding to complete, since only one node learns the message per round.
[Figure: the first 3 rounds of flooding from node $1$ on the spooling graph.]
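This behaviour is easy to check by simulation. Below is a minimal Python sketch (the function names are my own) that floods from node $1$ on the dynamic spooling graph, playing round $r$ on the $r$-spooling graph, and counts rounds until every node is informed:

```python
def spooling_edges(r, n):
    """Edges of the r-spooling graph on nodes 1..n: a left star on
    {1..r} centered at r, a right star on {r+1..n} centered at r+1,
    and the bridge edge {r, r+1}."""
    edges = {(i, r) for i in range(1, r)}               # left star
    edges |= {(r + 1, j) for j in range(r + 2, n + 1)}  # right star
    edges.add((r, r + 1))                               # bridge
    return edges

def flood_rounds(n):
    """Rounds until flooding from node 1 reaches all n nodes."""
    informed, rounds = {1}, 0
    while len(informed) < n:
        rounds += 1
        edges = spooling_edges(min(rounds, n - 1), n)
        # each informed node forwards the message one hop per round
        newly = set()
        for u, v in edges:
            if u in informed and v not in informed:
                newly.add(v)
            elif v in informed and u not in informed:
                newly.add(u)
        informed |= newly
    return rounds
```

On the spooling graph this returns exactly $n-1$: in round $r$ the bridge is the only edge from an informed to an uninformed node, so `flood_rounds(100)` gives `99`.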
Our task is to determine what happens to this $\Omega(n)$ lower bound when the worst-case dynamic graph is smoothed: we will show that a (weaker) lower bound still holds with constant probability.
Section 2: The smoothed lower bound.
For our lower bound, we assume that the process of $k$-smoothing can only add edges. This is because added edges can only speed flooding up: if we need $\Omega(R)$ rounds when smoothing only adds edges, then we certainly need $\Omega(R)$ rounds when smoothing may both add and remove edges.
We want to show that, given a smoothed spooling graph, flooding needs at least $R$ rounds to complete with some constant probability. Then we can say that flooding has an $\Omega(R)$ smoothed lower bound. For flooding to last at least $R$ rounds, we want to show that it is likely that during the first $R-1$ rounds at least one node does not yet receive the message. A node $v$ can learn the flooded message in round $r$ in one of the following ways:
1. $v$ is numbered $r+1$ and learns the message from the node numbered $r$ over the bridge edge,
2. $v$ is in the right star and there is a smoothed edge between $v$ and some node that already knows the message,
3. $v$ is in the right star and the right-star center $r+1$ already knows the message.
If case $(1)$ occurs, we don't really mind, since only a single node learns the message that round. This cannot be avoided, and the nodes numbered $2, \dots, R$ will learn the message this way over the first $R-1$ rounds.
On the other hand, if case $(3)$ occurs even once, say in round $r$, then we know that by the end of round $r$ all nodes will have learned the message. Thus case $(3)$ is said to fail if it occurs even once.
If case $(2)$ happens too often, then we risk all nodes in the right star learning the message. Thus we say that case $(2)$ fails if more than $R$ nodes learn the message this way. We must be careful to eventually pick an $R$ which satisfies $2R<n$ for sufficiently large $n$, so that if case $(2)$ does not fail, at least one node has not yet received the message.
Let's try to upper bound the probability that $(3)$ happens in the first $R-1$ rounds.
Observation 1: For case $(3)$ to happen, some node numbered $x\in[1, R]$ must learn the message early (i.e. in a round $r<x-1$) via a smoothed edge.
Consider each round $r<R$ in which cases $(2)$ and $(3)$ have not yet failed. Define $S$ to be the set of potential smoothed edges between a node that knows the message and a node numbered in $[r+2, R]$. Since at most $2R$ nodes know the message, $|S|=O(R^2)$.
We use the following lemma (discussed in a previous blog post) to bound the probability that a smoothed edge as described in Observation 1 is added.
Lemma 1: There exists constant $c_3 > 0$ such that the following holds. Consider any graph $G = (V, E)\in\text{allowed}(\mathcal{G}_{\text{conn}})$. Consider also any set $S$ of edges and smoothing value $k \leq n/16$ such that $S \cap E = \emptyset$. Then with probability at most $c_3k|S|/n^2$ , the $k$-smoothed graph $G'$ of $G$ contains an edge from $S$.
Thus, the probability of adding an edge between a node with the message and a node $x$ as described in Observation 1 is at most $c_3k|S|/n^2=O(kR^2/n^2)$. Taking a union bound over the $R-1$ rounds, the probability of case $(3)$ occurring is $O(kR^3/n^2)$. If we let $R=\alpha n^{2/3}/k^{1/3}$ for some $\alpha\leq 1$, this becomes $O(\alpha^3)$, which for sufficiently small $\alpha$ is at most $1/4$.
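Spelling out the substitution for clarity:

$$\frac{kR^3}{n^2} = \frac{k}{n^2}\cdot\frac{\alpha^3 n^2}{k} = \alpha^3.$$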
Note also that for sufficiently large $n$, we have $2R=2\alpha n^{2/3}/k^{1/3} < n$, which was mentioned above to be necessary.
Now we want to upper bound the probability that case $(2)$ fails, i.e. that more than $R$ nodes learn the message by case $(2)$. Once we have the expected number of nodes learning the message by case $(2)$ over the $R-1$ rounds, Markov's inequality gives an upper bound on the probability of case $(2)$ failing.
Consider each round $r<R$ where cases $(2)$ and $(3)$ have not yet failed, so at most $2R$ nodes know the message (at most $R$ by case $(1)$ and at most $R$ by case $(2)$). Fix a node $v$ in the right star and let $S=\{\{u, v\} \mid u \text{ knows the message}\}$, so $|S|\leq 2R$. By Lemma 1, the probability that node $v$ learns the message by case $(2)$ in round $r$ is at most $c_3k(2R)/n^2=O(kR/n^2)$. Now we introduce the following indicator variable.
$$X_u = \begin{cases} 1 & \text{if } u \text{ is in the right star and learns the message by case } (2) \text{ in round } r, \\ 0 & \text{otherwise.} \end{cases}$$
Then the expected number of nodes in the right star which will learn the message by case $(2)$ in round $r$ is given by
$$\mathbb{E}\Big[\sum_{u\in\text{ right star}}{X_u}\Big] = \sum_{u\in\text{ right star}}{\mathbb{E}[X_u]} \leq O(n)\cdot O(kR/n^2)=O(kR/n)$$
Summing over the $R-1$ rounds and letting $R=\alpha n^{2/3}/k^{1/3}$, we have $O(kR^2/n)=O(\alpha^2 k^{1/3}n^{1/3})=O(\alpha R)$ expected nodes learning the message by case $(2)$, where the last bound holds for $k=O(\sqrt{n})$.
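Writing out this substitution step by step (the inequality uses $k\leq\sqrt{n}$, i.e. $k^{2/3}\leq n^{1/3}$, and the final step uses $\alpha\leq 1$):

$$\frac{kR^2}{n} = \frac{k}{n}\cdot\frac{\alpha^2 n^{4/3}}{k^{2/3}} = \alpha^2 k^{1/3} n^{1/3} \leq \alpha^2\frac{n^{2/3}}{k^{1/3}} = \alpha^2 R \leq \alpha R.$$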
For sufficiently small $\alpha$, the expected total number of nodes that learn the message by case $(2)$ is at most $R/4$. We can now apply Markov's inequality.
Markov's Inequality: Let $X$ be a non-negative random variable. For every $a>0$ we have
$$\mathbb{P}(X\geq a) \leq \frac{\mathbb{E}[X]}{a}$$
Hence we have
$$\mathbb{P}(\text{number of nodes that learn the message by case } (2)\geq R) \leq \frac{\mathbb{E}[\text{number of nodes that learn the message by case } (2)]}{R} \leq \frac{R/4}{R}=\frac{1}{4}$$
Thus we fail case $(2)$ with probability at most $1/4$.
Putting everything together, by a union bound the probability of failing either case $(2)$ or case $(3)$ is at most $1/4+1/4=1/2$. Therefore, with probability at least $1/2$ both cases pass, and at least one node is still uninformed after $R-1$ rounds.
Theorem: Given a $k$-smoothed $n$-vertex spooling graph, with $k \leq \sqrt{n}$ and sufficiently large $n$, with probability at least $1/2$ flooding does not complete before round $\Omega(n^{2/3}/k^{1/3})$.
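As an illustrative sanity check (not part of the proof), one can simulate flooding under a crude approximation of $k$-smoothing in which $k$ uniformly random edges are added to each round's graph; the actual smoothing model of [Din+15] differs, so this sketch is only suggestive:

```python
import random

def smoothed_flood_rounds(n, k, seed=0):
    """Rounds for flooding from node 1 on the dynamic spooling graph
    when, as a crude stand-in for k-smoothing, k uniformly random
    edges are added each round (add-only smoothing)."""
    rng = random.Random(seed)
    informed, rounds = {1}, 0
    while len(informed) < n:
        rounds += 1
        r = min(rounds, n - 1)
        # r-spooling graph: left star, right star, and the bridge
        edges = {(i, r) for i in range(1, r)}
        edges |= {(r + 1, j) for j in range(r + 2, n + 1)}
        edges.add((r, r + 1))
        for _ in range(k):  # the "smoothed" random edges
            edges.add(tuple(rng.sample(range(1, n + 1), 2)))
        newly = set()
        for u, v in edges:
            if u in informed and v not in informed:
                newly.add(v)
            elif v in informed and u not in informed:
                newly.add(u)
        informed |= newly
    return rounds
```

With $k=0$ this reduces to plain flooding and takes exactly $n-1$ rounds; any $k>0$ can only finish sooner, since edges are only ever added.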
[Din+15] Michael Dinitz et al. "Smoothed Analysis of Dynamic Networks". In: Proceedings of the 29th International Symposium on Distributed Computing - Volume 9363. DISC 2015. Tokyo, Japan: Springer-Verlag, 2015, pp. 513-527. isbn: 9783662486528. doi: 10.1007/978-3-662-48653-5_34. url: https://doi.org/10.1007/978-3-662-48653-5_34.