Alexandre Rybko, Senya Shlosman, Alexander Vladimirov
Spontaneous Resonances and the Coherent States of the Queuing Networks
The theory of phase transitions provides, among many other results, a positive
answer to the question of whether reliable systems can be built
from unreliable elements. As an example, consider the infinite-volume
stochastic Ising model at a low temperature T, in dimension two or higher. As is well known, if we start this system from the configuration of
all pluses, then the evolution under Glauber dynamics has the property that
the fraction of plus spins will at all times exceed (1+m(T))/2, which is bigger than 1/2 for T below the critical temperature. (Here m(T) is the spontaneous magnetization.) On the
other hand, if we consider the finite-volume Ising model (with empty boundary
conditions, say), then this property does not hold: started from the all-plus
state, the system at some later (random) times will be found in a state
with the majority of the spins being minuses. Therefore the infinite system
can remember, to some extent, its initial state, while the finite system cannot.
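The finite-volume loss of memory is easy to observe numerically. The following is a minimal sketch (our own illustration, not part of the paper's model), running standard heat-bath Glauber dynamics on a small periodic lattice at a temperature below the critical one; the lattice size, temperature, and run length are illustrative choices. Started from all pluses, the system is eventually found with negative magnetization:

```python
import math
import random

def glauber_min_magnetization(L=4, T=2.0, sweeps=50000, seed=0):
    """Heat-bath Glauber dynamics for the 2D Ising model on an L x L
    torus, started from the all-plus configuration, at temperature T
    below T_c ~ 2.269.  Returns the minimum magnetization per spin
    observed along the run: for a finite system it eventually goes
    negative, whereas in infinite volume the plus-spin density would
    stay above (1 + m(T))/2 forever."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]
    m = L * L          # total magnetization, starts at +L^2
    m_min = m
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # local field: sum of the four neighboring spins (J = 1)
        h = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
             + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        # heat-bath rule: set the spin to +1 with prob 1/(1 + e^{-2h/T})
        new = 1 if rng.random() < 1.0 / (1.0 + math.exp(-2.0 * h / T)) else -1
        m += new - spin[i][j]
        spin[i][j] = new
        m_min = min(m_min, m)
    return m_min / (L * L)
```

On such a small lattice the equilibrium measure is symmetric under a global spin flip, so the dynamics visits negative-magnetization states after a finite (random) time; the infinite-volume system never does.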
There are many other examples of this kind in the theory of
interacting particle systems, such as the voter model, the contact process, etc. In all
these examples we see systems which are capable of "remembering"
their initial state for arbitrarily long times.
We construct a particle system which
"remembers its initial phase". A rough
analogy can be described as follows. Imagine a Brownian particle
f(t) with a unit drift, which lives on a circle.
Suppose the initial phase is f(0)=0. Then the mean phase
E(f(t)) = t mod 2*pi, but with time we know the phase f(t)
less and less precisely, since its variance grows, and in the limit
as t tends to infinity, the distribution of f(t) tends to
the uniform one. However, one can combine infinitely many such particles, by
introducing a suitable interaction between them, in such a way that the memory
of the initial phase will not vanish and will persist in time.
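The spreading of the phase can be checked in a short simulation. The sketch below is our own illustration: at a fixed time, f(t) = t + W(t) with W a standard Brownian motion can be sampled exactly, and the coherence |E exp(i f(t))|, which equals exp(-t/2) for unit diffusion, is estimated over many independent particles. It is close to 1 at small times and tends to 0 as the law of f(t) approaches the uniform distribution on the circle:

```python
import math
import random

def sample_phase(t, rng):
    """Exact sample of f(t) = t + W(t) on the circle, where W is a
    standard Brownian motion, so f(t) ~ N(t, t), taken mod 2*pi."""
    return (t + math.sqrt(t) * rng.gauss(0.0, 1.0)) % (2 * math.pi)

def coherence(t, n=20000, seed=0):
    """Monte Carlo estimate of |E[exp(i f(t))]| over n independent
    particles: ~1 while the phase is still sharp, -> 0 as the
    distribution of f(t) becomes uniform.  Theoretical value: e^{-t/2}."""
    rng = random.Random(seed)
    phases = [sample_phase(t, rng) for _ in range(n)]
    c = sum(math.cos(f) for f in phases) / n
    s = sum(math.sin(f) for f in phases) / n
    return math.hypot(c, s)
```

A single non-interacting particle thus forgets its phase exponentially fast; the point of the construction in the paper is that a suitable interaction between infinitely many such particles keeps the coherence bounded away from zero.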
This is roughly what we will do in the present paper. We will consider a
network of simple servers which process messages. Since the service
time of every message is random, in the course of time each single server
loses the memory of its initial state. So, in particular, a network of
non-interacting servers, started in the same state, would become
de-synchronized after a finite time. However, if one introduces certain
natural interconnections between the servers, then it can happen that they
stay synchronized for arbitrarily long times, thus violating some
generally believed properties of large networks. We have to add here that such
a phenomenon is possible only if the mean number of particles per server is
high enough; otherwise the infinite network becomes de-synchronized, no matter
what interaction between the servers takes place. So the parameter of the mean
number of particles per server, called hereafter the load, plays here
the same role as the temperature in statistical mechanics.
In other words, the transition we describe happens due to the fact that at low
load the behavior of our system is governed by the fixed point of the
underlying dynamical system, while at high load the dominant role is played by
its periodic attractor. A similar phenomenon was described by Hepp and Lieb.
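The transition from a fixed point to a periodic attractor as a parameter grows can be illustrated on a generic toy example (our own choice, not the paper's dynamical system): the Hopf normal form r' = r*(mu - r^2), theta' = 1, with mu playing the role of the load. For mu <= 0 every trajectory is attracted to the fixed point r = 0, while for mu > 0 the attractor is the limit cycle of radius sqrt(mu):

```python
def attractor_radius(mu, r0=0.1, dt=0.01, steps=20000):
    """Euler-integrate the radial part  r' = r*(mu - r^2)  of the Hopf
    normal form (the angular part theta' = 1 decouples) and return the
    radius on which the trajectory settles: 0 for mu <= 0 (stable fixed
    point), sqrt(mu) for mu > 0 (periodic attractor / limit cycle)."""
    r = r0
    for _ in range(steps):
        r += dt * r * (mu - r * r)
    return r
```

The radial equation has exact fixed points at r = 0 and r = sqrt(mu), so the Euler iteration converges to the true attractor radius for these step sizes.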
Below we present the simplest example of the above behavior, but we believe
that the phenomenon we describe is fairly general. Its origin lies in the fact
that any large network of general type possesses, in the infinite-volume limit,
some kind of continuous symmetry, and it is the breaking of that symmetry
at high load which causes the long-range-order behavior of the network.