
\subsection{Tightness for Non-Convex Potentials} \label{TightnessSection}

Recall that we are looking for a measure which (in some sense) is the limit of the sequence $$\{\mu_N^x\}$$ of finite-box gradient Gibbs measures. By Prokhorov's theorem, if we can prove that the sequence $$\{\mu_N^x\}$$ is tight (with respect to a suitable topology), then there exist a subsequence $$\{\mu^x_{N_k}\}_{k = 1}^\infty$$ and a measure $$\mu^x$$ such that the subsequence converges weakly, $$\lim_{k \rightarrow \infty} \mu_{N_k}^x = \mu^x$$. Whilst the definition of $$f$$ does not directly require a limiting measure to exist, the proof of Lemma \ref{freeEnergyEq} relies heavily on such existence, in that we use the factorisation \eqref{Prod} of $$\mu^x$$. Throughout this section we speak of `showing tightness for the potential $$V$$', by which we mean showing tightness for the relevant sequence of Gibbs distributions. Our aim is to show that if a potential is bounded above and below by Gaussian potentials then the corresponding sequence of measures is tight. That is, we impose the following condition: there exist constants $$c_1, c_2, c_3, c_4 > 0$$ such that

\begin{equation}
\label{eq:GaussianBoundCondition}
-c_{1}+c_{2}\StandardArgument^{2}
\leq
V(\StandardArgument)
\leq
c_{3}+c_{4}\StandardArgument^{2}.
\end{equation}

Recall that a family of measures $$\mathcal{F}$$ on $$\mathbb{R}$$ is tight if, for any $$\epsilon > 0$$, there exists an $$L > 0$$ such that
\begin{equation*}
\sup_{\mu \in \mathcal{F}} \mu(|X|>L)<\epsilon.
\end{equation*}
For measures on $$\mathbb{R}^{N}$$ it is enough to show that the one-dimensional marginals satisfy this condition uniformly.
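As an illustration (our own example, not taken from the text), the non-convex two-well potential $$V(\StandardArgument)=\min\{(\StandardArgument-a)^2,(\StandardArgument+a)^2\}$$ with $$a>0$$ satisfies \eqref{eq:GaussianBoundCondition}: writing $$V(\StandardArgument)=\StandardArgument^{2}-2a|\StandardArgument|+a^{2}$$ and using $$2a|\StandardArgument|\leq \tfrac{1}{2}\StandardArgument^{2}+2a^{2}$$,
\begin{equation*}
-a^{2}+\tfrac{1}{2}\StandardArgument^{2}
\;\leq\;
V(\StandardArgument)
\;\leq\;
a^{2}+\StandardArgument^{2},
\end{equation*}
so one may take $$c_{1}=a^{2}$$, $$c_{2}=\tfrac{1}{2}$$, $$c_{3}=a^{2}$$, $$c_{4}=1$$.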

\begin{remark}\label{extraTightness} \normalfont

Whilst we prove tightness only for potentials satisfying this condition, we conjecture that it holds for any potential with sufficient growth in the tails, since tightness requires configurations with large gradients to carry little probability mass. Given the result of Lemma~\ref{TightLemma}, we conjecture that any potential with faster than Gaussian growth also produces a tight sequence of measures. It is also generally thought that super-linear growth is sufficient to guarantee tightness, an assumption we will make use of when we consider the double well potential in Section \ref{potentialExamples}.

\end{remark}

Note that if $$V$$ satisfies the upper and lower bounds in \eqref{eq:GaussianBoundCondition}, then the partition functions are bounded above and below by Gaussian integrals, and are therefore finite. Hence there are constants $$K_1, K_2 \in \mathbb R$$ such that for any potential $$V$$ satisfying \eqref{eq:GaussianBoundCondition}

\begin{equation}
\label{eq:BoundedPartitionFunction}
K_{1}
\leq \liminf_{N\to\infty}\frac{1}{N}\log{\PartitionNU}
\leq \limsup_{N\to\infty}\frac{1}{N}\log{\PartitionNU}
\leq
K_{2}.
\end{equation}

So it suffices to prove tightness for any $$V$$ satisfying this condition.
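To see where the Gaussian bounds enter (a heuristic one-bond computation, our own sketch; the true $$\PartitionNU$$ involves the full Hamiltonian), note that \eqref{eq:GaussianBoundCondition} gives $$e^{-c_{3}-c_{4}\StandardArgument^{2}}\leq e^{-V(\StandardArgument)}\leq e^{c_{1}-c_{2}\StandardArgument^{2}}$$, and hence
\begin{equation*}
e^{-c_{3}}\sqrt{\pi/c_{4}}
\;\leq\;
\int_{\mathbb{R}}e^{-V(\StandardArgument)}\,\mathrm{d}\StandardArgument
\;\leq\;
e^{c_{1}}\sqrt{\pi/c_{2}},
\end{equation*}
so each bond contributes a uniformly bounded amount to $$\frac{1}{N}\log\PartitionNU$$.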

\begin{lemma}\label{TightLemma}

Let $$V: \mathbb R \rightarrow \mathbb R$$ satisfy \eqref{eq:GaussianBoundCondition}; then there exist $$\epsilon,K>0$$ such that

\begin{equation}
\label{eq:SufficientForTightness}
\GibbsDistNU
\left(
\HelpingSquareSumInThightness
\geq
K
\right)
\leq e^{-\epsilon N},
\end{equation}

and subsequently $$\{\mu_N^x\}_{N \in \mathbb N}$$ is tight.

\end{lemma}

\begin{proof}

Before proving that \eqref{eq:SufficientForTightness} holds, we show that this condition is sufficient for tightness. For a given bond $$\BondA$$, translation invariance of $$\GibbsDistNU$$ allows us to rewrite $$\GibbsDistNU(|\StandardArgument_{\BondA}|\geq L)$$ as:

\begin{equation*}

\GibbsDistNU(|\StandardArgument_{\BondA}|\geq L)=

\Expectation_{ \GibbsDistNU}\left[

\Indicator_{|\StandardArgument_{\BondA}|\geq L}

\right]

=

\HelpingExpectInTightness,

\end{equation*}

where $$\Indicator_A$$ is the indicator function of an event $$A$$. Now for any $$\delta>0$$

\begin{align*}

\HelpingExpectInTightness=&

\Expectation_{ \GibbsDistNU}\left[\Indicator_{\HelpingIndicatorSumInTightnessL>\delta}\HelpingIndicatorSumInTightnessL \right]\\

&+

\Expectation_{ \GibbsDistNU}\left[\Indicator_{\HelpingIndicatorSumInTightnessL \leq \delta}\HelpingIndicatorSumInTightnessL \right].

\end{align*}

Since $$\HelpingIndicatorSumInTightnessL\leq 1$$ and
$$\Indicator_{|\StandardArgument_{\BondA}|\geq L}\leq \StandardArgument_{\BondA}^{2}/ L^{2}$$, we have that

\begin{equation*}

\HelpingExpectInTightness

\leq

\delta+

{ \GibbsDistNU}\left[\HelpingIndicatorSumInTightnessL>\delta \right]

\leq

\delta+

{\GibbsDistNU}\left[\frac{1}{N}\sum_{\Bond\in\TorusN}\StandardArgument_{\Bond}^{2}\geq \delta L^{2}\right].\end{equation*}

It follows that

\begin{equation*}

\GibbsDistNU(|\StandardArgument_{\BondA}|\geq L)\leq

\delta+

{\GibbsDistNU}\left[\frac{1}{N}\sum_{\Bond\in\TorusN}\StandardArgument_{\Bond}^{2}\geq \delta L^{2}\right],

\end{equation*}

and setting $$\delta=K / L^2$$,

\begin{equation*}
\GibbsDistNU(|\StandardArgument_{\BondA}|\geq L)\leq
\frac{K}{L^{2}}+e^{-\epsilon N}.
\end{equation*}

By adjusting $$L$$, one sees that the right-hand side can be made arbitrarily small, independently of $$N$$.
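To spell out the last step (a sketch, with $$\eta$$ and $$N_{0}$$ our own auxiliary notation): given $$\eta>0$$, choose $$L$$ and $$N_{0}$$ such that
\begin{equation*}
\frac{K}{L^{2}}\leq\frac{\eta}{2}
\qquad\text{and}\qquad
e^{-\epsilon N}\leq\frac{\eta}{2}\quad\text{for all } N\geq N_{0}.
\end{equation*}
Then $$\GibbsDistNU(|\StandardArgument_{\BondA}|\geq L)\leq\eta$$ for all $$N\geq N_{0}$$, and since each of the finitely many measures with $$N<N_{0}$$ is individually tight, $$L$$ may be enlarged so that the bound holds for every $$N$$.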

We now return to show \eqref{eq:SufficientForTightness}. For $$0<\delta< c_{2}/2$$, define

\begin{equation*}

\HamiltonianNUDelta(\HamiltionanArgument)=\HamiltonianNU(\HamiltionanArgument)-\delta\HelpingSquareSumInThightness.

\end{equation*}

Given

\begin{equation*}

\PartitionNUDelta=

\int e^{-\HamiltonianNU(\HamiltionanArgument)+\delta\HelpingSquareSumInThightness}

=\PartitionNU\int\frac{1}{\PartitionNU}e^{-\HamiltonianNU(\HamiltionanArgument)+\delta\HelpingSquareSumInThightness}

=\PartitionNU \Expectation_{ \GibbsDistNU}\left[ e^{\delta\HelpingSquareSumInThightness}\right],

\end{equation*}

it follows that

\begin{equation*}

\Expectation_{ \GibbsDistNU}\left[ e^{\delta\HelpingSquareSumInThightness}\right]=\frac{ \PartitionNUDelta}{\PartitionNU }.

\end{equation*}

Combining this with \eqref{eq:GaussianBoundCondition} and \eqref{eq:BoundedPartitionFunction} we obtain,

\begin{equation*}

\limsup_{N\to\infty} \frac{1}{N}\log \Expectation_{ \GibbsDistNU}\left[ e^{\delta\HelpingSquareSumInThightness}\right]\leq K_{3}(\delta),

\end{equation*}

where $$K_{3}(\delta)<\infty$$. Now, by Markov's inequality,

\begin{equation*}

{ \GibbsDistNU}\left[ {\delta\HelpingSquareSumInThightness}\geq \delta K\right]=

{ \GibbsDistNU}\left[ e^{\delta\HelpingSquareSumInThightness}\geq e^{\delta K}\right]

\leq \frac{\Expectation_{ \GibbsDistNU}\left[ e^{\delta\HelpingSquareSumInThightness}\right]}{ e^{\delta K}},

\end{equation*}

from which it follows that

\begin{equation*}

\limsup_{N\to\infty} \frac{1}{N}\log { \GibbsDistNU}\left[ {\delta\HelpingSquareSumInThightness}\geq \delta K\right]\leq -(K\delta-K_{3}(\delta)).

\end{equation*}

Equation \eqref{eq:SufficientForTightness} follows by choosing $$K$$ such that $$K > \delta^{-1} K_3(\delta)$$, and setting $$\epsilon = K \delta - K_3(\delta) > 0$$.
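For instance (one concrete choice among many, assuming $$K_{3}(\delta)>0$$; if $$K_{3}(\delta)\leq 0$$ then any $$K>0$$ works), taking $$K=2K_{3}(\delta)/\delta$$ gives
\begin{equation*}
\epsilon=K\delta-K_{3}(\delta)=2K_{3}(\delta)-K_{3}(\delta)=K_{3}(\delta)>0.
\end{equation*}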


\end{proof}
