Converse of Chernoff bound

Given $n$ independent random variables $X_1, X_2, \ldots, X_n$ taking values in the range $0$ to $1$, the Chernoff bound can be stated as follows. Define the random variable $X = X_1 + X_2 + \ldots + X_n - \langle X_1 \rangle - \langle X_2 \rangle - \ldots - \langle X_n \rangle$. Then the moment generating function satisfies $\langle e^{tX} \rangle \leq e^{c \cdot n t^2}$ for all $t > 0$, where $c$ is a constant. Here $\langle \cdot \rangle$ denotes expectation.
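For concreteness, here is a minimal Monte Carlo sketch (not a proof) of this bound for independent uniform variables. The constant $c = 1/8$ comes from Hoeffding's lemma for variables of range $1$; the values of $n$, $t$, and the sample size are arbitrary choices for illustration.

```python
# Monte Carlo sketch (not a proof): check <e^{tX}> <= e^{c n t^2}
# for independent X_i ~ Uniform[0,1]. Hoeffding's lemma gives
# <e^{t(X_i - <X_i>)}> <= e^{t^2/8} for a variable of range 1,
# so the product over n independent variables is <= exp(n t^2 / 8).
import numpy as np

rng = np.random.default_rng(0)
n, samples = 20, 200_000

xs = rng.uniform(0.0, 1.0, size=(samples, n))  # independent X_i in [0, 1]
X = xs.sum(axis=1) - n * 0.5                   # centred sum: <X_i> = 1/2

for t in [0.1, 0.5, 1.0, 2.0]:
    mgf = np.exp(t * X).mean()                 # empirical <e^{tX}>
    bound = np.exp(n * t**2 / 8)               # Hoeffding bound, c = 1/8
    print(f"t={t:4.1f}  <e^(tX)> ~ {mgf:12.4e}  bound {bound:12.4e}  ok={mgf <= bound}")
```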

Is there a converse to this in the following sense? Consider $n$ random variables $X_1, X_2, \ldots, X_n$ distributed according to a joint probability distribution $P(x_1, x_2, \ldots, x_n)$. Define the random variable $X = X_1 + X_2 + \ldots + X_n - \langle X_1 \rangle - \langle X_2 \rangle - \ldots - \langle X_n \rangle$, where the expectation is taken with respect to $P$. If it is known that $\langle e^{tX} \rangle \leq e^{d \cdot n t^2}$ for some constant $d$ and all $t > 0$, is $P$ necessarily close to a product distribution $Q_1(x_1) Q_2(x_2) \ldots Q_n(x_n)$?
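To illustrate why one might hope for such a converse, consider an extreme case, chosen purely for illustration: if the $X_i$ are perfectly correlated (all equal to a single Bernoulli$(1/2)$ variable), then $X = n(X_1 - 1/2)$ and $\langle e^{tX} \rangle = \cosh(nt/2)$ exactly, which grows like $e^{nt/2}$ and so violates any bound of the form $e^{d \cdot n t^2}$ for small fixed $t$ and large $n$. The values of $d$ and $t$ below are arbitrary choices for the demonstration.

```python
# Closed-form sketch of why strong correlation breaks the subgaussian bound.
# Extreme case: X_1 = X_2 = ... = X_n, each Bernoulli(1/2).
# Then X = n*(X_1 - 1/2) and <e^{tX}> = cosh(n*t/2) exactly, which grows
# like e^{n*t/2}: for any fixed d and any t < 1/(2d), it eventually
# exceeds e^{d*n*t^2} as n grows.
import math

d = 1.0   # assumed constant in the hypothesised bound <e^{tX}> <= e^{d n t^2}
t = 0.25  # any t below 1/(2d) exposes the violation for large enough n

for n in [10, 50, 100, 500]:
    mgf = math.cosh(n * t / 2)       # exact MGF of the fully correlated sum
    bound = math.exp(d * n * t**2)   # subgaussian bound being tested
    print(f"n={n:4d}  <e^(tX)> = {mgf:12.4e}  bound {bound:12.4e}  ok={mgf <= bound}")
```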

A known result based on some other characterisation of the Chernoff bound (for instance, via moments instead of the moment generating function) would be equally welcome.
