
Saturday, February 14, 2015

Entropy maximization, part II

Let \(\mathcal{X} \in \{\alpha_1, \cdots, \alpha_n\}\) be a random variable with a finite alphabet. What distribution achieves maximum entropy subject to the constraint that the mean \(E[\mathcal{X}]=\beta\) is fixed?

Writing \(p_i = P(\mathcal{X}=\alpha_i)\), the mean constraint reads
\[\sum_i \alpha_i p_i = \beta \] The optimization problem is given as
\[\max_\mathbf{p} H(\mathbf{p}) \quad \text{s.t.} \quad \sum_i p_i = 1,\; \sum_i \alpha_i p_i = \beta\] Introducing Lagrange multipliers \(\lambda\) and \(\mu\) and setting the derivative of the Lagrangian with respect to each \(p_i\) to zero, we have
\[ p_i = e^{-(1-\lambda)}e^{\mu \alpha_i}\] which turns out to be the familiar expression
\[p_i = \frac{1}{Z}e^{\mu \alpha_i}\] with \(Z=\sum_i e^{\mu \alpha_i}\) being the partition function.

The above form is called a Gibbs distribution.
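As a quick numerical check of the single-constraint case: under the Gibbs distribution, \(E[\mathcal{X}]\) is monotonically increasing in \(\mu\), so a simple bisection on \(\mu\) recovers the maximum-entropy distribution with the target mean. The alphabet and target below are illustrative choices, not from the post.

```python
import numpy as np

# Illustrative alphabet alpha_i and target mean beta (not from the post).
alpha = np.array([1.0, 2.0, 3.0, 4.0])
beta = 3.0

def gibbs(mu):
    """p_i = e^{mu * alpha_i} / Z, with Z the partition function."""
    w = np.exp(mu * alpha)
    return w / w.sum()

# E[X] under gibbs(mu) is increasing in mu, so bisect on mu until the
# mean constraint sum_i alpha_i p_i = beta is met.
lo, hi = -50.0, 50.0
for _ in range(100):
    mu = 0.5 * (lo + hi)
    if gibbs(mu) @ alpha < beta:
        lo = mu
    else:
        hi = mu

p = gibbs(mu)
print(p)           # maximum-entropy distribution
print(p @ alpha)   # close to beta
```

The normalization constraint is satisfied automatically by dividing by \(Z\), so only the one multiplier \(\mu\) has to be searched for.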

Now consider the problem where \(\mathcal{X}\) is as above but we are given \(p\) moment constraints \(E[f_j(\mathcal{X})]=\beta_j,\ j=1,\dots,p\), and write \(\alpha_{ij} = f_j(\alpha_i)\).  The optimization problem becomes
\[\max_\mathbf{p} H(\mathbf{p}) \quad \text{s.t.} \quad \sum_i p_i = 1,\; \sum_i \alpha_{ij} p_i = \beta_j\] The solution has the form
\[p_i = \frac{1}{Z} e^{\sum_{j=1}^p \mu_j \alpha_{ij}}
         =\frac{1}{Z} e^{\mathbf{\mu}^T \mathbf{\alpha}_{i}}\] Since the entropy is strictly concave and the constraints are linear, the moment constraints determine a unique maximum-entropy solution.
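The multipliers \(\mu_j\) in the multi-moment case can be found by minimizing the convex dual \(\log Z(\mathbf{\mu}) - \mathbf{\mu}^T \mathbf{\beta}\), whose gradient is \(E_p[f(\mathcal{X})] - \mathbf{\beta}\). A minimal sketch with plain gradient descent; the feature matrix (two binary features on four outcomes) and targets are illustrative choices, not from the post.

```python
import numpy as np

# A[i, j] = f_j(alpha_i): two binary features on four outcomes
# (illustrative example, not from the post).
A = np.array([[0., 0.],
              [1., 0.],
              [0., 1.],
              [1., 1.]])
beta = np.array([0.3, 0.6])   # target moments E[f_1(X)], E[f_2(X)]

def gibbs(mu):
    """p_i proportional to e^{mu . alpha_i}."""
    w = np.exp(A @ mu)
    return w / w.sum()

mu = np.zeros(2)
for _ in range(5000):
    grad = gibbs(mu) @ A - beta   # gradient E_p[f(X)] - beta of the dual
    mu -= 0.5 * grad              # descend the convex dual objective

p = gibbs(mu)
print(p)       # maximum-entropy distribution
print(p @ A)   # close to beta
```

Because the two constraints here fix the marginals of the two binary features, the resulting maximum-entropy distribution factorizes as their product, illustrating how the Gibbs form recovers independence when nothing couples the features.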
