# Confused about Constraints

### Question

• Hi!

I've been using Infer.NET quite a bit, but constraints still puzzle me a bit from a mathematical standpoint. As a concrete example, I was trying to learn a model where the likelihood is defined by a softmax whose parameters are specified by a product of two random variables. One random variable has a Gaussian prior, and I required that the other be strictly non-negative. The obvious choice was a Gamma prior, but a Gaussian-Gamma product does not work with VMP (nor did a truncated Gaussian for one of the variables). However, a simple ConstrainPositive on one of the random variables does the trick. I am having trouble wrapping my head around what this constraint does during inference, mathematically speaking. My understanding is that it is a constraint directly on the posterior for the variable I want to be positive, rather than on the prior. More generally, I don't quite understand how constraints can be enforced in traditional Bayesian inference.

Another example that confuses me is the following: in the same model as above, I have several normally distributed random variables on which I'd like to impose a partial order, i.e. one of the random variables has to be strictly greater than all the others. My understanding is that ConstrainTrue(A > B) should do the trick, but this also leads to issues with VMP (and I need VMP for the softmax). My workaround is to introduce additional Boolean random variables via the BernoulliFromLogOdds factor applied to the differences between the normally distributed variables, and then give each of those Boolean variables an observed value of true. I can control the degree to which this "constraint" is satisfied by ramping up the coefficient in the BernoulliFromLogOdds factor. My question is still mostly mathematical: I understand how observations in a generative model enforce constraints "softly" -- but what does a constraint like ConstrainTrue(A > B) really mean in Bayesian inference? Is it a hard constraint on the posterior?
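The soft-constraint trick described above can be sketched numerically (a minimal plain-Python illustration rather than Infer.NET; the difference value 0.5 and the coefficients are illustrative choices). Observing true for a Bernoulli variable whose log-odds are c * (A - B) multiplies the joint density by sigmoid(c * (A - B)); as c grows, that factor approaches the hard indicator [A > B]:

```python
import math

def sigmoid(t):
    """Logistic function, the Bernoulli-from-log-odds link."""
    return 1.0 / (1.0 + math.exp(-t))

# Likelihood weight contributed by observing the Boolean variable as true,
# for a point where the constraint is violated (A - B = -0.5) and one
# where it is satisfied (A - B = +0.5), at increasing coefficients c.
for c in (1.0, 10.0, 100.0):
    weight_violating = sigmoid(c * (-0.5))
    weight_satisfying = sigmoid(c * (+0.5))
    print(c, round(weight_violating, 6), round(weight_satisfying, 6))
```

At c = 1 the violating region is only mildly down-weighted; by c = 100 its weight is essentially zero, so the soft factor behaves like the hard constraint A > B.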

Any pointers would be greatly appreciated! Thank you!

Wednesday, October 14, 2015 9:55 AM

### All replies

• A constraint is not applied to the prior or to the posterior; it is applied to the variable. A useful way to think about constraints is in terms of rejection sampling: sample all variables from their priors, then reject any sample that violates a constraint or observation. The distribution of the kept samples is the posterior.
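The rejection-sampling picture above can be made concrete with a short simulation (plain Python, not Infer.NET; the Normal(0, 1) prior and the positivity constraint, mirroring ConstrainPositive, are illustrative choices). The kept samples follow the Gaussian truncated to the positive half-line, whose mean is sqrt(2/pi) ≈ 0.798:

```python
import math
import random

random.seed(0)

# Prior: x ~ Normal(0, 1).  Constraint: x > 0.
# Rejection sampling: draw from the prior and keep only the samples
# that satisfy the constraint.  The kept samples are distributed
# according to the posterior, here a positively truncated Gaussian.
kept = [x for x in (random.gauss(0.0, 1.0) for _ in range(100_000)) if x > 0]

mean = sum(kept) / len(kept)
print(round(mean, 2))  # close to sqrt(2/pi) ~= 0.80
```

The same mental model covers observations: observing a variable is just an (extreme) constraint that rejects every sample disagreeing with the observed value.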
Wednesday, October 14, 2015 12:26 PM
• Let me just add to what Tom has said. I personally find ConstrainEqualRandom very different from the other constraints, because it gives rise to a separate set of challenges, such as obtaining the marginal divided by the prior and working with it. An example of this is given in the post with the pictures here. Other than that, how to attach constraints to variables is explained here, and a list of all constraints is given here.

-Y-

Wednesday, October 14, 2015 2:38 PM