Hi!
I've been using Infer.NET quite a bit, but constraints still puzzle me from a mathematical standpoint. As a concrete example, I was trying to learn a model where the likelihood is defined by a softmax whose parameters are given by the product of two random variables. One random variable has a Gaussian prior, and I required the other to be strictly non-negative. The obvious choice was a Gamma prior, but a Gaussian-Gamma product does not work with VMP (nor did a truncated Gaussian for one of the variables). However, a simple ConstrainPositive on one of the random variables does the trick. I am having trouble wrapping my head around what this constraint does during inference, mathematically. My understanding is that it is a constraint directly on the posterior of the variable I want to be positive, rather than on the prior. In general, I don't quite understand how constraints can be enforced in traditional Bayesian inference.
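For concreteness, here is a minimal sketch of the kind of model I mean (the class count, priors, and observed label are made up for illustration; this is not my actual model):

```csharp
using System;
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Algorithms;
using Microsoft.ML.Probabilistic.Distributions;

class SoftmaxProductSketch
{
    static void Main()
    {
        int K = 3; // number of softmax classes (illustrative)
        Range k = new Range(K);

        // Gaussian-distributed factor of the product
        var w = Variable.Array<double>(k);
        w[k] = Variable.GaussianFromMeanAndPrecision(0, 1).ForEach(k);

        // The other factor must be non-negative. A Gamma prior fails
        // under VMP here, so give it a Gaussian prior and constrain it:
        var s = Variable.GaussianFromMeanAndPrecision(1, 1);
        Variable.ConstrainPositive(s);

        // Softmax logits are the product of the two random variables
        var logits = Variable.Array<double>(k);
        logits[k] = w[k] * s;

        var p = Variable.Softmax(logits);
        var y = Variable.Discrete(p);
        y.ObservedValue = 0; // a single observed label, for illustration

        var engine = new InferenceEngine(new VariationalMessagePassing());
        Console.WriteLine(engine.Infer<Gaussian>(s));
    }
}
```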
Another example that confuses me is the following. In the same model, I have several normally distributed random variables on which I'd like to impose a partial order, i.e. one of the random variables has to be strictly greater than all the others. My understanding is that ConstrainTrue(A > B) should do the trick, but this also leads to issues with VMP (and I need VMP for the softmax). My workaround is to introduce additional Boolean random variables via the BernoulliFromLogOdds factor on the differences between the normally distributed RVs, and then give each of those Boolean variables an observed value, as sketched below. I can control the degree to which this "constraint" is satisfied by ramping up the coefficient inside the BernoulliFromLogOdds factor. My question is still mostly mathematical: I understand how observations in a generative model enforce constraints "softly", but what does a constraint like ConstrainTrue(A > B) really mean in Bayesian inference? Is it a hard constraint on the posterior?
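Concretely, my workaround looks something like this (variable names, priors, and the value of beta are illustrative). The way I think about it, each observed Boolean multiplies the joint by a factor sigma(beta * (A - B)), which approaches the indicator 1[A > B] as beta grows:

```csharp
using System;
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Algorithms;
using Microsoft.ML.Probabilistic.Distributions;

class SoftOrderingSketch
{
    static void Main()
    {
        double beta = 10.0; // steepness: larger values make the ordering "harder" (illustrative)

        // A should end up greater than each of the others
        var A = Variable.GaussianFromMeanAndPrecision(0, 1);
        var B = Variable.GaussianFromMeanAndPrecision(0, 1);
        var C = Variable.GaussianFromMeanAndPrecision(0, 1);

        // ConstrainTrue(A > B) gives me trouble under VMP, so instead
        // observe Booleans whose log-odds grow with the differences:
        var aboveB = Variable.BernoulliFromLogOdds(beta * (A - B));
        var aboveC = Variable.BernoulliFromLogOdds(beta * (A - C));
        aboveB.ObservedValue = true;
        aboveC.ObservedValue = true;

        var engine = new InferenceEngine(new VariationalMessagePassing());
        Console.WriteLine("A: " + engine.Infer<Gaussian>(A));
        Console.WriteLine("B: " + engine.Infer<Gaussian>(B));
        Console.WriteLine("C: " + engine.Infer<Gaussian>(C));
    }
}
```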
Any pointers would be greatly appreciated! Thank you!