A counter-intuitive example of inference

  • Question

  • class InferNetTest
    {
        Variable<double> _x = Variable.GaussianFromMeanAndVariance(0, 1);
        InferenceEngine _engine = new InferenceEngine();
        Variable<bool> _a;
        Variable<bool> _b;
        Variable<bool> _c;

        public InferNetTest()
        {
            _a = _x > 0;
            _b = _x > 0;
            //Variable.ConstrainEqual(_a, _b);
            _c = _a;
        }

        public void Infer()
        {
            _a.ObservedValue = true;
            Console.WriteLine(_engine.Infer(_x));
            Console.WriteLine(_engine.Infer(_b));
            Console.WriteLine(_engine.Infer(_c));
        }
    }

    If I run this, it tells me _x is Gaussian<0.7979, 0.3634> and _b is Bernoulli<0.9072>. _b is obviously wrong: since `>` is a deterministic function, _b should equal _a, which is observed to be true. But here I can understand it, because the inference engine has to infer _x first as an approximate Gaussian, and then infer _b from _x's posterior.
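    Both reported numbers can be checked by hand (sketched in Python rather than C#, since this is just arithmetic, not Infer.NET). The exact posterior of _x given _x > 0 is a standard normal truncated to the positive half-line; EP's Gaussian projection keeps that distribution's mean sqrt(2/pi) and variance 1 - 2/pi, and _b is then evaluated against this Gaussian rather than the exact truncated posterior:

    ```python
    import math

    def phi_cdf(z):
        """Standard normal CDF."""
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # N(0,1) truncated to x > 0 has mean sqrt(2/pi) and variance 1 - 2/pi;
    # EP's Gaussian projection of the posterior on _x keeps these moments:
    mean = math.sqrt(2.0 / math.pi)      # 0.7979 to 4 dp
    var = 1.0 - 2.0 / math.pi            # 0.3634 to 4 dp
    print(round(mean, 4), round(var, 4))

    # _b = _x > 0 is then scored against this Gaussian approximation,
    # not against the exact truncated posterior, so P(_b = true) < 1:
    p_b = phi_cdf(mean / math.sqrt(var))
    print(round(p_b, 4))                 # 0.9072 to 4 dp
    ```

    This reproduces both Gaussian<0.7979, 0.3634> and Bernoulli<0.9072>, confirming that _b's "wrong" marginal is exactly the probability of being positive under the Gaussian approximation of _x.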

    However, when I enable the commented-out Variable.ConstrainEqual(_a, _b), I get Gaussian<0.8527, 0.2729> for _x. Now I don't understand: why is _x different from the previous Gaussian<0.7979, 0.3634>?

    • Edited by colinfang Sunday, September 21, 2014 1:37 PM
    Sunday, September 21, 2014 1:37 PM


  • It depends on which inference algorithm you use. Here you are using Expectation Propagation, which approximates the posterior on _x with a Gaussian. This projection is done after each use of _x in the program. Both uses of _x think that _x has a Gaussian distribution at the time they are processed, so each will perform an update to the posterior. The effect is that the greater-than-zero constraint counts more when you apply it twice in the program.
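    The double-counting described above can be reproduced with a hand-rolled EP loop (a sketch in plain Python using standard truncated-normal moment formulas, not Infer.NET's actual implementation): a N(0, 1) prior with the factor 1[x > 0] applied twice, once for the observed _a and once for _b via the equality constraint. It converges to approximately the reported Gaussian<0.8527, 0.2729>:

    ```python
    import math

    def phi_pdf(x):
        return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

    def phi_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    # EP messages from the two 1[x > 0] factors, in natural parameters
    # (precision-mean h, precision r). Prior is N(0, 1): (h0, r0) = (0, 1).
    msgs = [[0.0, 0.0], [0.0, 0.0]]

    for _ in range(200):
        for i in range(2):
            j = 1 - i
            # Cavity: prior times the other factor's message.
            hc = msgs[j][0]
            rc = 1.0 + msgs[j][1]
            vc, mc = 1.0 / rc, msgs[j][0] / rc
            # Moments of the cavity truncated to x > 0, projected to a Gaussian.
            a = mc / math.sqrt(vc)
            lam = phi_pdf(a) / phi_cdf(a)
            mn = mc + math.sqrt(vc) * lam
            vn = vc * (1.0 - lam * (lam + a))
            # New message = projected posterior divided by the cavity.
            msgs[i][0] = mn / vn - hc
            msgs[i][1] = 1.0 / vn - rc

    h = msgs[0][0] + msgs[1][0]
    r = 1.0 + msgs[0][1] + msgs[1][1]
    print(round(h / r, 4), round(1.0 / r, 4))  # posterior mean and variance of x
    ```

    With only one of the two factors, the same loop stops at the single-use answer Gaussian<0.7979, 0.3634>; the second, redundant use of the constraint pulls the mean up and the variance down, which is why adding ConstrainEqual changes the posterior on _x.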
    • Marked as answer by colinfang Sunday, September 21, 2014 10:05 PM
    Sunday, September 21, 2014 4:02 PM