```csharp
using System;
using Microsoft.ML.Probabilistic.Models; // namespace for the current Microsoft.ML.Probabilistic NuGet package

class InferNetTest
{
    Variable<double> _x = Variable.GaussianFromMeanAndVariance(0, 1);
    InferenceEngine _engine = new InferenceEngine();
    Variable<bool> _a;
    Variable<bool> _b;
    Variable<bool> _c;

    public InferNetTest()
    {
        _a = _x > 0;
        _b = _x > 0;
        //Variable.ConstrainEqual(_a, _b);
        _c = _a; // _c refers to the same Variable object as _a
    }

    public void Infer()
    {
        _a.ObservedValue = true;
        Console.WriteLine(_engine.Infer(_x));
        Console.WriteLine(_engine.Infer(_b));
        Console.WriteLine(_engine.Infer(_c));
    }
}
```

If I run this, it tells me _x is Gaussian(0.7979, 0.3634) and _b is Bernoulli(0.9072). _b is obviously wrong, since `>` is a deterministic function and _a is observed to be true, so _b should be true with certainty. But here I can understand what happened: the inference engine first infers _x as an approximate Gaussian, and then infers _b from _x's posterior.
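That explanation can be checked numerically. Assuming Infer.NET prints Gaussian distributions as Gaussian(mean, variance), the reported Bernoulli(0.9072) for _b should be exactly P(x > 0) under the Gaussian posterior Gaussian(0.7979, 0.3634) reported for _x. A quick sketch (in Python for brevity) using the standard normal CDF:

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Posterior of _x as reported by the inference engine: Gaussian(mean, variance)
mean, variance = 0.7979, 0.3634

# P(x > 0) for x ~ N(mean, variance) is Phi(mean / sd)
p_b_true = std_normal_cdf(mean / math.sqrt(variance))
print(round(p_b_true, 4))  # prints 0.9072
```

The match confirms that _b's marginal is computed by pushing _x's approximate Gaussian posterior through the `>` factor, rather than by exploiting the fact that _b is deterministically equal to _a.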

However, when I uncomment Variable.ConstrainEqual(_a, _b), I get _x as Gaussian(0.8527, 0.2729). This I don't understand: why is _x's posterior different from the previous Gaussian(0.7979, 0.3634)?