Ask a question about using shared variables to support hybrid algorithms (Migrated from community.research.microsoft.com)
Question

shengbo posted on 03-30-2011 9:29 AM
Hi Team, Thanks a lot for your help with my previous queries. Now I have a question about using shared variables to support hybrid algorithms.
I defined a model for inferring the posteriors on the mean and precision of a multivariate Gaussian distribution. The observations can be seen as pairwise comparisons between different dimensions of the multivariate Gaussian (e.g., the 2nd is larger than the 4th). Since EP does not support inferring the Wishart-distributed precision, the plan was to mix EP and VMP for the inference. Thus, I hope the following code can do the job:

int dimension = 5;
int i = 2;
int j = 4;
Model meanModel = new Model(1);
var sharedMean = SharedVariable<Vector>.Random(VectorGaussian.Uniform(dimension));
var meanEpsilon = Variable.VectorGaussianFromMeanAndVariance(Vector.Zero(dimension), PositiveDefiniteMatrix.Identity(dimension));
sharedMean.SetDefinitionTo(meanModel, meanEpsilon);
Model precisionModel = new Model(1);
var sharedPrecision = SharedVariable<PositiveDefiniteMatrix>.Random(Wishart.Uniform(dimension));
var precEpsilon = Variable.WishartFromShapeAndScale(1, PositiveDefiniteMatrix.Identity(dimension));
sharedPrecision.SetDefinitionTo(precisionModel, precEpsilon);
Model dataModel = new Model(1);
Variable<double>[] utility = new Variable<double>[dimension];
var epsilon = Variable.VectorGaussianFromMeanAndPrecision(sharedMean.GetCopyFor(dataModel), sharedPrecision.GetCopyFor(dataModel));
utility[i] = Variable.GetItem(epsilon, i);
utility[j] = Variable.GetItem(epsilon, j);
Variable.ConstrainPositive(utility[i] - utility[j]);
InferenceEngine engineVMP = new InferenceEngine(new VariationalMessagePassing());
InferenceEngine engineEP = new InferenceEngine(new ExpectationPropagation());
for (int pass = 0; pass < 10; pass++)
{
meanModel.InferShared(engineEP, 0);
precisionModel.InferShared(engineVMP, 0);
dataModel.InferShared(engineEP, 0);
}

However, there is a problem when compiling the dataModel. Thanks to Minka's help, I understand that the problem can be solved by using the following constraint: Variable.ConstrainTrue(Variable.Bernoulli(Variable.Logistic(scale*(utility[i] - utility[j])))); But is there a way of mixing EP and VMP to infer the posterior mean and precision using Infer.NET?
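Spelled out as a fragment, the softened constraint looks like this (a sketch assuming the Infer.NET API; `scale` is a steepness parameter chosen by the modeler, not specified in the thread):

```csharp
// Soft comparison likelihood replacing the hard constraint utility[i] > utility[j].
// The logistic likelihood is something EP's data model can compile, as suggested above.
double scale = 10.0; // assumed value; larger values approach the hard constraint
Variable<double> diff = utility[i] - utility[j];
Variable<bool> comparison = Variable.Bernoulli(Variable.Logistic(scale * diff));
Variable.ConstrainTrue(comparison);
```

The trade-off in choosing `scale` is between fidelity to the hard comparison (large values) and well-behaved messages during inference (small values).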
Thanks and regards,
Shengbo
Friday, June 3, 2011 6:43 PM
All replies

minka replied on 04-01-2011 10:49 AM
This doesn't work because dataModel is using EP and EP does not yet support VectorGaussian with unknown precision. I think you meant for epsilon to be the shared variable?
Friday, June 3, 2011 6:43 PM 
shengbo replied on 04-01-2011 11:14 AM
Thanks a lot for the reply! Yeah, epsilon is the shared variable, and it is a random vector drawn from VectorGaussian.
May I ask how to implement EP for VectorGaussian with unknown precision under the current framework of Infer.NET? Could you please point out some related papers or technical reports for computing the messages and marginals required when using EP for a VectorGaussian with unknown precision?
Thank you very much and regards,
Shengbo
Friday, June 3, 2011 6:44 PM 
minka replied on 04-01-2011 11:29 AM
I don't know of any papers which performed EP on a VectorGaussian with Wishart precision. This makes sense to me, because EP does not give good results even for a univariate Gaussian with Gamma precision, which implies that a VectorGaussian with Wishart precision would not work well either. A VectorGaussian with Wishart variance should work better, but again there are no papers on it.
 Marked as answer by Microsoft Research Friday, June 3, 2011 6:44 PM
Friday, June 3, 2011 6:44 PM 
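For intuition behind this answer (a standard result assumed here, not taken from the thread): marginalizing a Gamma-distributed precision out of a univariate Gaussian yields a heavy-tailed Student-t distribution, whose tails a single Gaussian projection (as EP uses) captures poorly:

```latex
x \mid \tau \sim \mathcal{N}(\mu, \tau^{-1}), \qquad
\tau \sim \operatorname{Gamma}(a, b)
\;\Longrightarrow\;
p(x) = \int_0^{\infty} \mathcal{N}(x \mid \mu, \tau^{-1})\,
       \operatorname{Gamma}(\tau \mid a, b)\, \mathrm{d}\tau
     = \operatorname{St}\!\bigl(x \mid \mu,\ a/b,\ 2a\bigr)
```

where St(x | mu, lambda, nu) is a Student-t with precision parameter lambda and nu degrees of freedom; the multivariate case with a Wishart precision marginalizes analogously to a multivariate t.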
shengbo replied on 04-01-2011 12:23 PM
Thanks a lot for your quick reply and the information. It is very helpful!
Best regards,
Shengbo
Friday, June 3, 2011 6:44 PM