# Bayesian PCA (Migrated from community.research.microsoft.com)

### Question

• shiv posted on 06-14-2010 10:44 AM

The Bayesian PCA example has been very useful to me in learning Infer.NET. I have two questions related to it:

1. In the example, the covariance matrix for observations is a diagonal matrix. For PCA, should it not be an isotropic covariance matrix?

2. I am trying to experiment with a latent variable model, similar to Bayesian PCA but with a full covariance matrix for the observations. The following is my attempt:

```csharp
// W
vAlpha = Variable.Array<double>(rM).Named("Alpha");
vAlpha[rM] = Variable.Random<double, Gamma>(priorAlpha).ForEach(rM);
vW = Variable.Array<Vector>(rM).Named("W");
vW[rM] = Variable.VectorGaussianFromMeanAndPrecision(new Vector(Convert.ToInt32(rD)), PositiveDefiniteMatrix.IdentityScaledBy(Convert.ToInt32(rD), Convert.ToDouble(vAlpha[rM]))).ForEach(rM);

// Z
vZ = Variable.Array<Vector>(rN).Named("Z");
vZ[rN] = Variable.VectorGaussianFromMeanAndPrecision(new Vector(Convert.ToInt32(rM)),
    PositiveDefiniteMatrix.Identity(Convert.ToInt32(rM))).ForEach(rN);

// Mixing
vT = Variable.Array<Vector>(rN).Named("T");
vT[rN] = (vZ[rN]*vW).forEach(rN);

// Bias
vMu = Variable.VectorGaussianFromMeanAndPrecision(new Vector(Convert.ToInt32(rD)),
    PositiveDefiniteMatrix.IdentityScaledBy(Convert.ToInt32(rD), 0.01)).Named("mu");
vU[rN] = vT[rN] + vMu;

// Noise
vPi = Variable.WishartFromShapeAndScale(100.0, PositiveDefiniteMatrix.IdentityScaledBy(Convert.ToInt32(rD), 0.01)).Named("pi");

// Data
vData[rN] = Variable.VectorGaussianFromMeanAndPrecision(vU[rN], vPi);
```

The compiler shows an error on the line that computes vT. Could you please help me with this error, or perhaps suggest a better way to write the code for this problem?

Thanks

Shiv

Friday, June 3, 2011 5:46 PM


### All replies

• John Guiver replied on 06-15-2010 10:35 AM

Hi Shiv.

The answer to 1 is yes.
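For context on why the answer is yes: in probabilistic PCA (Tipping and Bishop's formulation, which the Bayesian PCA example follows), the observation noise is isotropic, whereas a general diagonal noise covariance corresponds to factor analysis. A sketch of the two noise models:

```latex
% Observation x generated from latent z via mixing matrix W and bias \mu:
x = W z + \mu + \epsilon
% Probabilistic PCA assumes isotropic noise:
\epsilon \sim \mathcal{N}(0,\, \sigma^2 I)
% whereas a diagonal noise covariance,
\epsilon \sim \mathcal{N}(0,\, \mathrm{diag}(\psi_1, \ldots, \psi_D)),
% gives factor analysis rather than PCA.
```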

One way of doing 2 is shown below. Note the following key points, which are also commented in the code.

1. vW, vZ, etc. and the data need to be in the form of arrays of arrays rather than 2-D arrays (because of (3) below). This will also require you to adjust your code for extracting marginals and for initialisation (let me know if you run into difficulty).
2. Because these are arrays of arrays, we cannot use the built-in matrix multiply, and must instead write out the multiplication explicitly.
3. At the point where we want to inject the full Wishart-distributed noise, we need to use a factor which converts an array to a vector
4. The prior for the noise variable is now a Wishart
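The explicit multiply in point (2) is just T = ZW computed element by element, mirroring the nested ForEach blocks below. As a quick sanity check of that decomposition outside Infer.NET, here is a minimal NumPy sketch (the dimensions N, M, D are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, D = 4, 3, 5                  # observations, latent dim, data dim
Z = rng.standard_normal((N, M))    # latent coordinates, one row per observation
W = rng.standard_normal((M, D))    # mixing matrix, one row per latent component

# Explicit multiply: T[n][d] = sum over m of Z[n][m] * W[m][d]
T = np.empty((N, D))
for n in range(N):
    for d in range(D):
        T[n, d] = sum(Z[n, m] * W[m, d] for m in range(M))

# Identical to the matrix product
assert np.allclose(T, Z @ W)
```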

```csharp
// W
vAlpha = Variable.Array<double>(rM).Named("Alpha");
vW = Variable.Array(Variable.Array<double>(rD), rM).Named("W");
vAlpha[rM] = Variable.Random<double, Gamma>(priorAlpha).ForEach(rM);
vW[rM][rD] = Variable.GaussianFromMeanAndPrecision(0, vAlpha[rM]).ForEach(rD);

// Z
vZ = Variable.Array(Variable.Array<double>(rM), rN).Named("Z");
vZ[rN][rM] = Variable.GaussianFromMeanAndPrecision(0.0, 1.0).ForEach(rN, rM);
vT = Variable.Array(Variable.Array<double>(rD), rN).Named("T");

// Explicit matrix multiply
using (Variable.ForEach(rN))
{
    using (Variable.ForEach(rD))
    {
        var vProds = Variable.Array<double>(rM);
        vProds[rM] = vZ[rN][rM] * vW[rM][rD];
        vT[rN][rD] = Variable.Sum(vProds);
    }
}

// Bias
vMu = Variable.Array<double>(rD).Named("mu");
vMu[rD] = Variable.Random<double, Gaussian>(priorMu).ForEach(rD);
vU = Variable.Array(Variable.Array<double>(rD), rN).Named("U");
vU[rN][rD] = vT[rN][rD] + vMu[rD];

// Convert variable array of variable arrays to variable array of Vector random variables
var vUVec = Variable.Array<Vector>(rN);
vUVec[rN] = Variable<Vector>.Factor(Vector.FromArray, vU[rN]);

// Noise
vPi = Variable.Random<PositiveDefiniteMatrix,Wishart>(priorPi).Named("pi");

// Data
vData[rN] = Variable.VectorGaussianFromMeanAndPrecision(vUVec[rN], vPi);
```

Hope this helps

John

• shiv replied on 06-16-2010 10:16 AM

Hello John:

Thanks for the prompt clarification. I will get back to you if I face any problems.

Shiv
