# High Hierarchy in Matrix Factorization Method

### Question

• Hi everyone,

I'm working on a problem where I have to factorize a matrix with biological meaning. I based my system on the Recommender System example (the one explained on the web site), to which I have tried to add some hierarchy. The system has one more 'layer' in order to allow it to become differently specific for each entry of the matrix. To be clearer, this is the top of the model graph: as you can see, the Traits are sampled from a multivariate Gaussian (VectorGaussian) whose means and precisions are specific to the row and column item of the matrix and are themselves sampled from a multivariate Gaussian and a Wishart distribution, respectively.
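To make the hierarchy concrete, here is a minimal generative sketch of that top layer in Python/NumPy (an illustration only, not the Infer.NET model; all names are hypothetical): each item gets its own mean vector (drawn from a multivariate Gaussian) and its own precision matrix (drawn from a Wishart), and the item's trait vector is then drawn from that item-specific distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
num_traits = 2

def sample_wishart(df, scale, rng):
    # A Wishart sample as G @ G.T, where the df columns of G are i.i.d. N(0, scale)
    L = np.linalg.cholesky(scale)
    G = L @ rng.standard_normal((scale.shape[0], df))
    return G @ G.T

# Per-item hyper-draws: a mean vector and a precision matrix for this item
mean_i = rng.multivariate_normal(np.zeros(num_traits), 10 * np.eye(num_traits))
prec_i = sample_wishart(df=num_traits + 2, scale=np.eye(num_traits), rng=rng)

# The item's trait vector, drawn from its own item-specific Gaussian
traits_i = rng.multivariate_normal(mean_i, np.linalg.inv(prec_i))
```

In the flat Recommender System model, every item would instead share one (mean, precision) pair; the hierarchical version draws a separate pair per row and per column item.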

The code of the model is the following:

```
int numRBPs = RBPs;
int numGenes = Genes;
int numTraits = numTrait;
Variable<int> numObservations = Variable.Observed(tmpGene.Length).Named("numObservations");
int numLevels = 1;

// Define ranges
Range RBP = new Range(numRBPs).Named("RBP");
Range gene = new Range(numGenes).Named("gene");
Range trait = new Range(numTraits).Named("trait");
Range observation = new Range(numObservations).Named("observation");
Range level = new Range(numLevels).Named("level");

// Define latent variables

var RBPTraits = Variable.Array<Vector>(RBP).Named("RBPTraits");
var geneTraits = Variable.Array<Vector>(gene).Named("geneTraits");
var RBPBias = Variable.Array<double>(RBP).Named("RBPBias");
var geneBias = Variable.Array<double>(gene).Named("geneBias");
var RBPThresholds = Variable.Array<double>(RBP).Named("RBPThresholds");

var RBPTraitsMean = Variable.Array<Vector>(RBP).Named("RBPTraitsMean");
var RBPTraitsPrec = Variable.Array<PositiveDefiniteMatrix>(RBP).Named("RBPTraitsPrec");
var geneTraitsMean = Variable.Array<Vector>(gene).Named("geneTraitsMean");
var geneTraitsPrec = Variable.Array<PositiveDefiniteMatrix>(gene).Named("geneTraitsPrec");

var RBPBiasMean = Variable.Array<double>(RBP).Named("RBPBiasMean");
var RBPBiasPrec = Variable.Array<double>(RBP).Named("RBPBiasPrec");
var geneBiasMean = Variable.Array<double>(gene).Named("geneBiasMean");
var geneBiasPrec = Variable.Array<double>(gene).Named("geneBiasPrec");

// Define priors

var RBPThresholdsPrior = Variable.Array<Gaussian>(RBP).Named("RBPThresholdsPrior");

var RBPTraitsPriorMean = Variable.Array<VectorGaussian>(RBP).Named("RBPTraitsPriorMean");
var RBPTraitsPriorPrec = Variable.Array<Wishart>(RBP).Named("RBPTraitsPriorPrec");

var geneTraitsPriorMean = Variable.Array<VectorGaussian>(gene).Named("geneTraitsPriorMean");
var geneTraitsPriorPrec = Variable.Array<Wishart>(gene).Named("geneTraitsPriorPrec");

var RBPbiasPriorMean = Variable.Array<Gaussian>(RBP).Named("RBPBiasPriorMean");
var RBPbiasPriorPrec = Variable.Array<Gamma>(RBP).Named("RBPBiasPriorPrec");

var geneBiasPriorMean = Variable.Array<Gaussian>(gene).Named("geneBiasPriorMean");
var geneBiasPriorPrec = Variable.Array<Gamma>(gene).Named("geneBiasPriorPrec");

// Define latent variables statistically

RBPTraitsMean[RBP] = Variable<Vector>.Random(RBPTraitsPriorMean[RBP]);
RBPTraitsPrec[RBP] = Variable<PositiveDefiniteMatrix>.Random(RBPTraitsPriorPrec[RBP]);
geneTraitsMean[gene] = Variable<Vector>.Random(geneTraitsPriorMean[gene]);
geneTraitsPrec[gene] = Variable<PositiveDefiniteMatrix>.Random(geneTraitsPriorPrec[gene]);

RBPBiasMean[RBP] = Variable<double>.Random(RBPbiasPriorMean[RBP]);
RBPBiasPrec[RBP] = Variable<double>.Random(RBPbiasPriorPrec[RBP]);
geneBiasMean[gene] = Variable<double>.Random(geneBiasPriorMean[gene]);
geneBiasPrec[gene] = Variable<double>.Random(geneBiasPriorPrec[gene]);

RBPTraits[RBP] = Variable.VectorGaussianFromMeanAndPrecision(RBPTraitsMean[RBP],
    RBPTraitsPrec[RBP]);
geneTraits[gene] = Variable.VectorGaussianFromMeanAndPrecision(geneTraitsMean[gene],
    geneTraitsPrec[gene]);

RBPBias[RBP] = Variable.GaussianFromMeanAndPrecision(RBPBiasMean[RBP], RBPBiasPrec[RBP]);

geneBias[gene] = Variable.GaussianFromMeanAndPrecision(geneBiasMean[gene], geneBiasPrec[gene]);

RBPThresholds[RBP] = Variable<double>.Random(RBPThresholdsPrior[RBP]);
/****************************************************************************/

// Initialise priors

VectorGaussian traitPriorMean = VectorGaussian.FromMeanAndPrecision(Vector.Zero(numTraits), PositiveDefiniteMatrix.IdentityScaledBy(numTraits, 10));
Wishart traitPriorPrecision = Wishart.FromShapeAndScale(numTraits, PositiveDefiniteMatrix.IdentityScaledBy(numTraits, 10));

Gaussian biasPriorMean = Gaussian.FromMeanAndVariance(0, 10);
Gamma biasPriorPrec = Gamma.FromShapeAndScale(4, 1);

/********* Create two matrices of distributions: one for the means and one for the precisions for all **********
********** the RBPs and genes Traits, for all the RBPs and gene bias                                  **********/

RBPTraitsPriorMean.ObservedValue = Util.ArrayInit(numRBPs, u => traitPriorMean);
RBPTraitsPriorPrec.ObservedValue = Util.ArrayInit(numRBPs, u => traitPriorPrecision);
geneTraitsPriorMean.ObservedValue = Util.ArrayInit(numGenes, i => traitPriorMean);
geneTraitsPriorPrec.ObservedValue = Util.ArrayInit(numGenes, i => traitPriorPrecision);

RBPbiasPriorMean.ObservedValue = Util.ArrayInit(numRBPs, u => biasPriorMean);
RBPbiasPriorPrec.ObservedValue = Util.ArrayInit(numRBPs, u => biasPriorPrec);

geneBiasPriorMean.ObservedValue = Util.ArrayInit(numGenes, i => biasPriorMean);
geneBiasPriorPrec.ObservedValue = Util.ArrayInit(numGenes, i => biasPriorPrec);

/*** Break symmetry ***/
for (int i = 0; i < numRBPs; i++)
{
    RBPTraitsPriorPrec.ObservedValue[i] = Wishart.FromShapeAndScale(
        numTraits, PositiveDefiniteMatrix.IdentityScaledBy(numTraits, Rand.Int(5, 21)));
    RBPTraitsPriorMean.ObservedValue[i] = VectorGaussian.FromMeanAndPrecision(
        Vector.Zero(numTraits), PositiveDefiniteMatrix.IdentityScaledBy(numTraits, Rand.Int(5, 21)));
    RBPbiasPriorMean.ObservedValue[i] = Gaussian.FromMeanAndVariance(0, Rand.Int(5, 21));
    RBPbiasPriorPrec.ObservedValue[i] = Gamma.FromShapeAndScale(4, 1);
}
for (int i = 0; i < numGenes; i++)
{
    geneTraitsPriorPrec.ObservedValue[i] = Wishart.FromShapeAndScale(
        numTraits, PositiveDefiniteMatrix.IdentityScaledBy(numTraits, Rand.Int(5, 21)));
    geneTraitsPriorMean.ObservedValue[i] = VectorGaussian.FromMeanAndPrecision(
        Vector.Zero(numTraits), PositiveDefiniteMatrix.IdentityScaledBy(numTraits, Rand.Int(5, 21)));
    geneBiasPriorMean.ObservedValue[i] = Gaussian.FromMeanAndVariance(0, Rand.Int(5, 21));
    geneBiasPriorPrec.ObservedValue[i] = Gamma.FromShapeAndScale(4, 1);
}
RBPBiasMean[RBP].InitialiseTo(Gaussian.FromMeanAndVariance(0, Rand.Int(10, 21)));
RBPBiasPrec[RBP].InitialiseTo(Gamma.FromShapeAndScale(4, 1));
geneBiasMean[gene].InitialiseTo(Gaussian.PointMass(0));
geneBiasPrec[gene].InitialiseTo(Gamma.FromShapeAndScale(4, 1));

RBPThresholdsPrior.ObservedValue = Util.ArrayInit(numRBPs, u => Gaussian.FromMeanAndVariance(0, 1.0));

InferenceEngine engine = new InferenceEngine();
engine.Algorithm = new VariationalMessagePassing();
engine.NumberOfIterations = iteration;

// Set model noises explicitly

// Declare training data variables
var RBPData = Variable.Array<int>(observation).Named("RBPData");
var geneData = Variable.Array<int>(observation).Named("geneData");
var ratingData = Variable.Array<bool>(observation).Named("ratingData");

// Set model noises explicitly
Variable<double> affinityNoiseVariance = Variable.Observed(noiseVar).Named("affinityNoiseVariance");
Variable<double> thresholdsNoiseVariance = Variable.Observed(noiseVar).Named("thresholdsNoiseVariance");

// Model
using (Variable.ForEach(observation))
{
    VariableArray<double> products = Variable.Array<double>(trait);//.Named("products");
    var RBPfv = Variable.ArrayFromVector(RBPTraits[RBPData[observation]], trait);
    var genefv = Variable.ArrayFromVector(geneTraits[geneData[observation]], trait);
    products[trait] = RBPfv[trait] * genefv[trait];

    Variable<double> bias = (RBPBias[RBPData[observation]] + geneBias[geneData[observation]]);//.Named("bias");
    Variable<double> affinity = (bias + Variable.Sum(products));//.Named("affinity");
    Variable<double> noisyAffinity = Variable.GaussianFromMeanAndVariance(affinity, affinityNoiseVariance);//.Named("noisyAffinity");

    Variable<double> noisyThresholds = Variable.GaussianFromMeanAndVariance(RBPThresholds[RBPData[observation]], thresholdsNoiseVariance);
    ratingData[observation] = noisyAffinity > noisyThresholds;
}

// Observe training data
GenerateData(numRBPs, numGenes, numTraits, numObservations.ObservedValue, numLevels,
RBPData, geneData, ratingData, tmpRBP, tmpGene, tmpRating);

// Allow EP to process the product factor as if running VMP
// as in Stern, Herbrich, Graepel paper.
engine.Compiler.GivePriorityTo(typeof(GaussianProductOp_SHG09));
engine.Compiler.ShowWarnings = true;
//engine.ShowFactorGraph = true;

observation.AddAttribute(new Sequential());  // needed to get stable convergence
engine.Compiler.UseSerialSchedules = true;

// Run inference

var RBPTraitsMeanPosterior = engine.Infer<VectorGaussian[]>(RBPTraitsMean);
var RBPTraitsPrecPosterior = engine.Infer<Wishart[]>(RBPTraitsPrec);
var geneTraitsMeanPosterior = engine.Infer<VectorGaussian[]>(geneTraitsMean);
var geneTraitsPrecPosterior = engine.Infer<Wishart[]>(geneTraitsPrec);

var RBPbiasMeanPosterior = engine.Infer<Gaussian[]>(RBPBiasMean);
var RBPbiasPrecPosterior = engine.Infer<Gamma[]>(RBPBiasPrec);

var geneBiasMeanPosterior = engine.Infer<Gaussian[]>(geneBiasMean);
var geneBiasPrecPosterior = engine.Infer<Gamma[]>(geneBiasPrec);

var RBPThresholdsPosterior = engine.Infer<Gaussian[]>(RBPThresholds);

// Feed in the inferred posteriors as the new priors
RBPTraitsPriorMean.ObservedValue = RBPTraitsMeanPosterior;
RBPTraitsPriorPrec.ObservedValue = RBPTraitsPrecPosterior;
geneTraitsPriorMean.ObservedValue = geneTraitsMeanPosterior;
geneTraitsPriorPrec.ObservedValue = geneTraitsPrecPosterior;

RBPbiasPriorMean.ObservedValue = RBPbiasMeanPosterior;
RBPbiasPriorPrec.ObservedValue = RBPbiasPrecPosterior;

geneBiasPriorMean.ObservedValue = geneBiasMeanPosterior;
geneBiasPriorPrec.ObservedValue = geneBiasPrecPosterior;

RBPThresholdsPrior.ObservedValue = RBPThresholdsPosterior;

int[] RBPsTestHT;
int[] genesTestHT;
int[] testHTRatings;

LoadData(testHTFilename, out RBPsTestHT, out genesTestHT, out testHTRatings);

numObservations.ObservedValue = RBPsTestHT.Length;

RBPData.ObservedValue = RBPsTestHT;
geneData.ObservedValue = genesTestHT;
ratingData.ClearObservedValue();

Bernoulli[] predictedRatings = engine.Infer<Bernoulli[]>(ratingData);

```
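For reference, the per-observation likelihood in the model body above amounts to the following generative step, sketched here in plain Python/NumPy (illustrative only; the function and argument names are mine, not Infer.NET's):

```python
import numpy as np

def rate(rbp_traits, gene_traits, rbp_bias, gene_bias, rbp_threshold,
         noise_var, rng):
    # affinity = the two bias terms plus the inner product of the trait vectors
    affinity = rbp_bias + gene_bias + float(np.dot(rbp_traits, gene_traits))
    # both sides of the comparison get independent Gaussian noise
    noisy_affinity = affinity + rng.normal(0.0, np.sqrt(noise_var))
    noisy_threshold = rbp_threshold + rng.normal(0.0, np.sqrt(noise_var))
    return noisy_affinity > noisy_threshold

rng = np.random.default_rng(0)
r = rate(np.array([1.0, 0.5]), np.array([0.8, -0.2]), 0.3, 0.1,
         0.0, 0.01, rng)
```

Inference then runs this step in reverse: given the observed boolean ratings, it infers the trait vectors, biases, and thresholds.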

This definition is a little different from the Recommender System example, even if the 'core' of the model remains the same. In the code you can see that I initialise RBPBiasMean, RBPBiasPrec, geneBiasMean and geneBiasPrec in order to avoid improper-distribution messages, and that I break the symmetry of the model by initialising RBPTraitsPriorPrec, RBPTraitsPriorMean, RBPbiasPriorMean, RBPbiasPriorPrec, geneTraitsPriorPrec, geneTraitsPriorMean, geneBiasPriorMean and geneBiasPriorPrec.

Also, since I use multivariate Gaussian distributions whose precisions are sampled from a Wishart, I changed the inference algorithm from EP to VMP.

Finally, in order to be able to make predictions on some of the matrix entries, I infer posteriors for all the prior variables (RBPTraitsPriorPrec, RBPTraitsPriorMean, RBPbiasPriorMean, RBPbiasPriorPrec, geneTraitsPriorPrec, geneTraitsPriorMean, geneBiasPriorMean, geneBiasPriorPrec), and these become the new priors for the prediction inference.

The problem is the following. When I run the model on real values, the prediction made after inference is the same for every entry of the matrix. In other words, whatever elements I use in the evaluation of the model (prediction phase), the Bernoulli value for each of them is the same (which doesn't happen if I don't introduce the new hierarchical layer). I suspect I'm doing something wrong, but I don't know where, and I can't understand which theoretical concepts I'm violating.

Also, unexpectedly, when the system infers the Bernoulli values for the prediction, the model iterates another time, that is:

Compiling model...done

Iterating:

.1

Compiling model...done

Iterating:

.1

This doesn't happen when I use the EP algorithm in the original model.

Can someone help me? I really need to make this system work, since it is part of my master's project.

Bests

Marco

• Edited by Sunday, June 30, 2013 8:46 AM
Sunday, June 30, 2013 8:45 AM

### All replies

• I tried your code with the following settings:

```
int numRBPs = 2;
int numGenes = 2;
int numTraits = 1;
...
// Observe training data
numObservations.ObservedValue = 4;
RBPData.ObservedValue = new int[] { 0, 0, 1, 1 };
geneData.ObservedValue = new int[] { 0, 1, 0, 1 };
ratingData.ObservedValue = new bool[] { true, false, true, false };
...
// Test data
numObservations.ObservedValue = 2;
RBPData.ObservedValue = new int[] { 0, 1 };
geneData.ObservedValue = new int[] { 0, 1 };
ratingData.ClearObservedValue();
```

and I got the following results:

 Bernoulli(0.9429)
 Bernoulli(0.05442)

These values are different.  So I'm not clear on what the problem is.

Thursday, July 4, 2013 4:59 PM
• Hi Marco,

The goal of such hierarchal priors is usually to learn community parameters. That is, commonly the priors are shared across users or items. So I was wondering what are you trying to achieve in your model?

Cheers,
Yordan

Friday, July 5, 2013 9:22 AM
• I'm really surprised by these results: trying with the data you posted, I get:

Bernoulli(0.5)

Bernoulli(0.5)

Did you make any other modifications to the code?

Friday, July 5, 2013 10:02 AM
• Hi Yordan

Exactly as you said. The goal of this hierarchical prior is to let the system learn different parameters for different users/items. In the original case, all the latent vectors are sampled from the same distribution (with the same mean and variance). This is somewhat reductive, because we assume that the prior over the latent features is the same for all users/items. Including a hierarchical prior lets the system differentiate the priors across users/items according to the data; in addition, it lets the system sample latent vectors from similar priors for similar users/items. That is exactly what I was trying to do. Since some of the items in my dataset (proteins, to be more clear) reach better results with certain trait prior parameters while others get worse results, adding a hierarchical prior should let the system learn the best parameters for each item. (Please correct me if I'm wrong.)

By the way, even with the data suggested in the previous post, I get the same error: the predicted values are always the same (at least on my computer). May I post the complete project as a .zip so that you can try the system completely?

Marco

Friday, July 5, 2013 10:44 AM
• To get my result, I deleted the line that set

engine.NumberOfIterations = iteration;

since you didn't specify what 'iteration' should be.  When I set this to 1, I get the 0.5 solution that you posted.  So this may be the problem.
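For what it's worth, flat 0.5 predictions at NumberOfIterations = 1 are what one would expect from a fixed-point message-passing scheme stopped after a single sweep: the messages are still near their uniform initialisation. A toy illustration (not Infer.NET, just an analogous pair of mean-field coordinate updates) of how far one sweep can be from the converged answer:

```python
# Mean-field updates for a 2-D Gaussian p(x, y) with precision [[1, a], [a, 1]]
# and linear term b on x: the optimal q(x), q(y) means satisfy
#   mx = b - a*my   and   my = -a*mx.
a, b = 0.9, 1.0
mx, my = 0.0, 0.0            # "uniform" initialisation
for sweep in range(200):
    mx = b - a * my          # update q(x) given current q(y)
    my = -a * mx             # update q(y) given updated q(x)
    if sweep == 0:
        after_one_sweep = (mx, my)

# One sweep gives (1.0, -0.9); the fixed point is mx = b/(1 - a**2) ≈ 5.26.
```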

Friday, July 5, 2013 11:50 AM
• Can you check which target .NET runtime you are using? (Go to project properties.) If it is set to '.NET Framework 4.5', can you try setting it to '.NET Framework 4.0 Client'?

John

Friday, July 5, 2013 12:07 PM
• Ok, using a higher number of iterations for the VMP algorithm, the results change (at least in the toy example) and I get different (and better) results, even if they differ from yours and are worse. The problem is that with a number of iterations higher than 1 I get the following error:

Improper distribution during inference (Gaussian(m/v=-0,3858, 1/v=-0,05673)).  Cannot perform inference on this model.

As suggested in other posts in this forum, I have tried to initialise the variables involved and to use broader priors, but the problem remains.

The reason for the iteration limit might be that the model needs more than one iteration in order to handle the dataset (the additional hierarchy level may require it). However, I don't know how to solve this new problem.

Any suggestions to solve it?

As for the target .NET runtime, I'm using .NET Framework 4.0.

Friday, July 5, 2013 1:47 PM
• An improper distribution should never arise in this model with VMP.  By following the instructions in Debugging inference you can get the Visual Studio debugger to show you which line in the generated code threw the exception.  Then you can work backward to find out where the improper distribution came from.
Friday, July 5, 2013 5:32 PM
• Thanks for your reply. I understand that, but I get this kind of error even when using the VMP algorithm. The code is exactly the one reported in the first post. With these parameters I get the following error:

Improper distribution during inference (Gaussian(m/v=-0,5189, 1/v=-0,02762)).  Cannot perform inference on this model.

at the 4th iteration of the algorithm. I have tried to use the Visual Studio debugger to understand where the problem comes from, and it seems to come from this line of the generated code:

// Message to 'vdouble46' from IsPositive factor
this.vdouble46_B[observation] = IsPositiveOp.XAverageLogarithm(this.RatingData[observation], vdouble46_F[observation], this.vdouble46_B[observation]);

Looking at the model graph, the message vdouble46 is the one that compares the (already noisy) affinity value with the rating value. Its definition and initialisation in the generated code are:

// Create array for replicates of 'vdouble46_B'
this.vdouble46_B = new DistributionStructArray<Gaussian,double>(this.NumObservations);

...

for(int observation = 0; observation<this.NumObservations; observation++) {
    this.vdouble46_B[observation] = Gaussian.Uniform();
}

Even following your advice to use the debugger, I'm not able to find the reason for this improper distribution, and I don't know how to track down its origin.

If you would like to try with the complete model and with the dataset who I'm working with, i can shared it here, in order to let you to see the exact error.

Marco

Monday, July 8, 2013 1:29 PM
• You can post here or send it to us via email at infersup@microsoft.com.
Monday, July 8, 2013 1:56 PM
• Thanks Tom for the information.

I just sent the project to the email address you wrote above, because I'm not able to post it here. You can just unzip the file and run the project to get the error.

Thank you again for your time

Marco

Monday, July 8, 2013 2:26 PM
• Thanks for the code.  This is a bug in Infer.NET, arising from the comparison of two random variables in the expression (noisyAffinity > noisyThresholds).  Since you have only one level, you can replace noisyThresholds with 0 (or any other constant) and then it works fine.
Monday, July 8, 2013 3:08 PM
• Ok, thanks. But using this solution, don't I lose the per-RBP threshold value (per-user, in the original case)? In other words, using only one value for the noisy thresholds, it is as if all users evaluated the films in the same way, without differentiating from user to user. Is this true, or is there something wrong in my reasoning?

Thanks a lot for your time. I really appreciate!

Monday, July 8, 2013 3:19 PM
• The bias should already handle this.  Looking at the model, RBPBias and RBPThresholds have the same effect.
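A quick way to see this: with Gaussian noise on both sides of the comparison, P(noisyAffinity > noisyThresholds) depends only on the difference affinity − threshold, so a per-RBP threshold can be absorbed into the per-RBP bias. A small numeric check (plain Python; the values are illustrative):

```python
import math

def p_rating(affinity, threshold, noise_var):
    # P(affinity + n1 > threshold + n2) with n1, n2 ~ N(0, noise_var) independent:
    # n1 - n2 is N(0, 2*noise_var), so this is a probit in (affinity - threshold).
    z = (affinity - threshold) / math.sqrt(2.0 * noise_var)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Moving the threshold into the bias leaves the rating probability unchanged:
p_with_threshold = p_rating(affinity=1.2, threshold=0.4, noise_var=1.0)
p_with_bias_shift = p_rating(affinity=1.2 - 0.4, threshold=0.0, noise_var=1.0)
```

So fixing the threshold at 0 and letting RBPBias shift per RBP expresses the same per-user behaviour.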
Monday, July 8, 2013 3:22 PM