John Guiver replied on 05-03-2011 10:37 AM
Hi Shengbo,
You will need to implement the shared variable pattern explicitly rather than use the shared variable wrapper classes. The general pattern for shared variables is described in http://research.microsoft.com/en-us/um/cambridge/projects/infernet/docs/Sharing%20variables%20between%20models.aspx.
To paraphrase:
- Run inference to convergence on one of the sub-models, say model/chunk A
- Extract the shared variable messages output from model/chunk A
- Initialise the next model, model/chunk B say, by providing as input a product of all the output messages from all models/chunks except B (along with the prior)
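For Gaussians, the product in the last step is easy to compute in natural parameters: a Gaussian with mean m and precision tau has natural parameters (tau*m, tau), and multiplying Gaussian factors just adds these. Here is a minimal numeric sketch of that product (Python used purely for illustration; the `product_except` helper is my own, not part of the Infer.NET API):

```python
def product_except(prior, outputs, skip):
    """Multiply the prior by all output messages except outputs[skip].
    Each distribution is a (mean_times_precision, precision) pair."""
    mp, prec = prior
    for i, (m_i, p_i) in enumerate(outputs):
        if i == skip:
            continue
        mp += m_i
        prec += p_i
    return (mp, prec)

# Natural parameters of the prior N(0.1, 1/0.2): (tau*m, tau) = (0.02, 0.2)
prior = (0.02, 0.2)
# Uniform messages have natural parameters (0, 0) and contribute nothing
outputs = [(0.0, 0.0), (0.0, 0.0), (0.0, 0.0)]
print(product_except(prior, outputs, 0))  # -> (0.02, 0.2), i.e. just the prior
```

On the first pass the other chunks' messages are uniform, so the input to the first chunk is simply the prior; on later passes the other chunks' evidence is folded in.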
Note that you want the output message rather than the marginal from the compiled algorithm (at least if you are running Expectation Propagation); to generate code for this output message, you need to set an Output attribute on the shared variable in your original code (see http://research.microsoft.com/en-us/um/cambridge/projects/infernet/docs/Adding%20attributes%20to%20your%20model.aspx).
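If the output message is not available directly, it can be recovered by dividing the marginal by the message going into the variable, which is what the non-EP branch of the code below does with SetToRatio. In natural parameters, dividing one Gaussian by another is just a subtraction. A small numeric sketch (Python for illustration only; the `ratio` helper and the numbers are hypothetical):

```python
def ratio(marginal, incoming):
    """Divide one Gaussian by another in natural parameters
    (mean_times_precision, precision): division is a subtraction."""
    return (marginal[0] - incoming[0], marginal[1] - incoming[1])

marginal = (2.5, 3.0)  # hypothetical posterior natural parameters for a chunk
incoming = (0.5, 1.0)  # prior times the other chunks' output messages
print(ratio(marginal, incoming))  # -> (2.0, 2.0), the chunk's own contribution
```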
There are several tricky aspects to this which govern how you should define your model. I think the first thing is to understand how a simple scalar example works, so I give code below for learning a Gaussian. Note that I am not using precompiled code in this example; its only purpose is to show you the shared variable pattern. The extension from scalar to array follows exactly the same pattern but is a bit more tricky syntactically; I can help you there in a follow-up post if necessary, depending on your needs.
John
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using MicrosoftResearch.Infer.Models;
using MicrosoftResearch.Infer.Distributions;
using MicrosoftResearch.Infer;

namespace sharedvar_with_precompiled
{
    class Program
    {
        static void Main(string[] args)
        {
            var numData = Variable.New<int>();
            Range n = new Range(numData);
            // Input message for the shared variable; marked with an Output
            // attribute so the output message is also generated
            var input = Variable.New<Gaussian>();
            var a = Variable.Random<double, Gaussian>(input).Attrib(new Output());
            var d = Variable.Array<double>(n);
            d[n] = Variable.GaussianFromMeanAndPrecision(a, 1.0).ForEach(n);
            var engine = new InferenceEngine();

            // Prior
            var Prior = Gaussian.FromMeanAndPrecision(0.1, 0.2);

            // Data chunks
            double[][] data = new double[][]
            {
                new double[] { 1.0, 2.0, 3.0 },
                new double[] { 1.5, 2.5 },
                new double[] { 2.1, 2.2, 2.3 }
            };
            int numChunks = data.Length;

            // Start with uniform output messages
            Gaussian[] outputs = new Gaussian[numChunks];
            for (int i = 0; i < numChunks; i++) outputs[i] = Gaussian.Uniform();

            int numPasses = 2;
            for (int p = 0; p < numPasses; p++)
            {
                for (int i = 0; i < numChunks; i++)
                {
                    // Multiply the prior with all output messages from other chunks
                    Gaussian inputValue = (Gaussian)Prior.Clone();
                    inputValue = Distribution.SetToProductWithAllExcept(inputValue, outputs, i);
                    input.ObservedValue = inputValue;
                    d.ObservedValue = data[i];
                    numData.ObservedValue = data[i].Length;
                    if (engine.Algorithm is ExpectationPropagation)
                        outputs[i] = engine.GetOutputMessage<Gaussian>(a);
                    else
                    {
                        // Recover the output message as marginal / input message
                        outputs[i] = engine.Infer<Gaussian>(a);
                        outputs[i].SetToRatio(outputs[i], inputValue);
                    }
                }
            }
            Console.WriteLine("Posterior = {0}", engine.Infer<Gaussian>(a));
        }
    }
}