force sparsity in jagged array

  • Question

  • Hi

    I have to put a constraint on my 2d jagged array, so that it remains sparse during the inference process.

    I just used a Laplace prior for my 2d jagged array like this:

    Gamma variancePrior = Gamma.FromShapeAndRate(1.0, 1.0);
    tdjaggedArray[iRange][jRange] =
        Variable.GaussianFromMeanAndVariance(0, Variable.Random(variancePrior)).ForEach(iRange, jRange);

    but it is not working. The resulting 2d jagged array is not sparse.
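    As a quick sanity check of the prior itself (in plain Python rather than Infer.NET, just to illustrate the distribution): a zero-mean Gaussian whose variance is drawn from Gamma(1, 1), i.e. Exponential(1), does have a Laplace marginal, the standard scale-mixture construction used in the Bayesian lasso. The sample size and the printed quantities below are illustrative only.

```python
import math
import random

# Scale-mixture sampling: variance v ~ Gamma(shape=1, rate=1) = Exponential(1),
# then w | v ~ Normal(0, v).  The marginal of w is a Laplace distribution with
# rate sqrt(2), so the Gaussian-with-Gamma(1,1)-variance prior is a genuine
# sparsity-encouraging (Laplace) prior.
random.seed(0)
n = 200_000
samples = []
for _ in range(n):
    v = random.expovariate(1.0)              # Gamma(1, 1) variance draw
    samples.append(random.gauss(0.0, math.sqrt(v)))

mean_abs = sum(abs(w) for w in samples) / n
print(mean_abs)  # for Laplace with rate sqrt(2), E|w| = 1/sqrt(2) ~ 0.707

m2 = sum(w * w for w in samples) / n
m4 = sum(w ** 4 for w in samples) / n
print(m4 / m2 ** 2 - 3.0)  # excess kurtosis: ~3 for Laplace, 0 for a Gaussian
```

    Note that a Laplace prior only shrinks weights toward zero; the posterior means are rarely exactly zero, which is relevant to the discussion below.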

    Is there any way of enforcing this?

    I thought, for example, I could count the non-zero elements and put a threshold on this count (though I have no idea how to implement this either).

    Is there any way at all to enforce this sparsity condition?

    THANKS

    Tuesday, September 2, 2014 1:37 PM


All replies

  • Sparsity during inference is only supported for Discrete and Dirichlet distributions at the moment.  See "Adding attributes to your model".  Why do you need sparsity during inference?  Is it only because you want things to run faster?
    Tuesday, September 2, 2014 4:05 PM
    Owner
  • Actually I have this 2d jagged array which weights some attributes for each entity in my model, but I don't know how many attributes each entity has. I don't have any prior information about the link between attributes and entities. So I use this 2d array as weights which determine which attribute has an effect on which entity, and what the weight of this connection is.

    The only information that I have is that there are not many connections. If we consider them as a graph, it would be a sparse graph. Currently the weight is somehow divided over many edges, but I thought maybe I could add some constraint expressing that I don't want many non-zero elements in my 2d jagged array.

    I put this double-exponential prior on w so that it pushes w to take values near zero in most cases. Do you think this is a solution for the problem?
    • Edited by Capli19 Tuesday, September 2, 2014 4:29 PM
    Tuesday, September 2, 2014 4:28 PM
  • From your description, you do not want sparsity during inference, you just want a prior that encourages sparsity.  The code you posted already does this.  If the results are not good, then there is probably some other issue in the model, such as symmetries or a lot of uncertainty, which leads to the non-sparse results.
    • Marked as answer by Capli19 Tuesday, September 2, 2014 5:32 PM
    Tuesday, September 2, 2014 5:19 PM
    Owner
  • Hi Tom. I still have this problem with the sparsity of the model. I need a model selector which chooses a sparse set of candidate explanatory variables (for example I have Yi = f(Xij), e.g. Yi = Xij.Wj + error). For each Yi there are many Xij variables in the model. In the posterior distribution I need many of the Xij's to be zero. If I had a Laplace prior on the Xs that would be the case, but using a prior as above, in the posterior I get many Ws which are not zero. I need instead a few Ws with high values and the rest of the Ws very close to zero or exactly zero.

    How can I overcome this? Do you think implementing a Laplace distribution would help in this case? That is, I put a Laplace prior on w, and therefore get a really sparse posterior?

    Monday, September 29, 2014 3:16 PM
  • If the posterior distribution says that W is not close to zero, why would you want to force some of them to be zero?  Either you don't believe in the model you have chosen or you don't trust the results of inference.  Which is it?
    Monday, September 29, 2014 3:36 PM
    Owner
  • Thanks for your reply, and excuse me for replying late. I "know" that my model should be sparse; I have generated the data myself. I need to infer edges in a sparse network (a gene or protein interaction network, for example).

    Each node's value should depend on the values of a few other nodes. Instead of getting a few strong links, I get many weak links (or sometimes many strong links multiplied by weak coefficients).

    I read this paper (particularly the section about sparseness):

    "A Review of Bayesian Variable Selection Methods: What, How and Which" by R. B. O'Hara and M. J. Sillanpää

    Here in this paper, I could see methods to enforce these values to be sparse.

    The Laplace shrinkage method does not work for me here, I think because I do not have a true Laplace distribution (am I right?).

    I think my problem is that I have a Gaussian prior (although it is a Gaussian with a Gamma-distributed variance, which should make a Laplace).

    I thought maybe I can use the "Stochastic search variable selection" method mentioned in the paper.

    I still have no clear idea. Maybe I can use a binomially distributed variable A, for example, and choose between a spike and a slab distribution for each variable (for each edge weight).

    Then when the spike has the higher probability, I just consider that this edge does not exist in the final network.
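    The spike-and-slab idea described above can be illustrated with a small simulation (plain Python, not Infer.NET; the inclusion probability 0.1 and the spike/slab widths are made-up values, not from this thread): a Bernoulli indicator per weight selects between a narrow "spike" near zero and a broad "slab", so most draws from the prior are effectively zero.

```python
import random

# Spike-and-slab prior sketch: a per-weight Bernoulli indicator selects either
# a broad "slab" Gaussian (edge present) or a narrow "spike" Gaussian near
# zero (edge absent).  The values here (inclusion prob. 0.1, slab sd 1.0,
# spike sd 0.01) are illustrative only.
random.seed(1)
n = 100_000
p_include = 0.1

weights = []
for _ in range(n):
    if random.random() < p_include:
        weights.append(random.gauss(0.0, 1.0))   # slab: edge exists
    else:
        weights.append(random.gauss(0.0, 0.01))  # spike: edge absent

frac_near_zero = sum(abs(w) < 0.05 for w in weights) / n
print(frac_near_zero)  # roughly 0.9: most weights are pinned near zero a priori
```

    In Infer.NET, a mixture like this is typically expressed with gates (Variable.If / Variable.IfNot blocks around the two alternatives), and the posterior over each indicator then tells you whether the edge is present.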

    I would be thankful if you let me know what you think about this.
    Friday, October 3, 2014 1:34 PM
  • It is beyond the scope of this forum to tell you what model you should use.  Infer.NET can handle a variety of sparsity priors including Laplace distribution (as you have done) or spike-and-slab.
    • Marked as answer by Capli19 Friday, October 3, 2014 2:11 PM
    Friday, October 3, 2014 1:49 PM
    Owner
  • OK thank you very much :)
    • Marked as answer by Capli19 Friday, October 3, 2014 2:12 PM
    Friday, October 3, 2014 2:07 PM
  • Tom, can you please suggest a forum which is suitable for posting this question? I posted on http://stats.stackexchange.com but for a long time I got no reply. Or can you point me to another resource from which I can find out what I should do? Thanks a lot.
    Thursday, October 9, 2014 8:34 PM