Problem running (chain) model on 2.6; previously worked on 2.5

  • Question

  • When running a previously working model from 2.5, I'm seeing the following error. Any pointers as to how to resolve this? I'm not sure what the error means, as I'm simply defining a Poisson distribution with a Gamma prior.

    The model below is also a chain model, where I previously defined the VariableArrays at each step separately with a for loop. I'm happy to see the new support for more efficient inference in chain/grid models, but I have a couple of questions:

    • What do we need to do specifically to set up this more efficient inference? Is it just a matter of using a 2D VariableArray plus `range.AddAttribute(new Sequential())`?
    • Normally when we set up these models, not every data item has observations at every step of the model. It seems that in the ChessAnalysis example (and in my previous implementation) the unobserved parts of the 2D VariableArray are just filled with meaningless values. Does this slow down inference, and is there any way to get around it? It seems like data with a long tail would run slower, since just a few observations have many steps. (When I tried removing the filler values, I got errors about variables in certain ranges being undefined.)

    Thanks,

    Andrew

    EDIT: I just discovered the new post on offset indexing in the documentation, for the first point above.
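    For anyone landing here later, here's a minimal sketch of the chain setup that the offset-indexing post describes, as far as I understand it (all names and constants here are hypothetical, not from my actual model):

    ```csharp
    using MicrosoftResearch.Infer;
    using MicrosoftResearch.Infer.Models;

    // A chain over time steps. The Sequential attribute tells the scheduler
    // to sweep messages forward and backward along the range, which is what
    // makes chain/grid inference efficient in 2.6.
    Range step = new Range(100);
    step.AddAttribute(new Sequential());

    VariableArray<double> state = Variable.Array<double>(step);
    using (ForEachBlock block = Variable.ForEach(step))
    {
        var t = block.Index;
        using (Variable.If(t == 0))
        {
            // broad prior on the initial state
            state[t] = Variable.GaussianFromMeanAndVariance(0, 100);
        }
        using (Variable.If(t > 0))
        {
            // offset indexing: the state at t depends on the state at t - 1
            state[t] = Variable.GaussianFromMeanAndVariance(state[t - 1], 1);
        }
    }
    ```

    So to answer my own first question: it's the offset indexing inside the ForEach block plus the Sequential attribute on the range, rather than anything special about the 2D VariableArray itself.
    
    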

    • Edited by Andrew Mao Tuesday, December 9, 2014 8:25 PM
    Tuesday, December 9, 2014 6:15 PM

All replies

  • The error message and model aren't showing up in your post.  As for the unused parts of arrays: they don't affect the runtime, since they are never accessed (except during initialization), but they do affect the memory footprint.  You can always rewrite the model to have no unused elements; it's just a matter of how clean you want the code to look.
    Wednesday, December 10, 2014 12:10 PM
    Owner
  • That's weird, I uploaded the picture through the forum and it shows up in my browser. Perhaps it was a temporary hiccup from the server that serves the images.

    The error states the following:

    GenericArguments[0], 'MicrosoftResearch.Infer.Distributions.Poisson', on 'TDist SetPoint[TDist,T](TDist, T)' violates the constraint of type 'TDist'.

    This might not be that important, as I'm in the process of converting this model to the newer format with offset indexing.

    Regarding your last point: does using jagged VariableArrays still allow for efficient message passing/inference, and could you provide an example? A common case in my data is an average observation length of 10 values, while a very small fraction of items go up to 100 or 1000. A 2D VariableArray would therefore use almost 10 or 100 times as much memory as necessary. Running this model previously took over 8 GB of RAM, so optimizing that part could reduce memory use significantly.
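    In case it helps to make the question concrete, here's roughly what I have in mind, following the jagged-array pattern from the documentation; the Gamma-Poisson structure matches my model, but the data and names are made up for illustration:

    ```csharp
    using MicrosoftResearch.Infer;
    using MicrosoftResearch.Infer.Models;
    using MicrosoftResearch.Infer.Distributions;

    // Hypothetical observed counts: mostly short items, one longer one.
    int[][] data = new int[][] { new[] { 3, 1, 4 }, new[] { 2 }, new[] { 5, 0, 2, 7, 1 } };
    int[] lengths = new int[data.Length];
    for (int i = 0; i < data.Length; i++) lengths[i] = data[i].Length;

    Range item = new Range(data.Length);
    VariableArray<int> length = Variable.Constant(lengths, item);
    // Jagged: the inner range's size varies per item, so no filler values.
    Range obs = new Range(length[item]);

    // Gamma-Poisson: one latent rate per item, Poisson counts per observation.
    VariableArray<double> rate = Variable.Array<double>(item);
    var counts = Variable.Array(Variable.Array<int>(obs), item);
    using (Variable.ForEach(item))
    {
        rate[item] = Variable.GammaFromShapeAndRate(1, 1);
        using (Variable.ForEach(obs))
        {
            counts[item][obs] = Variable.Poisson(rate[item]);
        }
    }
    counts.ObservedValue = data;

    InferenceEngine engine = new InferenceEngine();
    Gamma[] posterior = engine.Infer<Gamma[]>(rate);
    ```

    My assumption is that `Sequential` can also be attached to the inner range when the steps form a chain, but that's exactly the part I'd like confirmed.
    
    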

    Wednesday, December 10, 2014 11:48 PM