Different behavior when running in debugger

    Question

  • I am running an inference that always fails when I run it outside the debugger but always succeeds when I run it in the debugger.  I can fix it by broadening my priors, but I'm curious why this happens.  The error I get is:

    "Improper distribution during inference (Gaussian(m/v=-3.145e+05, 1/v=-296.5)).  Cannot perform inference on this model."


    Oopsnut
    Friday, January 20, 2012 6:15 AM

All replies

  • Normally C# code should give the same answer whether you are in the debugger or not, but sometimes the answers change if you are using multiple threads and the code is not thread-safe.
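    For illustration only (a generic C# sketch, not the poster's Infer.NET model): a data race on shared state can make a program's result depend on thread timing, and attaching the debugger often changes that timing enough to hide the problem.

        using System;
        using System.Threading.Tasks;

        class RaceExample
        {
            // Shared state updated by both tasks without any synchronization.
            static double sum;

            static void Main()
            {
                sum = 0;
                // "sum += ..." is not atomic: read-modify-write steps from the two
                // tasks can interleave, so updates are lost and the total varies.
                var t1 = Task.Run(() => { for (int i = 0; i < 1000000; i++) sum += 1e-6; });
                var t2 = Task.Run(() => { for (int i = 0; i < 1000000; i++) sum += 1e-6; });
                Task.WaitAll(t1, t2);
                Console.WriteLine(sum);  // rarely exactly 2.0 when the race occurs
            }
        }

    Under the debugger the threads typically run slowly enough (or effectively serialized) that the race rarely shows up, which matches the "succeeds in the debugger, fails outside it" symptom.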
    Friday, January 20, 2012 4:32 PM