SCHELLING COORDINATION GAME (from Anglican web site)

  • Question

  • I'm trying to solve another Anglican problem:

    I came up with this:

    let bob = Variable.Bernoulli(0.6) // pub=true
    let amy = Variable.Bernoulli(0.6)
    Variable.ConstrainEqual(bob, amy)
    let ie = new InferenceEngine()
    printfn "bob=%A - amy=%A" (ie.Infer(bob)) (ie.Infer(amy))

    This gives as a result:
    bob=Bernoulli(0.6923) - amy=Bernoulli(0.6923)
    whereas I expected: Bernoulli(0.52)

    Then, to check my understanding of how the library works, I tested this small program:

    let bob=Variable.Bernoulli(0.6) //pub=true
    let amy=Variable.Bernoulli(0.6)
    let ie=new InferenceEngine()
    printfn "bob=%A" (ie.Infer(bob==false &&& amy==false))

    and I've got: Bernoulli(0.76)
    whereas I expected: Bernoulli(0.16)

    Clearly there's something I'm not getting right.
    Any suggestion would be appreciated.
    Thank You
    Wednesday, October 4, 2017 4:30 PM

All replies

  • In the first case, there are only two possibilities: (T,T) and (F,F).  These have prior probability 0.6^2 and 0.4^2.  Normalizing to get the posterior gives 0.6^2/(0.6^2 + 0.4^2) = 0.6923.

    For the second case, I do get Bernoulli(0.16).
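    The arithmetic above can be checked directly; a minimal sketch in plain Python, independent of Infer.NET:

```python
# Prior: Bob and Amy each go to the pub with probability 0.6, independently.
p = 0.6

# Case 1: constrained to bob == amy, only (T,T) and (F,F) survive.
# Normalizing the prior mass of those two worlds gives the posterior.
posterior_pub = p**2 / (p**2 + (1 - p)**2)
print(round(posterior_pub, 4))  # 0.6923

# Case 2: unconstrained query P(bob = false AND amy = false).
p_both_false = (1 - p) ** 2
print(round(p_both_false, 2))   # 0.16
```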

    Wednesday, October 4, 2017 5:18 PM
  • This means that when I write Variable.ConstrainEqual(bob, amy) I'm constraining the "world".
    In other words, when I infer the probability of an event, this probability is normalized with respect to this "constrained world".
    Is this assumption correct?

    Regarding the second point, I've done many tests, and indeed I'm still getting Bernoulli(0.76).

     I suspect this is due to some wrong overload of the == operator.

    This is the environment I've tested with:
    - Infer.Net 2.6.41114.1
    1. VS2017
        - .NET Framework 4.6.1
        - Target F# Runtime:
    2. VS2015
        - .NET Framework 4.5.2
        - Target F# Runtime:

    Once again thank You for Your time.

    Wednesday, October 4, 2017 7:31 PM
  • if you let 

    meet = bob == amy;

    then ie.Infer(meet) gives you Bernoulli(0.52)

    This is Pr(Bob meets Amy), whereas I think your first approach calculates Pr(Bob/Amy at pub | Bob meets Amy).
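    Checking this by hand (plain Python, assuming the same 0.6 prior as above):

```python
p = 0.6
# Pr(meet): both at the pub or both at home
p_meet = p**2 + (1 - p)**2
print(round(p_meet, 2))         # 0.52
# Pr(Bob at pub | meet): among the matching worlds, only (T,T) has Bob at the pub
print(round(p**2 / p_meet, 4))  # 0.6923
```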
    • Edited by cyentist Wednesday, October 4, 2017 7:50 PM
    Wednesday, October 4, 2017 7:46 PM
  • Just FYI: the same code in C# (same VS2017 setup as for F#) gives the correct result of Bernoulli(0.16).
    Wednesday, October 4, 2017 7:51 PM
  • @cyentist, Thank You.

    It works, and it seems to confirm that the Variable.Constrain(...) statement is "reducing" the total "world" over which the 100% probability is distributed.

    Am I correct?
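    That mental model can be sketched outside Infer.NET: enumerate the four (bob, amy) worlds, drop the ones that violate the constraint, and renormalize the remaining mass. A toy sketch in plain Python (not using the library):

```python
from itertools import product

p = 0.6
# Joint prior over the four (bob, amy) worlds.
worlds = {(b, a): (p if b else 1 - p) * (p if a else 1 - p)
          for b, a in product([True, False], repeat=2)}

# Variable.ConstrainEqual-style constraint: keep only worlds where bob == amy.
constrained = {w: pr for w, pr in worlds.items() if w[0] == w[1]}
z = sum(constrained.values())        # total mass of the reduced "world"
print(round(z, 2))                   # 0.52

# Posterior P(bob = true), renormalized within the constrained world.
posterior = sum(pr for (b, _), pr in constrained.items() if b) / z
print(round(posterior, 4))           # 0.6923
```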


    Wednesday, October 4, 2017 7:59 PM
  • I found the problem in my code:
    I need parentheses around the conditions, like this:
    (bob==false) &&& (amy==false)
    I don't know whether this conforms to the language specification or not (in F#, &&& binds more tightly than ==, so without parentheses the expression is parsed differently).
    Nevertheless, to me it was confusing (in a deterministic program this is an easy bug to find, but in a probabilistic program it can become quite difficult to track down).

    Here is the code:

    let bob = Variable.Bernoulli(0.6) // pub=true
    let amy = Variable.Bernoulli(0.6)
    let ie = new InferenceEngine()
    // without parentheses, &&& binds more tightly than ==, giving the wrong result
    printfn "bob=%A" (ie.Infer(bob==false &&& amy==false))
    // this works:
    printfn "bob=%A" (ie.Infer( (bob==false) &&& (amy==false) ))

    Thursday, October 5, 2017 7:45 AM
  • Thanks, I didn't know about that oddity of F#.  Your understanding of Variable.Constrain is correct.
    Thursday, October 5, 2017 12:04 PM
  • @Alessandro Rizzotto Have you tried to go further and implement the model with recursion and false beliefs? 
    Sunday, October 8, 2017 7:51 AM
  • @usptact, I've not had time to investigate further yet.
    I'm still familiarizing myself with the library, and I'm not quite sure whether recursion is supported (recursion is obviously supported in F#, but I wonder whether I can use a probability distribution inside my recursive function call as in Anglican).
    I'm quite interested in exploring this possibility further.
    Do You have any experience in this respect?
    Monday, October 9, 2017 7:37 PM
  • @cyentist, finally, after playing a little with the library, I realized what You pointed out:
    a constraint like Variable.ConstrainEqual(...) is just conditioning in the Bayesian sense (whereas before I thought that the only way to "condition" probabilities was OBSERVING a variable).
    Thank You
    Monday, October 9, 2017 7:57 PM
  • I am afraid that I am not experienced enough with Infer.NET and these models. I am still learning the applications of the library in parallel with my studies of graphical models. In the relatively brief moments free of work, I explore various models out of curiosity. My ultimate goal is to understand probabilistic programming by example and to be able to apply it to my problems (Natural Language Processing now, and other applications perhaps later).
    Tuesday, October 10, 2017 12:18 AM