HPC Pack 2012 R2 - Unable to run MPI jobs across nodes

  • Question

  • I have a 4-node HPC cluster set up, and we are unable to run MPI jobs across multiple nodes. When we try, the job just hangs at 0% progress.

    Any ideas on how to troubleshoot would be great.  Thanks.

    Thursday, April 28, 2016 8:30 PM

All replies

  • Have you tried the built-in MPI ping-pong? You can run the MPI Ping-Pong diagnostic test from the admin console.

    What is your task command line?
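
    If the ping-pong diagnostic also hangs, a minimal MPI program can help confirm whether cross-node communication works at all. The sketch below is a generic MPI test, not anything specific to your job; it assumes the MS-MPI SDK is available for compiling, and that you submit the resulting executable with mpiexec as the task command line while requesting more than one node:

        #include <stdio.h>
        #include <mpi.h>

        int main(int argc, char *argv[])
        {
            int rank, size, name_len;
            char name[MPI_MAX_PROCESSOR_NAME];

            MPI_Init(&argc, &argv);
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);
            MPI_Get_processor_name(name, &name_len);

            /* Each rank reports the node it landed on; if ranks on remote
               nodes never print, the processes are not even starting there. */
            printf("Rank %d of %d running on %s\n", rank, size, name);
            fflush(stdout);

            /* The barrier forces actual cross-node communication. */
            MPI_Barrier(MPI_COMM_WORLD);

            if (rank == 0)
                printf("Barrier completed: cross-node MPI communication works.\n");

            MPI_Finalize();
            return 0;
        }

    If every rank prints its node name but the barrier line never appears, the processes start but cannot talk to each other across nodes, which often points at firewall rules between the compute nodes or at which network the MPI traffic is using.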


    Qiufang Shi

    Friday, April 29, 2016 12:41 AM