How to queue a serial job using cluster manager

  • Question

  • Hi All,

         I am a new user of Microsoft HPC, and I also have to maintain the server myself.  Currently I want to automate my Computational Fluid Dynamics (CFD) jobs using PowerShell scripts.  My scenario is this: I have only one license for my CFD software, so when I run a preprocessing job I can use only one of my 96 cores.  The problem is that when I run my batch file, the script keeps running after it submits the job to the cluster manager.  This causes a problem because I need the result of the computation as input for my next step, such as copying the result back.  Please help.

    Rgds,

    Leong

    Thursday, November 18, 2010 8:09 AM

Answers

All replies

  • Hi Leong,

    From your description I understand that you want your script to wait for a job to finish before going on to the next step. I think there are several possible solutions:

    1. You can monitor the job's state with a PowerShell script or a small application written using the Job Scheduler API. The script or application exits when the job reaches a final state.

    2. You can convert your separate preprocessing job and the jobs submitted after it finishes into tasks within a single job. You can then set dependencies between the tasks so that they execute in the correct order.
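    For option 1, a minimal polling sketch using the HPC Pack 2008 R2 PowerShell snap-in might look like the following. The scheduler name "headnode" and the `$jobId` variable are placeholders; the loop simply waits until the job reaches one of the final states before the script continues:

    ```powershell
    # Sketch: wait for an HPC job to finish before continuing.
    # Assumes the Microsoft.HPC snap-in is installed on this machine
    # and $jobId holds the ID returned when the job was submitted.
    Add-PSSnapin Microsoft.HPC

    $finalStates = @("Finished", "Failed", "Canceled")
    $job = Get-HpcJob -Id $jobId -Scheduler "headnode"

    while ($finalStates -notcontains $job.State) {
        Start-Sleep -Seconds 30                            # poll every 30 s
        $job = Get-HpcJob -Id $jobId -Scheduler "headnode" # refresh job state
    }

    # At this point the job is done; it is now safe to copy results back,
    # e.g. Copy-Item \\headnode\results\* C:\local\results\
    ```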

    Please let me know if I correctly understood your issue and if you have any further questions.

    Thank you,
    Łukasz

    Thursday, November 18, 2010 9:37 PM
  • Hi Lukasz,

               Thank you for replying.  Yes.  You are right.

    1. You can monitor the job's state with a PowerShell script or a small application written using the Job Scheduler API. The script or application exits when the job reaches a final state.

      I am learning PowerShell and VB at the moment so that I can manipulate job scheduling with PowerShell or VB.

     

    2. You can convert your separate preprocessing job and the jobs submitted after it finishes into tasks within a single job. You can then set dependencies between the tasks so that they execute in the correct order.

        At the moment I combine my whole work process into a single script, but it would be ideal if I could simply send a single job (the complete process) to my cluster once my initial preparation is done.  What I am trying to say is: suppose I have ten designs to validate.  I would prefer to send the models to the server one by one, so that a job can start running as soon as I finish preparing the first one.

        I am also aware of the dependencies feature.  Would you mind telling me how to set a dependency on my job in Microsoft Windows HPC 2008 R2 Cluster Manager?

     

    Rgds,

    Leong

    Friday, November 19, 2010 1:55 AM
  •     I am also aware of the dependencies feature.  Would you mind telling me how to set a dependency on my job in Microsoft Windows HPC 2008 R2 Cluster Manager?

    Here is a short guide:

    http://technet.microsoft.com/en-us/library/cc972816(WS.10).aspx
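    In addition to the Cluster Manager GUI described there, the same thing can be sketched with the HPC Pack PowerShell cmdlets. Everything here except the cmdlet names (task names, command lines, the "headnode" scheduler) is an illustrative placeholder; the `-DependsOn` parameter is what makes each task wait for the named task before it:

    ```powershell
    # Sketch: one job whose tasks run in order via dependencies.
    Add-PSSnapin Microsoft.HPC

    $job = New-HpcJob -Name "CFD run" -Scheduler "headnode"

    # Preprocessing is serial (single license), so request one core.
    Add-HpcTask -Job $job -Name "Preprocess" -NumCores 1 `
        -CommandLine "preprocess.exe model.cas"

    # The solve step starts only after "Preprocess" has finished.
    Add-HpcTask -Job $job -Name "Solve" -DependsOn "Preprocess" `
        -CommandLine "solver.exe model.cas"

    # Copy results back only after "Solve" has finished.
    Add-HpcTask -Job $job -Name "CopyBack" -DependsOn "Solve" `
        -CommandLine "xcopy /y results\* \\fileserver\out\"

    Submit-HpcJob -Job $job -Scheduler "headnode"
    ```

    With this layout you could prepare and submit one such job per design, and each job's own tasks would still run in the correct order.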

    Let me know if you have any other questions.

    Regards,
    Łukasz

    Friday, November 19, 2010 2:43 PM
  • Hi Lukasz,

                Thank you very much.  Much appreciated!

     

    Rgds,

    Leong

    Tuesday, November 23, 2010 1:16 AM