Submitting Jobs with file paths in command string

  • Question

  • Hi all. I have briefly searched this forum for a similar question but haven't found anything that answers mine. I could be using the wrong vocabulary/jargon and have missed a thread that already asked this; if so, I apologize.

    We are relatively new to the Windows HPC environment, so this is a noob question. I have a program that I wish to run on our worker nodes, and I have already confirmed that the EXE can be run from the command line on a worker node with the executable residing in a shared folder off the head node.

    Problem: The program needs a filename fed to it as a parameter, an 'input file' that it needs to process. The input files are not named in a fashion that would allow me to use a parametric sweep job, and they cannot be renamed to allow that. What is the best/easiest way to do this in Windows HPC?

    Current solution: Use PowerShell to get a list of the files from the network share on the NAS system, then, for each file, create a new HPC job, build the correct command string with the path of the file, and submit it (roughly as sketched below).

    I just wanted to ask whether there is an easier or better-practice way to do this before investing the time to learn PowerShell. Thanks.
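
    A minimal sketch of what I have in mind (untested; the share path, executable path, and snap-in name are placeholders, not details from our actual setup):

    # One job (and one task) per input file -- the approach described above
    Add-PSSnapin Microsoft.HPC                  # HPC Pack cmdlets (Import-Module Microsoft.Hpc on newer versions)

    $share = "\\nas\inputs"                     # hypothetical input share on the NAS
    $exe   = "\\headnode\apps\app.exe"          # hypothetical path to the program

    Get-ChildItem -Path $share | ForEach-Object {
        # Build a job whose single task gets this file's path on the command line
        $job = New-HpcJob -Name ("Process " + $_.Name)
        Add-HpcTask -Job $job -CommandLine "$exe `"$($_.FullName)`"" | Out-Null
        Submit-HpcJob -Job $job
    }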


    Tuesday, December 11, 2012 5:51 PM

Answers

  • How about:

    Create a new job:

    job new        (outputs: <jobId>)

    Add tasks with the input/output files as parameters:

    job add <jobId> [/stdin:path\input_file1.dat] [/stdout:path\output_file1.dat] app.exe

    Run the job on the cluster:

    job submit /id:<jobId>

    If a network path doesn't work for your program, copy the input files to your worker nodes:

    xcopy <from> <to>
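
    For example, the whole sequence could be driven from PowerShell in one go (untested sketch; the share path and the parsing of the job ID from the "job new" output are assumptions):

    $share = "\\nas\inputs"                      # hypothetical input share

    # Create the job once; "job new" prints the new job's ID
    $created = job new
    $jobId   = [regex]::Match(($created -join ' '), '\d+').Value

    # Add one task per input file, passing the file path on the command line
    Get-ChildItem -Path $share | ForEach-Object {
        job add $jobId app.exe $_.FullName
    }

    # Submit once; all tasks run on the cluster
    job submit /id:$jobId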

    


    Daniel Drypczewski


    Wednesday, December 12, 2012 8:29 AM

All replies

  • Sorry, I have been sidetracked on other projects.

    Daniel, do you mean that I can create a job in the Job Manager GUI that reads the standard input stream for parameters and puts those parameters into the command string dynamically? Then, when I run this job on a folder of files with PowerShell, all PowerShell has to do is call the correct job ID and give it my parameters, instead of creating a whole new job and task for each individual file. Am I reading your response right?

    Again sorry, very new to this environment, and thanks for the reply.

    Friday, December 14, 2012 8:38 PM
    That was an example of submitting an HPC job from the Windows console. In PowerShell you need to play with New-HpcJob, Get-HpcJob, Add-HpcTask, and Submit-HpcJob. Basically, you create a new job only once and then dynamically add your tasks with different input parameters. Once you have finished setting up, just submit your job once and all the tasks will be executed on the cluster (see the sketch below).

    Later on, when the contents of your input files change, you can clone your job and re-execute it without repeating the step of adding your tasks.
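
    For instance, along these lines (untested sketch; the share path, executable path, and snap-in name are placeholders for illustration):

    Add-PSSnapin Microsoft.HPC                   # HPC Pack cmdlets (Import-Module Microsoft.Hpc on newer versions)

    $share = "\\nas\inputs"                      # hypothetical input share
    $exe   = "\\headnode\apps\app.exe"           # hypothetical executable path

    # Create the job once...
    $job = New-HpcJob -Name "Process input files"

    # ...add one task per input file...
    Get-ChildItem -Path $share | ForEach-Object {
        Add-HpcTask -Job $job -CommandLine "$exe `"$($_.FullName)`"" | Out-Null
    }

    # ...then submit once; every task runs on the cluster
    Submit-HpcJob -Job $job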


    Daniel Drypczewski

    Tuesday, December 18, 2012 3:47 AM
  • Ok, I got a script working, and it was far easier than I was expecting. I am still looking for how to specify the resource type in PS, though. In the Job Management GUI, when you create a new job, you can specify whether the job needs only a core, a socket, or a whole node. Anyone know how to set the job to Core in PS? The MSDN documentation for the HPC cmdlets hasn't been much help.
    Wednesday, December 19, 2012 2:29 PM