.NET Application not using all Nodes/Cores

  • Question

  • We have a .NET application written in C# that uses threading to data-mine through thousands of files. This works on a single machine, but it was maxing out the CPU, so we decided to try an HPC solution.

    Now, when the job is submitted through the head node, the scheduler picks a single node and runs the job exclusively on that machine. We would like it to use all the available cores, including those on the head node. I have tried setting the minimum and maximum cores, at both the job level and the task level, to the maximum number of cores we have available (24), but this has no effect.
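    For reference, this is roughly how we are submitting the job from the head node with the HPC Pack CLI. The executable path and share names are placeholders, not our real ones; `/numcores:min-max` is the option we used to pin both limits to 24:

    ```shell
    REM Submit the miner with min and max cores both set to 24
    REM (paths below are illustrative placeholders)
    job submit /numcores:24-24 \\headnode\apps\DataMiner.exe
    ```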

    I would like to know whether there is any way to run the application through the HPC cluster so that it uses all the cores, whether that requires further configuration of the job/task or a change to the application itself so that it can see all the cores when executed through the cluster. The C# application uses ThreadPool.
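    To illustrate, our worker loop looks roughly like the minimal sketch below. The file names and `ProcessFile` body are placeholders for the real data-mining work; the point is that one process queues each file to `ThreadPool` and waits for all items to finish, so its threads can only ever run on the cores of the single machine hosting that process:

    ```csharp
    using System;
    using System.Threading;

    class Miner
    {
        static int processed = 0;

        // Placeholder for the real per-file data-mining work.
        static void ProcessFile(object path)
        {
            Interlocked.Increment(ref processed);
        }

        static void Main()
        {
            // Hypothetical stand-in for the thousands of real input files.
            string[] files = { "a.dat", "b.dat", "c.dat", "d.dat" };

            // CountdownEvent starts at the file count; each work item
            // signals once, so Wait() returns only when all files are done.
            using (var done = new CountdownEvent(files.Length))
            {
                foreach (var file in files)
                {
                    ThreadPool.QueueUserWorkItem(
                        p => { ProcessFile(p); done.Signal(); }, file);
                }
                done.Wait();
            }

            Console.WriteLine("Processed {0} files", processed);
        }
    }
    ```

    Since all of this runs inside one process, the scheduler placing the task on one node matches what we observe.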

    Thursday, April 18, 2013 8:22 AM