Using native DLLs and Intel MKL in Azure

  • Question

  • Good day.

    How can I compile a C++ library to use it in Azure:

    1. Not depending on Intel MKL;

    2. Depending on Intel MKL;

    Unfortunately, there is not much information available on this.

    I am developing a BizSpark project that makes heavy use of vectorised machine-learning and other libraries.

    So I would like to understand how best to set up the roles.

    I need to use a native DLL from an MVC Core application, in a hybrid backend, and in a development scenario for doing some research.

    I have to port some working desktop code to Azure.

    Typically I use the C# task scheduler, so let's say I would be happy to compile and use a single-threaded native DLL. Intel MKL supports many build options and lets you link with OpenMP and its own threading switched off.
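For reference, one way to get a single-threaded MKL build is to link against MKL's sequential threading layer. A sketch of the usual link lines, assuming a standard `MKLROOT` install layout (file name `mymath.cpp` and output name are illustrative):

```shell
# Build a native shared library against MKL's sequential (non-threaded) layer.
# Library names are the standard MKL LP64 set; adjust paths for your install.
# On Windows, the corresponding import libraries are mkl_intel_lp64.lib,
# mkl_sequential.lib and mkl_core.lib.
g++ -shared -fPIC mymath.cpp -o libmymath.so \
    -I"${MKLROOT}/include" -L"${MKLROOT}/lib/intel64" \
    -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm

# Alternatively, link the single dynamic runtime (-lmkl_rt) and select the
# threading layer at run time:
#   export MKL_THREADING_LAYER=SEQUENTIAL
```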

    It would be great if it were possible to compile C++ in general, and Intel MKL with OpenMP settings in particular, and call its functions via P/Invoke just as on the desktop, but I am not sure whether that is really possible.

    Everybody speaks in general terms about virtualization, but there must be a boundary where real physical code execution begins. I am sure it cannot be that difficult; perhaps there are some limitations, such as excluding parallelisation and thread-synchronization calls.

    At the same time, we know that at least MongoDB works on Azure, though it seems to me the server side of Mongo is built without any .NET driver.

    So I would like to know how best to compile C++ code, and to get more information on the environment:

    1. Which instruction sets are supported; will vectorization directives work?

    2. Which C++ and C# namespaces can be used to query, at runtime, the cache size per CPU cache level, for tuning data layouts to fit the cache?

    Unfortunately, I have not found anything relevant in any Microsoft documentation.

    As for Amazon Web Services: they even have easy guides, with samples and pictures, for using C++ AMP and CUDA.

    Friday, November 4, 2016 1:34 PM


All replies