How to handle a monolith / a set of stored procedures going forward

    Question

  • I have a strange situation. I have a few stored procedures that are connected to a scheduler (a Windows service or a Control-M agent). These stored procedures contain business logic, mainly update operations such as setting a flag in a table by joining many tables and applying various conditions / business rules. Most of the stored procedures were written about 10 years back and run to thousands of lines. The question is: if I want to move this monolith to newer technologies, what is the best way in the scenarios below?

    1) Currently everything is in SQL Server on-premises. I can analyse the stored procedures and modularize them into service-based APIs that do the same work.

    2) In a microservice / domain-driven design world, I can think of carving out bounded contexts, moving the logic into APIs with the use of LINQ, and rewriting the entire set of stored procedures, so it is all dismantled into smaller sets / services.

    3) In a cloud scenario, I can create WebJobs that act as the scheduler, or connect an Azure Function to a timer and do the update there (see the sketch after this list). This approach would use Azure SQL Database instead of the current on-premises SQL Server database (assuming full utilization of cloud resources with Azure SQL rather than on-premises).

    4) At another level, I can convert this logic to SQL CLR functions, but I have heard that SQL CLR is not supported in Azure SQL Database.
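
    To illustrate what I mean in option 3, here is a minimal sketch of a timer-triggered Azure Function taking over the Windows service / Control-M role. It assumes an in-process C# Functions project, a hypothetical connection-string setting named "SqlConnection", and a hypothetical stored procedure "dbo.usp_UpdateOrderFlags"; the existing procedure could be called unchanged at first and decomposed later.

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class NightlyFlagUpdate
    {
        // Runs every night at 01:00 (NCRONTAB schedule), replacing the external scheduler.
        [FunctionName("NightlyFlagUpdate")]
        public static async Task Run(
            [TimerTrigger("0 0 1 * * *")] TimerInfo timer,
            ILogger log)
        {
            var connectionString = Environment.GetEnvironmentVariable("SqlConnection");

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.usp_UpdateOrderFlags", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.CommandTimeout = 600; // the flag updates are long-running

                await connection.OpenAsync();
                var rows = await command.ExecuteNonQueryAsync();
                log.LogInformation("Flag update finished, {Rows} rows affected.", rows);
            }
        }
    }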

    Putting all these thoughts together, what is the best approach (from any angle: monolith vs. modernized vs. DDD vs. microservices vs. cloud), and what are the pros and cons of each? Any suggestions are welcome.

    Please note that I am not seeing a great business benefit, reusability, or scalability gain in doing this; it is technology advancement and nothing else, and if I gain any other benefit I will be happy. One point is that I should be able to achieve testability at the end (see the small sketch below), plus the use of more modern technology.
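
    On the testability point: once a rule is pulled out of T-SQL into plain C#, it can be unit tested without a database. This is only a rough sketch with a hypothetical rule and hypothetical class names (Order, OrderFlagRules), using xUnit.

    using System;
    using Xunit;

    public class Order
    {
        public DateTime ShippedOn { get; set; }
        public bool IsCancelled { get; set; }
        public decimal Total { get; set; }
    }

    public static class OrderFlagRules
    {
        // Hypothetical rule: flag an order for review when it is not cancelled,
        // is older than 30 days and exceeds a value threshold.
        public static bool NeedsReview(Order order, DateTime today)
        {
            return !order.IsCancelled
                && (today - order.ShippedOn).TotalDays > 30
                && order.Total > 1000m;
        }
    }

    public class OrderFlagRulesTests
    {
        [Fact]
        public void Old_high_value_order_is_flagged()
        {
            var order = new Order { ShippedOn = new DateTime(2018, 1, 1), IsCancelled = false, Total = 2500m };
            Assert.True(OrderFlagRules.NeedsReview(order, new DateTime(2018, 4, 22)));
        }

        [Fact]
        public void Cancelled_order_is_not_flagged()
        {
            var order = new Order { ShippedOn = new DateTime(2018, 1, 1), IsCancelled = true, Total = 2500m };
            Assert.False(OrderFlagRules.NeedsReview(order, new DateTime(2018, 4, 22)));
        }
    }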

    Sunday, April 22, 2018 12:39 AM