Application modernization is the practice of updating legacy software for modern computing approaches, including new languages, frameworks, and infrastructure platforms. Rather than retiring an existing system or replacing it completely, modernization extends the lifespan of an organization’s applications while also taking advantage of technical innovations. This can be achieved by utilizing cloud computing, containerization, microservices, orchestration, DevOps, and automation.
This article provides guidance on modernizing database stored procedures in hybrid computing environments. Modernizing database stored procedures is the process of transforming the large bodies of stored procedures that make up an essential part of an application (the business/data layer) so that they can meet today’s ever-changing business demands while leveraging the latest tools and technologies. Application modernization has been underway for years, but stored procedure modernization is rarely discussed as part of database modernization. Now, however, there is a need to modernize stored procedures to harness the benefits of containerization, DevOps, and cloud platforms. This article guides you to the most appropriate level of stored procedure modernization.
The challenge is a heavy dependency on database stored procedures that contain core business and data logic: written long ago, they are monolithic in nature and not flexible or maintainable enough to move into the containerized and DevOps world. The goals for modernization include quick development, integration, and deployment, as well as performance, scalability, availability, and maintainability.
- To modernize the stored procedures by porting them into a container platform (either in vanilla Kubernetes or Red Hat OpenShift) like you do with application modernization
- To ensure that business-defined non-functional requirements (e.g., throughput, capability, security) remain “as-is” or better after modernization.
- To ensure that business functionality is as consistent as before, and the solution should also be cost-effective, scalable, maintainable, and reliable.
Point of view
As part of application modernization, database modernization of stored procedures should also be explored — moving to cloud-based, open source databases with optimized performance at a better cost.
It may not be always possible to modernize the entire database landscape with stored procedures due to complexities like non-functional requirements, compliance, existing code, etc. In such cases, the possibility of migrating to a modern database that is more compatible with the existing database should be explored.
Another option is to migrate to a new database in a phased manner, starting with less-critical and smaller applications (a big-bang migration approach should be avoided). A phased migration that converts stored procedures to microservices yields benefits like a goal-oriented architecture, loose coupling, systems that are easier to build and maintain, flexibility and scalability, autonomy within cross-functional teams, and more.
We’ve assembled a sample business case to showcase as-is requirements and existing business pains, and to show how modernization helps the new solution resolve existing issues and makes the application future-ready. The sample business case below will help you identify and select candidate stored procedures to modernize. Stored procedures that have the issues summarized later, where bad implementations are described, could be the right candidates for modernization. Apart from this, stored procedures that are critical, complex, resource-intensive, or tightly coupled are also recommended for modernization. For example, in the sample business use case below, not all stored procedures can be modernized – just the transaction, payment, ticketing, and money transfer types.
NOTE: On the other hand, some stored procedures related to batch processing, report generation, backups, and scheduled jobs are recommended to be left as-is, or optimized for better processing and performance, but this is out of scope for this article.
Sample business case
Let’s consider a fictitious utility company called FictUtil Co. and its as-is Utility Service Application (USA), which consists of application layers, middle-layer services, and a database layer. Major parts of the business and data logic have been written in stored procedures. The following diagram depicts how stored procedures are invoked by the application to deliver data to a UI for business users to view in the browser.
The following diagram depicts the deployment perspective of the USA application. There are two servers: one web server for a single instance of the USA app and one database server for a single instance of the USA database. Inside the instance, there are two separate databases: a transaction database and an account database. All business and data logic stored procedures are written in the transaction database. The account database is just for storage of user account balance and ledger information.
FictUtil Co. wants to modernize this monolithic application landscape. Stored procedure modernization is the core piece of the transformation.
The USA app consists of multiple catalogs of utility bill services like electric bill payment, mobile payment, and credit card payment, booking services like travel tickets, and money transfer services. Customers can directly select one or more services and pay their monthly utility bills, book their travel, or transfer money from one bank account to another. Money is deducted from their wallet account.
FictUtil Co. has a set of stored procedures in SQL Server that is already optimized and stable in terms of satisfying the following needs of the USA business use case. Stored procedures with good implementations:
- Include stable and robust code
- Provide excellent performance
- Provide highly secure transactions/queries
- Provide low network traffic
- Contain access control
In terms of stored procedures with bad/ugly implementations, one complex stored procedure (SP_Perform_Utility_Transaction) is used below as an example to showcase the pain points that make it a candidate for modernization. However, there are many such complex stored procedures in the system.
The SP_Perform_Utility_Transaction stored procedure consists of different blocks of statements, described below, each with a relevant code snippet and its issues.
Too many parameters
There are more than 50 parameters of different types, with inputs and outputs, to perform one or more utility transactions. Many parameters get transaction information from the caller application, and many return relevant information to the caller application:
```sql
CREATE PROCEDURE [dbo].[sp_perform_utility_transaction] (
    @UserId int,
    @ServiceId int,
    @MerchantId int,
    @RetailerId int,
    @ServiceName VARCHAR,
    @Description VARCHAR,
    @Comment VARCHAR,
    @TxnAmount int,
    @SvcCharge int,
    @TaxAmount int,
    @ConvCharge int,
    @PGCharge int,
    @FromDate DATETIME,
    @ToDate DATETIME,
    @StatusId int,
    @BranchId int,
    @Count INT OUTPUT,
    ...
)
```
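In a modernized service, a parameter list like this is typically replaced by a typed request object. The following is a hypothetical Python sketch, not code from the actual USA system; the field names mirror a subset of the original parameters, and the remaining ones would be added the same way:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical request object replacing the 50+ positional parameters of
# sp_perform_utility_transaction. Only a subset of fields is shown.
@dataclass
class UtilityTransactionRequest:
    user_id: int
    service_id: int
    merchant_id: int
    service_name: str
    txn_amount: int
    svc_charge: int = 0
    tax_amount: int = 0
    from_date: Optional[datetime] = None
    to_date: Optional[datetime] = None

    def total(self) -> int:
        # Derived value the stored procedure previously computed inline.
        return self.txn_amount + self.svc_charge + self.tax_amount
```

Defaults and derived values live on the object itself, so callers pass only what they have, and new fields can be added without breaking every call site, unlike a positional parameter list.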
Local and global tables
Use of local and global temporary tables for saving some data to be shared across database sessions and for looping through the table to make decisions:
```sql
DECLARE @TempTable TABLE (
    ProjectId INT,
    Received DATETIME,
    Concluded DATETIME,
    Comment VARCHAR(8000)
)

CREATE TABLE ##TempCustomer (
    [CustomerID] nchar(5),
    [CompanyName] nvarchar(40)
)
INSERT INTO ##TempCustomer ([CustomerID], [CompanyName])
EXEC dbo.uspOrderDetailsByCustomerId 'ALFKI'

CREATE TABLE #TempCustomer (
    [CustomerID] nchar(5),
    [CompanyName] nvarchar(40)
)
INSERT INTO #TempCustomer ([CustomerID], [CompanyName])
EXEC dbo.uspOrderDetailsByCustomerId 'ALFKI'
```
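When this logic moves into a microservice, session-scoped temporary tables become plain in-memory collections owned by a single request, so no shared server state or cleanup is involved. A minimal sketch, in which a stub stands in for the dbo.uspOrderDetailsByCustomerId result set (an assumption, not the real procedure's output):

```python
def order_details_by_customer(customer_id):
    # Stub for the dbo.uspOrderDetailsByCustomerId result set (assumption).
    # In the modernized system this would be a call to another service.
    sample = {"ALFKI": [("ALFKI", "Alfreds Futterkiste")]}
    return sample.get(customer_id, [])

def load_temp_customers(customer_id):
    # The "temp table" is now just a list scoped to this call; it is
    # garbage-collected when the request finishes.
    rows = []
    rows.extend(order_details_by_customer(customer_id))
    return rows
```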
Cursors and loops
Some cursors fetch main-table data for decision-making and data processing. Repetitive loops and constructs iterate through the cursors, take actions, and then fetch data using subqueries and nested SQL statements:
```sql
DECLARE @DepartmentID int, @DepartmentName nvarchar(50),
        @PersonID int, @FirstName nvarchar(50), @LastName nvarchar(50)

DECLARE department_cursor CURSOR FOR
    SELECT DepartmentID, Name FROM HumanResources.Department
OPEN department_cursor;
FETCH NEXT FROM department_cursor INTO @DepartmentID, @DepartmentName;
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE employee_cursor CURSOR FOR
        SELECT E.BusinessEntityID, P.FirstName, P.LastName
        FROM HumanResources.EmployeeDepartmentHistory H
        INNER JOIN HumanResources.Employee E
            ON E.BusinessEntityID = H.BusinessEntityID
        INNER JOIN Person.Person P
            ON P.BusinessEntityID = E.BusinessEntityID
        WHERE H.EndDate IS NULL AND H.DepartmentID = @DepartmentID
    OPEN employee_cursor;
    FETCH NEXT FROM employee_cursor INTO @PersonID, @FirstName, @LastName
    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT CAST(@DepartmentID as varchar(10)) + ' ' + @DepartmentName
            + ' ' + CAST(@PersonID as varchar(10)) + ' ' + @FirstName
            + ' ' + @LastName
        FETCH NEXT FROM employee_cursor INTO @PersonID, @FirstName, @LastName
    END;
    CLOSE employee_cursor;
    DEALLOCATE employee_cursor;
    FETCH NEXT FROM department_cursor INTO @DepartmentID, @DepartmentName;
END
CLOSE department_cursor;
DEALLOCATE department_cursor;
```
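In application code, the nested-cursor pattern above collapses into a single pass that groups rows in memory. A hedged Python sketch; the row shapes are assumptions based only on the columns the cursors use:

```python
from collections import defaultdict

def employees_by_department(departments, history):
    # departments: list of (department_id, name) rows
    # history: list of (department_id, person_id, first, last, end_date) rows
    # Equivalent to the nested-cursor loop: one pass groups active
    # employees by department instead of opening a cursor per department.
    grouped = defaultdict(list)
    for dept_id, person_id, first, last, end_date in history:
        if end_date is None:  # mirrors H.EndDate IS NULL
            grouped[dept_id].append((person_id, first, last))
    return {name: grouped.get(dept_id, []) for dept_id, name in departments}
```

In practice, the same effect is often achieved by letting the database do one set-based join and iterating over the result once, rather than issuing a query per department.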
Multiple table joins
SQL statements with multiple table joins of different types, such as inner join, outer join, left join, and right join:
```sql
SELECT *
FROM transactions t
INNER JOIN users u ON t.user_id = u.user_id
LEFT JOIN retailers r ON r.retailer_id = u.user_id
INNER JOIN account a ON a.account_id = u.user_id
....
```
Case and if statements
Multiple columns of the SQL use CASE and IF statements with nested queries and multiple joins:
```sql
SELECT EmployeeName,
       CASE
           WHEN Salary >= 80000 AND Salary <= 100000 THEN 'Director'
           WHEN Salary >= 50000 AND Salary < 80000 THEN 'Senior Consultant'
           ELSE 'Consultant'
       END AS Designation
FROM Employee

SELECT EmployeeName, Gender, Salary
FROM Employee
ORDER BY CASE Gender WHEN 'F' THEN Salary END DESC,
         CASE WHEN Gender = 'M' THEN Salary END

SELECT CASE
           WHEN Salary >= 80000 AND Salary <= 100000 THEN 'Director'
           WHEN Salary >= 50000 AND Salary < 80000 THEN 'Senior Consultant'
           ELSE 'Consultant'
       END AS Designation,
       MIN(Salary) AS MinimumSalary,
       MAX(Salary) AS MaximumSalary
FROM Employee
GROUP BY CASE
             WHEN Salary >= 80000 AND Salary <= 100000 THEN 'Director'
             WHEN Salary >= 50000 AND Salary < 80000 THEN 'Senior Consultant'
             ELSE 'Consultant'
         END
```
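Once extracted into the service layer, a CASE expression like this becomes an ordinary function that can be unit tested in isolation. A sketch; note that the fall-through label is an assumption, since the original snippet's ELSE branch repeated 'Director':

```python
def designation(salary):
    # Mirrors the CASE expression from the stored procedure. The ELSE
    # label "Consultant" is an assumption; the original SQL returned
    # "Director" in both the first WHEN and the ELSE branch, which
    # looks like a copy/paste bug.
    if 80000 <= salary <= 100000:
        return "Director"
    if 50000 <= salary < 80000:
        return "Senior Consultant"
    return "Consultant"
```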
Wallet account details
Executing external stored procedures has fault-tolerance issues and may also make the server unresponsive. Here, external stored procedures are used to get wallet account details from a database on a separate server/instance and to update the user’s account balance based on the transaction amount:
```sql
EXECUTE [LinkedServer_Server2].[User_Balance_Accounts].[dbo].[Update Account]
```
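A modernized equivalent replaces the linked-server EXECUTE with a call to the account microservice, wrapped for fault tolerance. A minimal retry sketch, in which the callable is an assumption standing in for the real HTTP request; a production system would typically add timeouts, backoff, and a circuit breaker:

```python
import time

def with_retries(call, attempts=3, delay=0.0):
    # Minimal retry wrapper: illustrates how a cross-server stored
    # procedure call can become a fault-tolerant inter-service call.
    # `call` is any zero-argument callable (e.g., an HTTP request).
    last_error = None
    for _ in range(attempts):
        try:
            return call()
        except Exception as exc:
            last_error = exc
            time.sleep(delay)  # real code would use exponential backoff
    raise last_error
```

The key difference from the linked-server approach is that a failed remote call surfaces as an ordinary exception the caller can handle, rather than blocking or hanging the database server.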
- Transaction and payment account tables have multiple triggers for audit and activity logging, which creates performance issues: to update a single transaction table, five more tables need to be updated or inserted into for audit. These triggers can be difficult to locate unless you have proper documentation, because they are invisible to the client.
- Transaction start and commit blocks, used for data consistency and integrity, can create performance issues and deadlocks.
- Exception-handling blocks are limited in terms of programming structure.
These downsides of stored procedure implementations significantly impact business continuity in today’s competitive markets and can lead to the following problems.
Scalability at the stored procedure level: sp_perform_utility_transaction is the most commonly called stored procedure. In the event of high concurrency, the same stored procedure will be called thousands of times on a single database server, jamming the server, with a high likelihood of timeouts and failures. Scalability in stored procedures is very much needed to balance load. Since this and similar stored procedures live inside the database, you can’t scale at the stored procedure level; you scale at the database level — a costly solution because of increased hardware resources, processing, and networking. The following diagrams can help clarify.
Resource utilization issues: Due to a single centralized database server, non-scalable stored procedures, and the complex nature of stored procedures, cursors, temporary tables, and nested queries, the system consumes significant memory and processing capacity on the database server. The database server handles data storage as well as a major portion of processing, because the business/data layer sits inside the database. In the case of a deadlock, the server stops responding, making the application/system slow or unresponsive.
Change management, troubleshooting, and maintainability issues: sp_perform_utility_transaction and other similar stored procedures have thousands of lines of code written by multiple developers at different times, with minimal or no code documentation. Hence, it is difficult to troubleshoot issues or understand the business function for impact analysis of any modification.
Revision control: You can’t maintain version control on stored procedures inside the database, so stored procedures can be overwritten. It is also difficult to know who did what, because stored procedure history can’t be maintained.
Readability: Due to the thousands of lines of code in sp_perform_utility_transaction, the complexity of the code, and the lack of documentation, readability is a big problem for developers. Developers need more time and effort to understand the code or to find issues during troubleshooting.
Availability issues: Stored procedure availability depends entirely on database server availability; you can’t make specific business logic stored procedures available on demand. In this business use case, sp_validate_transaction is used heavily, but if the database were unavailable, this stored procedure couldn’t be used.
Development and deployment issues: In the USA app, stored procedures are generally created and changed in the dev/test environment and moved directly to the production environment, either manually or through a tool. Deployment is quick, but the impact is huge if it fails.
Stored procedures need modernization to overcome these issues. To resolve the problem, you either optimize them inside the database or bring all stored procedures outside the database and modernize them to gain the full benefits of today’s tools and technologies, such as cloud computing, containerization, microservices, orchestration, DevOps, and automation. Stored procedure optimization alone may not be an adequate solution compared to modernization; therefore, modernization options need to be explored.
Below are multiple application modernization options for stored procedure modernization:
- Rehost (lift and shift) — Can be done within the database only.
- Refactor — Minimally altering or optimizing will not solve the inherent problem, and stored procedures will still be inside the database.
- Rearchitect — You can transform stored procedures into microservices/APIs outside the database with a mixed technology stack.
- Rebuild — Write new cloud-native code for all stored procedure microservices/APIs.
It is clear that rearchitect and rebuild are the best options to modernize stored procedures into cloud-native microservices. With these options, you can take either a domain-driven design approach to microservices architecture or one based on business functionality.
Your implementation approach could consist of steps that transform stored procedures into APIs by breaking them down into microservices. At a high level, those steps include:
- Separating business and data logic (join, union, lookups, etc.)
- Building microservices using business logic in API layer
- Establishing each microservice with its own data store
- Linking microservices data in API layers using change data capture
- Implementing change data capture to replicate/sync data with the data stores of corresponding microservices (Debezium can be used)
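As a sketch of the end state, the business rules formerly inside a stored procedure block become a small service class that owns its own data store. The following is a hypothetical Python illustration using the wallet-payment rule from the sample business case; a dict stands in for the service's database, and the class and method names are assumptions:

```python
class WalletService:
    # Hypothetical microservice core extracted from the stored procedure's
    # wallet/payment block. Each microservice owns its own data store;
    # a dict stands in for that database here.
    def __init__(self):
        self.balances = {}

    def deposit(self, user_id, amount):
        self.balances[user_id] = self.balances.get(user_id, 0) + amount

    def pay_bill(self, user_id, amount):
        # Business rule formerly enforced inside the stored procedure:
        # a payment must not exceed the wallet balance.
        if self.balances.get(user_id, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[user_id] -= amount
        return self.balances[user_id]
```

Because the rule now lives in service code, it can be unit tested, versioned, deployed, and scaled independently of the database.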
We suggest the following steps to modernize stored procedures:
- One by one
- Introduce a separate database per dependent microservice
- Complete or full transformation
One by one: In this approach, you gradually convert one or more stored procedures into microservices. Identify those most critical and in need of modernization (like the 4 of 50 below), prioritize accordingly, convert them into microservices, and plug them into the application and the same database, alongside the remaining stored procedures still inside the database. Keep the existing stored procedures and create the microservices with an API gateway layer and an API manager, as depicted below.
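The one-by-one approach can be pictured as a gateway-level router that sends modernized operations to the new microservices and everything else to the legacy stored-procedure path. A hypothetical sketch; the operation names and handler signatures are assumptions, not part of the actual USA system:

```python
# Operations already converted to microservices in the sample use case.
MODERNIZED = {"transaction", "payment", "ticketing", "money_transfer"}

def route(operation, microservice_handler, legacy_handler):
    # Strangler-style dispatch: modernized operations go to the new
    # services; everything else still hits the legacy stored procedures.
    if operation in MODERNIZED:
        return microservice_handler(operation)
    return legacy_handler(operation)
```

As each stored procedure is converted, its operation name moves into the modernized set, so the legacy path shrinks without a big-bang cutover.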
Introduce a separate database per dependent microservice: Here, you start adding dependent database objects with respective microservices that have already been converted in the first step. You can have different lightweight databases for different microservices.
There may be a need to replicate data between the data stores of corresponding microservices. For example, payment data changes should be updated in the billing data so the utility billing microservice can generate real-time reports to share with clients automatically. This can be realized using change data capture (CDC), which provides real-time or near real-time, low-latency, reliable, and scalable data replication between the USA microservices as business needs indicate.
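A toy illustration of the CDC idea: change events from one service's store are applied to a replica owned by another service. The event shape below loosely follows Debezium's create/delete operation codes but is an assumption for illustration, not Debezium's actual payload format:

```python
def apply_change_events(events, replica):
    # Toy change-data-capture consumer. Each event carries a key, an
    # operation ("c" = create/update here, "d" = delete), and optionally
    # a row. Applying the events in order keeps the replica in sync.
    for event in events:
        key, op = event["key"], event["op"]
        if op == "d":
            replica.pop(key, None)
        else:
            replica[key] = event["row"]
    return replica
```

In a real deployment, a CDC tool such as Debezium reads the source database's log and publishes these events to a stream (e.g., Kafka), and each consuming microservice applies them to its own store.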
Complete or full transformation: In this step, you gradually convert all stored procedures into microservices, attach dependent lightweight databases to the respective microservices, and connect all databases with a common change data capture layer.
You get complete microservices benefits like a goal-oriented architecture, loose coupling, systems that are easier to build and maintain, flexibility and scalability – and, most importantly, autonomy within cross-functional teams.
| AS IS – Pain points | TO BE – Resolving pain points and making it future-ready |
| --- | --- |
| Scalability | When demand on the USA app increases, it’s easier to scale using microservices. When required, you can seamlessly add resources to the microservices that need them most, rather than scaling the entire USA app; scaling is faster and often more cost-efficient as well. For example, if the USA microservices are containerized and deployed on a market-leading orchestrated container platform (like OpenShift or Kubernetes), they can seamlessly auto-scale. Independently deployed microservice containers on an orchestrated platform can also automatically be highly available. |
| Fault isolation | You can reduce downtime through fault isolation. For example, if the ticket booking microservice fails, you can isolate it to prevent the failure from cascading to other services and crashing the USA app. That means this business-critical app can stay up and running even when one of its modules fails. |
| Maintainability | These microservices will typically have small codebases, making them easier to maintain and deploy. It’s also much easier to keep the code clean and for teams to be wholly responsible for specific services. |
| Change management | If any changes or new business requirements surface for specific utility-related business functionality, the development team that owns the corresponding microservice (e.g., the utility billing microservice) can work autonomously without worrying about what’s going on with the rest of the USA app. |
| Troubleshooting | Since the USA app is now broken into a set of microservices, it will be easy to troubleshoot business or system failures by implementing an observability solution (e.g., EFK, Grafana, Prometheus, Istio/Kiali, Jaeger, etc.). Observability is a critical component of cloud development platforms and includes logging, monitoring, and debugging, shortening troubleshooting time by letting you look into the logs and metrics of the system and its microservices. |
| Security | Once you have the microservices ready, you can make the USA app highly secure by implementing container and orchestration security, security by design, encryption, OAuth, DevSecOps, etc., and deploying to a secured environment. |
| Revision control | With a microservices architecture, you have the flexibility to use any version control system, like GitHub, Apache Subversion, or AWS CodeCommit, eliminating the revision-control issues that existed at the stored procedure level. |
| Readability | Popular languages, modern frameworks, architecture styles, design patterns, domain-driven design, test-driven development, etc. can be used to resolve the readability issues seen at the stored procedure level. |
| Development and deployment | Using microservices architecture, containers, orchestration, and cloud together with DevOps tools makes development and deployment easy, resolving the development and deployment issues at the stored procedure level. |
The USA app is now ready to be fully assimilated with the technology-neutral target architecture as depicted below, where stored procedures are modernized to microservices.
The solution components at a high level:
- DevOps — This comprises continuous integration and deployment practices, offering business agility. It also includes processes/procedures for SRE, which is like L1/L2 support in the ITIL framework. This will improve business operation turnaround time, as the long-running stored procedure logic is now optimized to run the business process in the target architecture.
- API gateway — This layer integrates legacy components, like the web UI/front-end application, with the modernized solution.
- Microservices — These are the chunks of business processes, formerly embedded in stored procedures, implemented in new architecture constructs. The functionality is exposed as APIs, which other applications can also consume, and it can be tested using third-party tools like Postman and Swagger. Quality and productivity also improve by integrating testing tools, unit test cases, and functional testing into the pipeline itself.
- Observability — This component is responsible for managing and monitoring the entire suite of microservices. It captures logs, metrics, and distributed traces of processes, which can be integrated and used for visualizing system performance and troubleshooting issues.
- Security — Authentication and authorization of services and users are controlled by this layer. It is also responsible for data security at rest as well as in motion.
- CDC — Change data capture replicates shared data across microservices.
NOTE: The personas remain the same, but they will require training to adapt to the new system. The web UI/front end remains the same, but it will require integration with the API gateway.
Summary and next steps
Stored procedure modernization is a much-needed activity in your modernization journey. This article has offered an overview of how you can modernize traditional monolithic stored procedures, which contain most of the business and data logic, into microservices, and leverage tools such as containers, orchestration, DevOps, and cloud to eliminate existing issues, make the application future-ready, and more.
There could be many other solutions, but we’ve offered the most appropriate one, to a level from which you can proceed with modern tools. We suggest you extract only the stored procedures with the issues listed above and leave the remaining ones as-is, or optimize them. This is not the only method for stored procedure modernization, but we focused on helping you understand one way to modernize stored procedures for a sample conventional monolithic application.
Modernizing stored procedures into microservices helps businesses quickly publish changes and resolve issues, providing the best customer experience, realizing the full business benefits, and competing more effectively in the marketplace.