I joined the world of enterprise technology at an interesting crossroads – one where many companies that have been developing in SOA for years are now beginning their microservices journey.
The first questions that crossed my mind when introduced to this concept were: What does a microservices architecture look like? What implications does it have? Why is it beneficial in comparison to SOA? To answer these questions, our team decided to build one ourselves. We asked how others would adapt to this new architecture, and put ourselves in their shoes. Our team was made up of 5 people: a Solution Architect, a Front-End Developer, an API & Node.js Developer, a DevOps Engineer and a Business Logic Developer.
We thought of a use case – let’s act as if we’re an insurance company. What does a typical day look like? What are the insurance company pain points? We decided to create an application that would allow an end user to generate an auto insurance quote.
We focused on three points when creating a quote from the insurance company’s perspective:
- Mutability – can I change the business rules I have in place today?
- Reduce time to production – how quickly can I go from changing these rules to having them live in production?
- Consumption of cloud services – how can I leverage public weather data to see if a severe weather event is going to impact my insurance quote?
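To make the third point concrete, here is a minimal sketch of how a severe-weather rule might fold public weather data into a quote. The function name, report shape and flat surcharge amount are all illustrative assumptions, not the project's actual rules:

```javascript
// Hypothetical severe-weather rule (illustrative names and amounts):
// if the applicant's zip code has an active severe alert, add a flat
// surcharge to the base premium.
function applyWeatherRule(baseQuote, weatherReport) {
  const severe = (weatherReport.alerts || []).some(
    (alert) => alert.severity === 'severe'
  );
  return {
    ...baseQuote,
    premium: severe ? baseQuote.premium + 25 : baseQuote.premium,
    weatherAdjusted: severe,
  };
}

// In a real service this report would come from a public weather API,
// keyed by the applicant's zip code; here it is stubbed inline.
const report = { zip: '10001', alerts: [{ severity: 'severe', type: 'hail' }] };
const adjusted = applyWeatherRule({ premium: 100 }, report);
// adjusted.premium === 125, adjusted.weatherAdjusted === true
```

Because the rule is a pure function over the quote and the weather report, it can be changed and redeployed without touching the services that gather the data – which is exactly the mutability and time-to-production story above.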
The flow of our application looked like this: the end user would enter their account, vehicle and coverage information; receive an approval or denial from our rules engine, which analyzed driver history and zip code weather data; and then generate a quote. Our back-end account, vehicle and coverage services were built as microservices, and our rules engine played the part of our “legacy monolithic Java application.”
The majority of large enterprises cannot wake up one morning and rebuild their entire systems in a microservices architecture. So we wanted to highlight how SOA and microservices architectures can co-exist as every company moves along its journey to cloud.
During this project, I developed back-end services written in Node.js. Originally I wrote these services in a monolithic application style, where all services shared one data source. But after many conference calls, debates and arguments, we decided to migrate to a microservices architecture. This meant refactoring my monolithic application into 3 separate Node.js applications linked to 3 separate Cloudant NoSQL databases, in order to isolate the business functionality. For the isolated services to talk to each other, I exposed rules for interaction via APIs. So was the headache worth the benefits? Absolutely.
I began my journey by writing on-prem back-end services and, soon after, migrated them to IBM Cloud. This meant creating environment variables and no longer hardcoding port numbers, URLs and authorization credentials into my applications. It allowed all of the development teams on the project to access the services, and let me provision new releases faster thanks to IBM Cloud's built-in DevOps lifecycle. That, in turn, helped the other developers on the team consume the latest services to progress their own offerings.
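The environment-variable change amounts to a small config module like the following. The variable names and local defaults are illustrative assumptions, not the exact ones from the project:

```javascript
// Configuration pulled from the environment instead of hardcoded into
// the application, so the same code runs on-prem and in IBM Cloud.
// Variable names here are illustrative.
const config = {
  port: process.env.PORT || 3000,
  // URL of the Cloudant database bound to this service; the fallback
  // points at a hypothetical local instance for development.
  dbUrl: process.env.CLOUDANT_URL || 'http://localhost:5984',
  // Credentials are injected by the platform, never committed to code.
  dbUser: process.env.CLOUDANT_USER,
  dbPassword: process.env.CLOUDANT_PASSWORD,
};

module.exports = config;
```

With this in place, promoting a new release is just a redeploy with different environment bindings – no code edits, which is what made the DevOps lifecycle fast.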
Many ask what we saw as a result of making this decision and moving to our microservices architecture:
By utilizing the microservices architecture and creating loosely coupled services, I was able to edit the code of one service without impacting the availability of the entire application. This allowed me to deliver business results faster and saved me time! This increase in productivity was a perfect example of why we were confident in our architecture decisions. Although we were a smaller team, we imagined that each service would have a team dedicated to it. This would enable companies to reorganize their people around business functionality rather than technology.
Another benefit I realized was speed to innovation: if I was working on a service and wanted to introduce a new technology, the other services could remain up and running regardless. One example was Docker containers. An industry trend I learned about is that developers who traditionally wrote monolithic-style applications are moving towards designing their applications in a microservices-based architecture in order to take advantage of containers.
As I began to wrap my first application in a Docker container, my other 2 Node.js applications remained untouched and still up and running. I could use one service as my playground to develop, run and test various languages, runtimes and technologies. We could also scale the services independently based on need. In our application, users step through screen by screen to generate a quote, and it is likely that some would exit before finishing. This means that our first screen, backed by the account service, would receive more traffic than the last screen, backed by the coverage service. Using containers, I could scale these services separately to be more efficient. This flexibility gave developers the independence to make decisions without having to wait on others to catch up.
FYI – the microservices world wasn't all fun and games. We faced challenges such as data duplication and testing. However, we created stubs to make sufficient testing possible, and drew boundaries so that every piece of data had a single source of truth.
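Here is a minimal sketch of the stubbing idea, with illustrative function names: when testing quote generation, the call out to the legacy rules engine is replaced by a stub, so the test never depends on the monolith being reachable.

```javascript
// Quote generation takes its rules engine as a parameter, so tests can
// inject a stub in place of the real (legacy, monolithic) engine.
// All names here are illustrative, not the project's actual API.
function generateQuote(application, rulesEngine) {
  const decision = rulesEngine.evaluate(application);
  if (!decision.approved) {
    return { approved: false };
  }
  return { approved: true, premium: decision.basePremium };
}

// Stub standing in for the monolithic Java rules engine during tests.
const approvingEngine = {
  evaluate: () => ({ approved: true, basePremium: 120 }),
};

const testQuote = generateQuote(
  { zip: '10001', driverHistory: 'clean' },
  approvingEngine
);
// → { approved: true, premium: 120 }
```

The same injection point lets a denial path be tested with a stub that returns `{ approved: false }` – no legacy system required in either case.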
So what’s next on my journey to microservices? As we migrate our services to Docker containers we’re creating a fail-fast environment where we can test new technologies to bring rapid innovation with limited consequences. However, we still lack an orchestration layer, which has introduced the need for Kubernetes. And by exposing rules for the interaction of services via APIs, we’ve introduced the need for API management. So stay tuned to see what’s on the way!