Build reliable, scalable, and autonomous apps with event-driven architecture

Traditional monolithic applications process data through SOAP and REST API messages. Modern architectures like microservices use API interfaces to bind multiple services together based on context and on functional and business requirements.

Previously, services used a service-oriented architecture (SOA) based on SOAP and XML-RPC. The service and the client shared a common contract that defined the data structures and the service methods. This architecture followed the waterfall development model, where you first design and define your applications and then implement and test them. Because of the tight integration between server and client, it was hard on application developers whenever requirements changed. So, this architecture didn’t fit well in the new agile world.

Now, developers rely on REST over HTTP. This architecture changed the way a client and a service send and receive data: HTTP acts as both the transport and the messaging layer, because the payloads of requests and responses are the messages. REST allows client and server applications to be built on their own technology stacks, such as Java, Node.js, or ASP.NET, which keeps them loosely coupled. The REST architecture turned services into web services for client/server communication and microservices for server-to-server communication. It also led many companies to build API management platforms, such as IBM’s API Connect, that enable headless APIs to be integrated across enterprise business architectures.

But are RESTful APIs the only answer to modern agile development? Do they answer all the architectural problems in today’s fast-paced technology evolution? No, not really.

It is easy to build microservices across different platforms, but when you have many applications, each with its own set of microservices exposed as REST APIs, it becomes difficult to standardize and manage the workflows. Also, when you upgrade your platforms to newer, better technologies, it’s difficult to migrate all the applications to support them using REST APIs. There is a tremendous cost involved in re-architecting an application to move it completely to a new technology platform.

Some of the main challenges with RESTful APIs are standardizing the business workflow and moving platforms to newer technologies with agile development, without disrupting the existing workflow and with minimal changes. In this article, I explain how to address these issues and reduce the cost of supporting and maintaining legacy systems and of migrating to better architectures and systems.

Workflow standardization

Industry standards keep changing as global technology evolves, but enterprise solutions should not need to be rebuilt from the ground up every time. To keep pace with your competitors, you are likely integrating third-party technologies or open source software from cloud marketplaces into your solutions.

As you adopt the latest technologies into your current enterprise architecture, legacy applications get deprecated. These updates might involve a number of changes to integrated systems. Whether you can make these changes with minimal effort and without disrupting the current workflow depends on how well the whole system is architected. With microservices exposed through REST APIs, you end up with a lot of redundant work.

One architectural solution is to adopt an event-driven architecture. Events provide greater reliability and scalability, and they make components autonomous. Traditionally, events were mainly used for broadcasting notifications, but what if you used event streams instead of REST APIs in the workflow design?

Applications can subscribe to the topics and handle the events efficiently, as long as the workflow design includes the following (a minimal sketch of such an event model follows the list):

  • The right topics
  • A standardized event model
  • Standardized event handlers

New technologies can be adopted easily by subscribing to the topics of interest and by building prototypes or minimum viable products (MVPs). Then, applications can work on production data right away, without scheduled migration scripts or coordination across multiple teams.

The following example shows how we can standardize the workflow and experiment with new technologies. Figure 1 shows the RESTful API architecture, whereas Figure 2 shows the event-driven architecture.

Figure 1 shows three components, Component A, Component B, and Component C, with a workflow set up through RESTful APIs. All three components use APIs to share data among them. Now, we want to prototype a new technology and create a new component that replaces Component B, which we will call B1. How can we make minimal changes to the workflow to adopt Component B1?

Figure 1

RESTful API architecture

Adopting B1 requires understanding Component B’s APIs and its integration with Components A and C. This RESTful API architecture shows how tightly the components are coupled and how limited the flexibility is, since all three components must be updated to support the new Component B1.

Figure 2 shows this same example, but using an event-driven architecture. If all the components are subscribed to topics and send messages with a standard data model, then it is easier to adopt new technologies without affecting the workflow.

Figure 2

Event-driven architecture

The newly adopted component B1 doesn’t need to understand Component A and Component C, as long as it understands the topics in the event queue and the types of events that are being sent. This architecture makes all the components loosely coupled and allows new components to be adopted without affecting the workflow.
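To make this concrete, here is a minimal sketch of what Component B1 might look like as a consumer, using the Apache Kafka Java client (the same client works with IBM Event Streams, which is introduced later in this article). The broker address, the topic name workflow.events, and the group ID are placeholder assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ComponentB1 {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // your cluster address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "component-b1");         // B1's own consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // B1 only needs to know the topic and the standard event model,
            // not anything about Components A and C.
            consumer.subscribe(List.of("workflow.events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Deserialize the standardized event (JSON) and run B1's own logic.
                    System.out.printf("B1 handling %s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```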

Event streams help build flexible workflow architecture solutions

Usually, a workflow has a set of tasks that need to be executed in a sequence. Workflows can get complicated when we have asynchronous tasks. For simplicity, we will say that the tasks eventually converge on a single output that is delivered to rendering systems. So, we divide the workflow into synchronous and asynchronous pipelines of tasks. Most of the synchronous tasks are built using message queues, whereas the asynchronous tasks are performed using event streams.

After every task is completed, you can implement an automated or manual review that eventually provides the output to the next task and review. In an agile world, requirements keep changing. Each task needs to be an autonomous module and, if possible, an autonomous application that serves only the business requirement for that task. This workflow architecture makes the system much more flexible, letting you add, remove, or reorder tasks. This is, in effect, a microservices architecture with an enterprise service bus.
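A single task in such a pipeline often follows a consume-process-produce shape: read an event from the topic that feeds the task, apply the task’s business logic, and publish the result to the topic that feeds the next task or review. The sketch below assumes hypothetical topic names (task.submitted, task.reviewed), a placeholder broker address, and a trivial review step.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ReviewTask {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "review-task");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaConsumer<String, String> in = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> out = new KafkaProducer<>(producerProps)) {
            in.subscribe(List.of("task.submitted"));          // input topic for this task
            while (true) {
                for (ConsumerRecord<String, String> record : in.poll(Duration.ofSeconds(1))) {
                    String reviewed = review(record.value()); // this task's business logic
                    // Emit the result for whichever task or review step comes next.
                    out.send(new ProducerRecord<>("task.reviewed", record.key(), reviewed));
                }
            }
        }
    }

    static String review(String event) { return event; }      // placeholder review logic
}
```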

You can achieve this event-driven architecture by using a product like IBM Event Streams, which gives you autonomous applications that are independent and easy to integrate with each other. With IBM Event Streams, you define each application in terms of its data models and its subscriptions to the topics of interest. IBM Event Streams is an event-streaming platform, built on open source Apache Kafka, that helps you build smart applications that can react to events as they happen. It provides reliability and scalability, and it makes applications autonomous, as the example below shows.
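Because Event Streams is built on Kafka, connecting an application is mostly a matter of standard Kafka client properties. The sketch below is an assumption-laden outline: the bootstrap server, the SASL mechanism, and the username and password values shown here are placeholders, and the exact values come from your Event Streams service credentials.

```java
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;

public final class EventStreamsConfig {
    // Placeholder values: take the real ones from your Event Streams service credentials.
    public static Properties clientProperties() {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "broker-0.example.cloud:9093");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"token\" password=\"<API_KEY>\";");
        return props;
    }

    private EventStreamsConfig() {}
}
```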

Let’s consider a cloud content publishing platform example, where content is published through a simplified workflow. We have an onboarding platform that reviews and publishes the content, and we also have a globalization platform that reviews the translated content and publishes it in each country. See Figure 3 for this workflow.

Figure 3

Workflow for cloud content publishing platform

This workflow can be implemented with APIs or microservices. See Figure 4 for an architecture diagram that describes the interactions between the platforms through APIs.

Figure 4

Architecture that shows interactions between platforms through APIs

If the onboarding platform is a home-grown application that was developed with front-end frameworks and server-side platforms, and we want to migrate everything to CMS vendor tools, then the onboarding platform, globalization platform, and catalog might each need to provide a ton of information about their microservice APIs. All three platforms would require new APIs to replicate the existing workflow with the new CMS. This architecture would require multiple teams to coordinate with each other to understand the APIs that are involved in the workflow.

Now let’s consider the same workflow, but using an event-driven architecture and a product like IBM Event Streams.

Figure 5

Standardized workflow with topics and events

If we standardize the workflow with well-defined IBM Event Streams topics and events, as shown in Figure 5, then the workflow becomes standardized, and any new platform, like the new CMS, can simply subscribe to the event queues to start participating in the workflow without any disruptions. This architecture also requires minimal changes to the integrating components, which makes it much easier to migrate to and add new platforms.
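As a sketch of what that standardization might look like, the following uses the Kafka AdminClient to create one topic per workflow stage. The topic names (content.onboarded, content.reviewed, content.translated, content.published), partition counts, and replication factor are illustrative assumptions rather than values prescribed by the workflow above.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateWorkflowTopics {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One well-named topic per workflow stage; any platform (including a new CMS)
            // can join the workflow simply by subscribing to the stages it cares about.
            List<NewTopic> topics = List.of(
                    new NewTopic("content.onboarded", 3, (short) 3),
                    new NewTopic("content.reviewed", 3, (short) 3),
                    new NewTopic("content.translated", 3, (short) 3),
                    new NewTopic("content.published", 3, (short) 3));
            admin.createTopics(topics).all().get();
        }
    }
}
```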

Advantages of event-driven architecture over RESTful APIs and microservices

You can realize these advantages of implementing an event-driven architecture instead of RESTful APIs:

  • Reliability
  • Scalability
  • Autonomy

Reliability

Event streaming platforms like IBM Event Streams provide data-persistent queues for sending messages. This makes the workflow fault tolerant when components go down. IBM Event Streams retains event data for about seven days, unlike some other event streaming solutions.
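In Kafka terms, that persistence window corresponds to topic retention. The sketch below sets a hypothetical topic’s retention.ms to seven days (604,800,000 ms) at creation time; the retention you can actually configure depends on your cluster or service plan, and the broker address and topic name are placeholders.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateDurableTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Keep events for 7 days (604,800,000 ms) so a component that was down
            // can catch up from its last committed offset when it comes back.
            NewTopic topic = new NewTopic("workflow.events", 3, (short) 3)
                    .configs(Map.of("retention.ms", "604800000"));
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```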

Event-driven architectures are more reliable because you can broadcast messages to multiple components. Figure 6 shows an API architecture on the left compared to an event-driven architecture on the right:

  • In the case of APIs, if one component misses an event, then Component 1 must make multiple retries while avoiding sending the same event multiple times to all the other components. This makes the architecture tightly coupled.
  • Using an event streaming platform like IBM Event Streams, we can make sure that components receive the events with less overhead on Component 1. So, event streams are more reliable.

Figure 6

Reliability - API architecture on the left; Event-driven architecture on the right
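One way to get that behavior with a Kafka-based platform is to publish each event once and let the broker handle fan-out to subscribers and retries toward the brokers. The sketch below enables the idempotent producer (acks=all, enable.idempotence=true) so that client retries cannot write duplicate events; the broker address, topic name, key, and payload are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ReliablePublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Publish once to the topic; the broker fans the event out to every subscriber.
        props.put(ProducerConfig.ACKS_CONFIG, "all");              // wait for all in-sync replicas
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true); // retries cannot create duplicates

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("workflow.events", "order-42",
                    "{\"eventType\":\"order.created\"}"));
            producer.flush();
        }
    }
}
```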

Scalability

You can scale to any number of application instances that subscribe to a single topic. IBM Event Streams lets you group a set of instances so that each event is consumed once within the group, or create multiple groups so that multiple application components each consume the same event.
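The consumer group ID is what controls this behavior. In the sketch below (the broker address, topic, and group names are assumptions), running several instances with the same group ID spreads the events across them, while a second component started with a different group ID receives every event as well.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GroupedConsumer {
    // Start many instances with the SAME groupId to share the load (each event is handled
    // once per group); use a DIFFERENT groupId for another component that also needs every event.
    public static void run(String groupId) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("workflow.events"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                    System.out.printf("[%s] %s%n", groupId, record.value());
                }
            }
        }
    }

    public static void main(String[] args) {
        run(args.length > 0 ? args[0] : "catalog-service"); // e.g., "catalog-service" or "search-indexer"
    }
}
```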

Event-driven architectures are scalable in that you can implement new technology platforms without interrupting the workflow. Figure 7 shows an API architecture on the left compared to an event-driven architecture on the right:

  • In the case of APIs, you have a lot of dependencies between the components.
  • In the case of event queues, a product like IBM Event Streams provides scalability by letting the new technology platform register with the service and start consuming messages without disrupting the workflow.

Figure 7

Scalability - API architecture on the left; Event-driven architecture on the right

Autonomy

Event queues remove dependencies between components and significantly simplify the integration of decoupled applications.

Event-driven architectures make components autonomous, so you can deprecate interfaces that multiple components use without coordinating with each of them. Figure 8 shows an API architecture on the left compared to an event-driven architecture on the right:

  • In the case of APIs, it is difficult to deprecate an API because you need to identify all the components that use it and then notify them before deprecating it.
  • In the case of event queues, you don’t need to identify the components. Component C can simply stop sending messages and unsubscribe from the topic.

Figure 8

Autonomy - API architecture on the left; Event-driven architecture on the right
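As a minimal sketch of that independence, retiring a component comes down to closing its own clients. The helper below is hypothetical, and nothing else in the workflow needs to change or even be aware that the component has left.

```java
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;

public class ComponentShutdown {
    // Hypothetical helper: when Component C is retired, it simply stops producing and
    // consuming. No other component has to be identified, notified, or redeployed.
    public static void retire(KafkaProducer<String, String> producer,
                              KafkaConsumer<String, String> consumer) {
        producer.flush();        // make sure any in-flight events are delivered
        producer.close();        // stop publishing to the topic
        consumer.unsubscribe();  // leave the consumer group
        consumer.close();        // release client connections and resources
    }
}
```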

Conclusion

An event-driven architecture is a powerful and efficient architecture that helps standardize workflows. It offers greater flexibility and lets you easily adopt and prototype new technology platforms without any major changes to the overall architecture or to individual components of your applications. This flexibility, in turn, reduces the cost and time involved in migrating to new technology platforms.

What next? Learn more about the advantages of an event-driven architecture and the architectural considerations for event-driven microservices-based systems.