Microservices. Everyone’s talking about microservices these days, and with good reason: done right, they let you add new functionality in a short amount of time (again, in the right circumstances – they’re not the solution to every problem). At QCon New York we talked to developers about the kinds of cool things you can do with microservices, and about why the new Java EE 7 functionality in WAS Liberty gives you even more options for building services.

We hosted a Code Rally contest, where developers build AIs that race around virtual tracks. Code Rally uses WebSockets and other Java EE 7 technologies to connect clients (laptops running Eclipse) to the WAS Liberty server on IBM Cloud that simulates the races.
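If you’re wondering what the Java EE 7 plumbing for that looks like, here’s a minimal sketch of a WebSocket server endpoint of the kind Code Rally relies on. The endpoint path and the message handling below are illustrative assumptions, not the actual Code Rally source:

```java
// A minimal, hypothetical Java EE 7 WebSocket endpoint, the kind of
// thing Code Rally uses to connect clients to the race server. The
// "/race" path and the message handling here are illustrative only.
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/race")
public class RaceEndpoint {

    @OnOpen
    public void onOpen(Session session) {
        System.out.println("Client connected: " + session.getId());
    }

    @OnMessage
    public String onMessage(String command, Session session) {
        // A real race server would parse the command and update the
        // simulation; here we just acknowledge it back to the client.
        return "ack: " + command;
    }
}
```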

If we’re honest, we had to acknowledge that creating an intermediate AI using the Eclipse IDE is a little awkward. Then it struck us: why not create a microservice that monitors Twitter and lets people configure and race their AIs through Twitter DMs?

Less than 24 hours later we had the microservice built and working, with hundreds of virtual races being kicked off by attendees.

How did we build this new microservice using Twitter?

The new service was built with Node-RED: a Twitter node monitors direct messages to @CodeRally and translates each one into a REST request to a Java EE web application running on WAS Liberty. The web application interprets the messages to create and run AIs. Both components run on IBM Cloud and were really easy to set up and get running. By using existing APIs we were able to create the Java EE web application quite easily, and it runs as a separate web app from the rest of the Code Rally game service.
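To make that concrete, here’s a rough sketch of the kind of JAX-RS resource the Node-RED flow could POST to. The path, the payload fields, and the reply are assumptions for illustration, not the real application’s API:

```java
// A hypothetical JAX-RS resource that the Node-RED flow could POST to.
// The "/dm" path, the payload fields, and the reply are assumptions
// for illustration; they are not the actual Code Rally implementation.
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/dm")
public class DirectMessageResource {

    // Simple payload class for the JSON body the Node-RED flow sends;
    // the field names are an assumption about the flow's output.
    public static class DirectMessage {
        public String sender; // Twitter handle of the DM author
        public String text;   // the DM body, e.g. "race my-ai"
    }

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.TEXT_PLAIN)
    public Response handleDirectMessage(DirectMessage dm) {
        // A real implementation would parse dm.text into a command
        // (create an AI, set an option, kick off a race) and reply.
        return Response.ok("Received command from @" + dm.sender).build();
    }
}
```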

So, what does the setup look like?

The Node-RED component looks like this in the editor:


[Screenshot: the nodered_microservice flow in the Node-RED editor]

The Java EE component is built out of 10 classes and runs everything in memory on a Liberty buildpack instance on IBM Cloud. A future enhancement will be to store each person’s AI, and where they are in the setup process, in a database. That will give their AI selections some non-volatile state, but for now we’re purely in memory. The beauty of the setup is how small and simple it all is, and how quickly we can iterate on it and push updates out without impacting the main race server (which is a completely separate deployment and a bit of a monolith).
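As a sketch of what “purely in memory” can look like, an application-scoped CDI bean wrapping a ConcurrentHashMap is one straightforward option. The class and field names below are illustrative, not one of the actual ten classes:

```java
// A sketch of the in-memory approach: an application-scoped CDI bean
// holding a ConcurrentHashMap keyed by Twitter handle. The class and
// field names are illustrative; this state is lost on restart, which
// is exactly what the planned database enhancement would fix.
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import javax.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class AiRegistry {

    private final ConcurrentMap<String, UserAiState> states =
            new ConcurrentHashMap<>();

    // Fetch (or lazily create) the state for a given user.
    public UserAiState stateFor(String twitterHandle) {
        return states.computeIfAbsent(twitterHandle, h -> new UserAiState());
    }

    // Tracks each user's AI selection and setup progress.
    public static class UserAiState {
        public String aiName;
        public String setupStep = "start";
    }
}
```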

We’re currently waiting on Twitter to remove the DM length limit, which was announced in June, because the current version sends several messages at once to a user. Unfortunately, the order in which a user receives these messages is out of our control (beyond leaving a one-minute gap between the receipt of each message). Once the DM length limit is removed we can combine the several small messages into one large message and know that the content arrives in the correct order.
