Recently, I developed an anomaly detection system for real-time IoT sensor data using an LSTM-based autoencoder. It is further evidence that deep learning algorithms can outperform state-of-the-art methods for structured (time-series) modelling as well. The system is getting a lot of traction now, as it is used by a Fortune 500 company in Switzerland. You can read more about this anomaly detection system and how to use different deep learning frameworks like DeepLearning4J, Apache SystemML, Keras, and TensorFlow to create it in my series, “Developing cognitive IoT solutions for anomaly detection by using deep learning.”
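If you are curious what the core of such a model looks like, here is a minimal sketch of an LSTM autoencoder in Keras; the layer sizes, window length, and number of sensor channels are placeholder assumptions, not the values from the tutorial:

# Minimal LSTM autoencoder sketch (illustrative; shapes are assumptions)
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

timesteps, n_features = 50, 3  # assumed window length and sensor channels

model = Sequential([
    LSTM(32, input_shape=(timesteps, n_features)),  # encoder: compress the window
    RepeatVector(timesteps),                        # unroll the latent vector over time
    LSTM(32, return_sequences=True),                # decoder: reconstruct the sequence
    TimeDistributed(Dense(n_features)),
])
model.compile(optimizer='adam', loss='mse')

# The model is trained to reproduce its input; windows with a high
# reconstruction error are flagged as anomalies:
# model.fit(X_train, X_train, epochs=20, batch_size=64)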

But as you can see from that tutorial, it takes quite a while to read through and understand, and quite some effort to get the code running on your system. Wouldn’t it be nice to just push a button and have the deep learning algorithm exposed as a Swagger-backed REST service? To put it straight: IBM is doing exactly that, in an open source, open-standards-based way, all for free. But one thing at a time…

Dr. Angel Diaz stated in a recent blog, “With the advent of open source deep learning engines like TensorFlow, PyTorch, Keras, and so on, there’s a rapidly growing need for skills and technologies that provide a consistent and standardized way to interact with these different machine learning engines. We need to drive standardized approaches in the industry (for example, ONNX) to ensure we are all marching towards a common goal with a common set of standards and technologies. In addition, we want these technologies to be democratized so that they’re easily accessible to and consumable by developers, both in open source and enterprises.”

ONNX, the Open Neural Network Exchange, is on the right track. Currently Caffe2, Chainer, Cognitive Toolkit, MXNet, PyTorch, and PaddlePaddle support this open source interchange format for neural network models, and through converters, Core ML and TensorFlow are supported as well. (You can read more about the IBM Open Tech AI strategy in my previous blog.)
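To give a feel for how the interchange works, here is a minimal sketch of exporting a PyTorch model to ONNX; the model, shapes, and file name are placeholder assumptions:

# Minimal ONNX export sketch from PyTorch (model and shapes are assumptions)
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 1))
dummy_input = torch.randn(1, 10)  # example input that fixes the graph's shapes

# Trace the model and write the portable ONNX graph to disk
torch.onnx.export(model, dummy_input, "model.onnx")

Any other ONNX-capable engine can then load model.onnx, without retraining or manual porting.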

While it is very nice to see an open standard emerge, IBM is again leading the open-standards transformation with the IBM Code Model Asset Exchange (watch me introduce the IBM Code Model Asset Exchange at the OpenTech AI Summit). As Dr. Angel Diaz stated, the core idea behind this exchange goes beyond sharing deep learning models: it is not only about the model itself, but also about capturing all the metadata that led to it, including the training data and the machine learning pipeline that produced the model.

I’ve had a closer look at the models already available on the IBM Code Model Asset Exchange, and they look very promising. Models exist for computer vision, natural language processing, video processing, and even cryptography. Using Docker, I had my REST API up and running on my laptop in just a couple of minutes with three commands:

git clone …

docker build …

docker run …
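Once the container is running, the model sits behind its Swagger-documented REST endpoint. Here is a minimal sketch of calling it from Python; the port, route, and payload are assumptions that vary per model, so check the Swagger documentation of the model you deployed:

# Minimal sketch of querying the containerized model's REST API
# (port, route, and payload are assumptions; see the model's Swagger docs)
import requests

with open("sample.jpg", "rb") as f:  # many of the image models accept a file upload
    response = requests.post(
        "http://localhost:5000/model/predict",
        files={"image": f},
    )

print(response.json())  # predictions returned as JSON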

And using Watson Machine Learning, those models can be made available directly in the IBM Cloud in a scalable way, or, using Fabric for Deep Learning, in any other environment, completely without IBM vendor lock-in. Again, please feel free to contribute your models. I’ll do the same with the time-series anomaly detector mentioned above, I promise!
