Archived | Apply cognitive services to mobile images on the go


Archived content

Archived date: 2019-05-21

This content is no longer being updated or maintained. The content is provided “as is.” Given the rapid evolution of technology, some content, steps, or illustrations may have changed.


BluePic is a photo- and image-sharing sample application that lets you take photos and share them with other BluePic users. It demonstrates how a mobile iOS application can leverage a Kitura-based server application written in Swift.


BluePic takes advantage of Swift in a typical iOS client setting, but also on the server side, using Kitura, the new Swift web framework and HTTP server. An interesting feature of BluePic is the way it handles photos on the server. When an image is posted, its metadata is recorded in Cloudant and the image binary is stored in Object Storage. From there, an IBM Cloud Functions sequence is invoked, which looks up weather data such as temperature and current condition (e.g., sunny, cloudy) for the location the image was uploaded from. The sequence also uses AlchemyAPI to analyze the image and extract text tags describing its content. Finally, a push notification is sent to the user, informing them that their image has been processed and now includes weather and tag data.
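The enrichment step described above can be modeled in plain Swift. This is a minimal sketch only: the struct fields and the `enrich` function are illustrative assumptions, not BluePic's actual Cloudant schema, which lives in the project's repository.

```swift
import Foundation

// Hypothetical, simplified model of an image document recorded in Cloudant.
// Field names are illustrative; the real schema is defined in the BluePic repo.
struct ImageRecord {
    let id: String
    let latitude: Double
    let longitude: Double
    var temperature: Int?   // filled in by the Cloud Functions sequence
    var condition: String?  // e.g., "sunny", "cloudy"
    var tags: [String]      // text tags extracted from the image content
}

// Conceptual sketch of what the Cloud Functions sequence does: merge the
// weather lookup and the image-analysis tags into the stored record.
func enrich(_ record: ImageRecord,
            weather: (temperature: Int, condition: String),
            tags: [String]) -> ImageRecord {
    var updated = record  // value type, so this is a copy
    updated.temperature = weather.temperature
    updated.condition = weather.condition
    updated.tags = tags
    return updated
}

// A freshly posted image has location data but no weather or tags yet.
let posted = ImageRecord(id: "img-001", latitude: 30.27, longitude: -97.74,
                         temperature: nil, condition: nil, tags: [])
let processed = enrich(posted,
                       weather: (temperature: 28, condition: "sunny"),
                       tags: ["outdoors", "sky"])
print(processed.tags.joined(separator: ","))  // outdoors,sky
```

Keeping the record a value type means the sequence can build the updated document without mutating the original, which mirrors Cloudant's revision-based update model.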



  1. A client from an iOS device or web browser connects to the Kitura Mobile Backend.
  2. Clients can optionally authenticate. The iOS client leverages the AppID service in IBM Cloud.
  3. A client can then take a picture (on the iOS client) and upload the image (on both clients).
  4. The Kitura Mobile Backend will first create an entry in a Cloudant NoSQL DB.
  5. The Kitura Mobile Backend will then store the file in the IBM Cloud Object Storage.
  6. After writing to the Cloudant DB and Object Storage, Kitura triggers IBM Cloud Functions actions. These actions include invoking the Watson Visual Recognition service to analyze the image.
  7. In parallel, IBM Cloud Functions retrieves weather data for the location where the image was taken from the IBM Cloud Weather Company Data service.
  8. Cloud Functions takes the data returned by the Watson Visual Recognition and Weather Company Data services and updates the Cloudant NoSQL DB entry.
  9. Finally, Cloud Functions will trigger an IBM Cloud Push Notification event for the iOS client.
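The server-side portion of the steps above (4 through 8) can be sketched as a pipeline of small functions. This is a conceptual outline with every external service call stubbed out; the function names, URL, and dictionary keys are illustrative assumptions, not BluePic's actual API.

```swift
import Foundation

// Step 4: create the initial entry in the Cloudant NoSQL DB (stubbed).
func createCloudantEntry(imageId: String) -> [String: Any] {
    return ["_id": imageId, "type": "image"]
}

// Step 5: store the image binary in Object Storage, returning its URL (stubbed;
// the host name is a placeholder).
func storeInObjectStorage(imageId: String, bytes: [UInt8]) -> String {
    return "https://objectstorage.example.com/bluepic/\(imageId)"
}

// Step 6: invoke visual recognition to extract tags (stubbed).
func analyzeImage(url: String) -> [String] {
    return ["outdoors", "sky"]
}

// Step 7: look up weather data for the upload location (stubbed).
func fetchWeather(latitude: Double, longitude: Double) -> [String: Any] {
    return ["temperature": 28, "condition": "sunny"]
}

// Step 8: merge the service results back into the Cloudant entry.
func updateEntry(_ entry: [String: Any],
                 tags: [String],
                 weather: [String: Any]) -> [String: Any] {
    var updated = entry
    updated["tags"] = tags
    updated["weather"] = weather
    return updated
}

// Drive the flow for one upload.
var entry = createCloudantEntry(imageId: "img-001")
let url = storeInObjectStorage(imageId: "img-001", bytes: [0xFF, 0xD8])
entry = updateEntry(entry,
                    tags: analyzeImage(url: url),
                    weather: fetchWeather(latitude: 30.27, longitude: -97.74))
print((entry["tags"] as? [String])?.count ?? 0)  // 2
```

In the real application, steps 6 through 8 run inside an IBM Cloud Functions sequence rather than in the Kitura process itself, which keeps the long-running image analysis off the request path.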


Ready to put this code pattern to use? Complete details on how to get started running and using this application are in the README.