Mobile app designers have a lot of decisions to make when designing and building their apps. Adding cognitive capabilities is quickly becoming table stakes. Cognitive computing is changing the way that entire industries process, understand, and respond to information. It is transformative both in how people engage with computing systems and in how information can be analyzed to optimize processes and advance business goals and capabilities.

Introducing the new IBM Watson iOS SDK (beta)

Of course, it’s written entirely in Swift. The IBM Watson iOS SDK (beta) makes interacting with IBM Watson services quick and painless.

The IBM Watson iOS SDK gives developers a Swift application programming interface (API) to simplify integration with many of the Watson Developer Cloud services, including the Watson Dialog, Language Translation, Natural Language Classifier, Personality Insights, Speech to Text, Text to Speech, Alchemy Language, and Alchemy Vision services – all of which are available today and can now be integrated with just a few lines of code.

Getting Started

To get started, download the framework source code from github.com/watson-developer-cloud/ios-sdk. To build the SDK you'll need the Carthage dependency manager; then run the installation script. At that point you're ready to compile the IBM Watson iOS SDK and start taking advantage of the IBM Watson services inside your mobile apps. For additional detail, be sure to review the SDK documentation.
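Assuming you already have Carthage installed (for example, via Homebrew), the setup looks roughly like this. The exact steps and script names may change as the beta evolves, so treat this as a sketch and follow the repository's readme:

```shell
# clone the SDK repository
git clone https://github.com/watson-developer-cloud/ios-sdk.git
cd ios-sdk

# resolve and build the SDK's dependencies with Carthage
carthage update --platform iOS
```

Once Carthage finishes, you can open the project in Xcode and build the frameworks for use in your own app.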

Implementing a Service

To leverage the features offered by the IBM Watson iOS SDK, you must also set up the corresponding services on IBM Bluemix. For example, if you want to take advantage of Text to Speech capabilities, you first need to sign into Bluemix and set up an instance of the Text to Speech service. Once that service is set up, you can start using text-to-speech capabilities within your app via the IBM Watson iOS SDK.

//instantiate the TextToSpeech service
let service = TextToSpeech(username: "yourname", password: "yourpass")

//synthesize the audio
service.synthesize("Hello World", oncompletion: { data, error in
  if let data = data {
    //play the audio (AVAudioPlayer requires importing AVFoundation)
    do {
      let audioPlayer = try AVAudioPlayer(data: data)
      audioPlayer.play()
    } catch {
      print("Unable to play audio: \(error)")
    }
  }
})
Likewise, if you want to take advantage of the Language Translation services in Watson, you first have to set up an instance of the Watson Language Translation service. Just like the previous example, once the translation service is set up, you'll be able to add translation capabilities to your app with just a few lines of code.

//instantiate the LanguageTranslation service
let service = LanguageTranslation(username: "yourname", password: "yourpass")

//invoke translation methods
service.translate(["Hello", "Welcome"], source: "en", target: "es", callback: { (text: [String], error) in
  //do something with the translated text strings
  print(text)
})

Sample App

I’ve also put together a sample application to demonstrate the simplicity of integrating IBM Watson services into your mobile apps. The sample, which is available at github.com/triceam/Watson-iOS-SDK-Demo, demonstrates integrating the IBM Watson Language Translation service into an iOS app with just a few lines of code.

Be sure to check out the sample’s readme for additional detail and setup instructions. Make sure you’ve set up an instance of the Language Translation service on Bluemix and specified the appropriate login credentials, and you’ll be able to see it in action.


When using the IBM Watson iOS SDK, you’ll be using the Watson services running on Bluemix, and to leverage those services, you have to be authenticated. There are two methods you can use to authenticate. The first option is to embed your service username and password in your code, as shown above. This is the easiest way to get up and running. However, not everyone wants to embed credentials directly within an app, and understandably so. The alternative is to use authentication tokens: you create an authentication proxy in Bluemix that obtains and returns a token to your client application, which then uses the token to call the service directly. For more detail on authentication, see the Watson services documentation and the IBM Watson iOS SDK readme.
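As a rough illustration of the token approach, the client fetches a token from your proxy and then uses it to authenticate against the service. The proxy URL below is hypothetical, and the exact token-based API depends on the SDK version, so check the SDK readme for the real signatures:

import Foundation

//fetch an authentication token from your own proxy endpoint (hypothetical URL)
let proxyURL = URL(string: "https://your-auth-proxy.mybluemix.net/token")!

let task = URLSession.shared.dataTask(with: proxyURL) { data, response, error in
  guard let data = data,
        let token = String(data: data, encoding: .utf8) else {
    print("Failed to obtain token: \(String(describing: error))")
    return
  }
  //use the token to authenticate against the Watson service,
  //rather than embedding a username and password in the app
  print("Received token: \(token)")
}
task.resume()

The key design point is that your Watson service credentials live only on the proxy, never in the shipped app binary.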

Open Source

The beta project is open source. You can access complete source code for the project online at github.com/watson-developer-cloud/ios-sdk.

It’s a beta, so expect updates as we refine it, and we encourage you to provide input along the way. If you find a bug, please submit an issue! We always appreciate a well-written and thorough bug report. Want to make a source code contribution? Please do! Learn more about how you can get involved in the contribution guide.

Cognitive Computing

If you haven’t seen it yet, I strongly encourage you to check out this post on the Future of Cognitive Computing, which has a great video introduction to the subject and summarizes the impact that cognitive computing is already having on entire industries. IBM Watson with cognitive computing is a powerful combination. Adding mobile to the mix makes it that much more powerful – you now have cognitive computing capabilities in devices that you carry in your pocket, everywhere you go. It doesn’t stop there – cognitive computing together with IoT devices will provide unprecedented insight into data and analytics. In fact, IBM just recently announced a new Watson IoT global headquarters designed to extend the power of cognitive computing to the billions of connected devices, sensors, and systems that comprise the IoT. The future is going to be awesome!

Andrew Trice (@andytrice), Developer Advocate, IBM Mobile
