Create Swift mobile apps with IBM Watson services
Lab 1: analyze sentiment
Lab 1 overview
Create a Watson sentiment analysis app with Swift
Create a typical application in Swift
Install Carthage and add the Watson SDK to your project
Create the Watson service and get its API key
Lab 1 summary
Lab 1 solution part 1
Lab 1 solution part 2
Lab 1 solution part 3
Lab 2: recognize images
Lab 3: convert text to speech
Learn how to write three iOS mobile apps in Swift that use the IBM® Watson™ Cloud Developer SDK to access Watson services.
You'll also learn how to write simple but cool cognitive applications that use the following Watson services:
- AlchemyLanguage (Watson Natural Language Understanding)
A collection of APIs that provide text analysis by processing natural language. For this course, you’ll use the Sentiment Analysis service to identify the sentiment in text.
- Visual Recognition
Analyzes images for scenes, objects, people, signs, and other content. With this service, you'll provide the URL of an image for your application to identify.
- Text to Speech
Synthesizes natural-sounding speech from input text in a variety of languages and voices that speak with appropriate cadence and intonation.
IBM Watson Cognitive services, which are hosted on IBM Bluemix (IBM Cloud), can be accessed through RESTful API calls. You’ll learn how to call these services from your Swift application.
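As a preview of what such a REST call looks like, here is a minimal sketch in Swift using `URLSession`. The endpoint URL, version date, and request body shown are illustrative assumptions for a Natural Language Understanding sentiment request, not the exact values for your service instance; you'll obtain your own credentials and URL when you create the service in the labs.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // needed for URLSession on Linux
#endif

// Placeholder credential and endpoint -- replace with the values from
// your own service instance. The version date is an example only.
let apiKey = "YOUR_API_KEY"
let endpoint = URL(string: "https://api.us-south.natural-language-understanding.watson.cloud.ibm.com/v1/analyze?version=2019-07-12")!

var request = URLRequest(url: endpoint)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

// Watson services accept HTTP basic auth with the user name "apikey".
let credentials = Data("apikey:\(apiKey)".utf8).base64EncodedString()
request.setValue("Basic \(credentials)", forHTTPHeaderField: "Authorization")

// Ask the service to analyze the sentiment of a short piece of text.
let body: [String: Any] = [
    "text": "I really enjoy building Watson apps in Swift!",
    "features": ["sentiment": [String: Any]()]
]
request.httpBody = try? JSONSerialization.data(withJSONObject: body)

// Send the request and print the raw JSON response.
let task = URLSession.shared.dataTask(with: request) { data, _, error in
    if let data = data, let json = String(data: data, encoding: .utf8) {
        print(json)
    } else if let error = error {
        print("Request failed: \(error)")
    }
}
task.resume()
```

In the labs you'll use the Watson Swift SDK instead of hand-building requests like this, but every SDK call ultimately resolves to a REST request of this shape.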