Create an iOS app that uses built-in and custom classifiers

Get the code

Summary

In this developer code pattern, you use IBM Watson™ Visual Recognition to showcase built-in and custom classifiers on IBM Cloud through an iOS app written in Swift. A user opens the app on an iOS device and chooses which classifiers (faces, explicit, food, etc.) to apply, including custom classifiers. The Watson Visual Recognition service on IBM Cloud classifies the image and returns the classification results to the app.

Description

The app in this code pattern has support for the following features of Watson Visual Recognition:

  • General — Watson Visual Recognition’s default classifier, which returns confidence scores for an image across thousands of classes.
  • Food — A classifier intended for images of food items.
  • Explicit — Returns the confidence, as a percentage, that an image is inappropriate for general use.
  • Custom classifier(s) — Lets users create and use their own classifiers.
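As a rough sketch of how an app like this might call these classifiers with the Watson Swift SDK (the version string, API key, and image name below are placeholders, and the exact SDK API varies by release):

```swift
import UIKit
import VisualRecognition

// Hypothetical setup; substitute your own service credentials.
let visualRecognition = VisualRecognition(version: "2018-03-19", apiKey: "{your-api-key}")

// Classify a local image against the default (General) classifier.
// Passing "food" or "explicit" as a classifier ID, or the ID of a
// custom classifier, selects one of the other models instead.
if let image = UIImage(named: "lunch.jpg") {
    visualRecognition.classify(image: image) { response, error in
        guard let classifiers = response?.result?.images.first?.classifiers else {
            print(error?.localizedDescription ?? "No classification returned")
            return
        }
        for classifier in classifiers {
            for classResult in classifier.classes {
                // Each class comes back with a 0–1 confidence score.
                print("\(classResult.className): \(classResult.score ?? 0)")
            }
        }
    }
}
```

The same `classify` call serves all four features; only the classifier ID (and, for custom classifiers, the ID returned when you train one) changes.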

Flow


  1. Clone the repo.
  2. Install dependencies with Carthage.
  3. Set up Watson Visual Recognition credentials.
  4. Run the app with Xcode.
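The steps above can be sketched from the command line roughly as follows (the repository URL and project name are placeholders; use the actual ones from the GitHub page):

```shell
# 1. Clone the repo (placeholder URL).
git clone https://github.com/<org>/<visual-recognition-ios-repo>.git
cd <visual-recognition-ios-repo>

# 2. Install dependencies with Carthage (builds the Watson Swift SDK frameworks).
carthage update --platform iOS

# 3. Set up Watson Visual Recognition credentials, typically by adding
#    your service API key to the app's configuration before building.

# 4. Open the project in Xcode, then build and run on a simulator or device.
open <ProjectName>.xcodeproj
```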

Instructions

Ready to check it out? See all the detailed info on GitHub.