
Watson Machine Learning Community Edition

Get started faster with a software distribution for machine learning that runs on the enterprise platform for AI

Watson Machine Learning Community Edition (formerly PowerAI) Release 1.6.1
Release date: 06/14/2019

What’s new

Watson Machine Learning Community Edition (WML CE) builds upon the previous releases of PowerAI and includes the following updates and new features:

  • To better align with IBM’s integrated AI portfolio, PowerAI is now part of the Watson Machine Learning family and has been renamed Watson Machine Learning Community Edition (WML CE).
  • Support for both IBM Power and accelerated x86 architecture servers, allowing clients to select the platform that best meets their performance requirements.
  • Conda packages for Keras-team Keras. Keras-team Keras is configured to run with the TensorFlow backend and is also configured to work with TensorFlow Large Model Support (TFLMS).
  • TensorFlow Serving and TensorFlow Serving API 1.14 are now included. TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments; a client sketch follows this list. Learn more: https://www.tensorflow.org/tfx/serving/architecture
  • NVIDIA Data Loading Library (DALI) 0.9.0 is now included. NVIDIA DALI is a collection of highly optimized building blocks and an execution engine to accelerate input data preprocessing for deep learning applications.
  • Framework version updates include: TensorFlow to 1.14 and 2.0 (tech preview), PyTorch to 1.0.1, RAPIDS cuML and cuDF to 0.7.2.
  • NOTE: TensorFlow changed the environment variable name TF_CUDA_HOST_MEM_LIMIT_IN_MB to TF_GPU_HOST_MEM_LIMIT_IN_MB. This may affect scripts written to use models with TensorFlow Large Model Support; a sketch of the new variable name follows this list. See Getting started with TensorFlow large model support (TFLMS) in the Knowledge Center for more information.
  • Technology previews include:
    • NVIDIA TensorRT
      TensorRT is a C++ library from NVIDIA that focuses on running pre-trained networks quickly and efficiently for inference. Full technical details on TensorRT can be found in the NVIDIA TensorRT Developer Guide.
    • torchtext and pytext natural language support
      Torchtext is a companion package to PyTorch consisting of data processing utilities and popular datasets for natural language. PyText is a deep-learning-based NLP modeling framework built on PyTorch and torchtext.
    • NVIDIA APEX Automatic Mixed Precision and optimizer support
      Apex is a PyTorch add-on package from NVIDIA with capabilities for automatic mixed precision (AMP) and distributed training; an AMP sketch follows this list. Note: Apex is currently only provided for Python version 3.6.
    • TensorFlow 2.0
      The focus of TensorFlow 2.0 is simplicity and ease of use. TensorFlow 2.0 leverages Keras as the high-level API for TensorFlow; a short tf.keras sketch follows this list.
    • snap-ml-spark library
      Snap ML is a library for training generalized linear models. It is being developed at IBM® with the vision of removing training time as a bottleneck for machine learning applications.
    • License information for the technology previews is found on the Technology Preview Code page in the IBM Knowledge Center.
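
The following is a minimal sketch of a client call against a TensorFlow Serving instance over its REST API. The host, the default REST port 8501, the model name my_model, and the input payload are illustrative assumptions, not values shipped with WML CE.

```python
# Hypothetical request to a TensorFlow Serving REST endpoint.
# "my_model" and the input values are placeholders; the payload shape must
# match the signature of the model actually being served.
import json
import requests

payload = {"instances": [[1.0, 2.0, 5.0]]}
response = requests.post(
    "http://localhost:8501/v1/models/my_model:predict",
    data=json.dumps(payload),
)
print(response.json())  # e.g. {"predictions": [...]}
```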
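
As a rough illustration of the environment variable rename noted above, a TFLMS script can set the new name before TensorFlow initializes; the 16384 MB limit below is only a placeholder value.

```python
# Set the renamed host-memory limit (formerly TF_CUDA_HOST_MEM_LIMIT_IN_MB)
# before TensorFlow initializes its devices. 16384 MB is an arbitrary example;
# choose a limit appropriate for your system memory and model size.
import os

os.environ["TF_GPU_HOST_MEM_LIMIT_IN_MB"] = "16384"

import tensorflow as tf  # imported after the variable is set
```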
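
The sketch below shows the typical Apex AMP pattern with PyTorch, assuming a CUDA-capable GPU; the model, optimizer, and random data are stand-ins, and opt_level "O1" is one of the standard AMP optimization levels.

```python
# Minimal Apex automatic mixed precision (AMP) sketch with a placeholder model.
import torch
from apex import amp

model = torch.nn.Linear(128, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Patch the model and optimizer for mixed-precision execution.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

inputs = torch.randn(32, 128).cuda()
targets = torch.randint(0, 10, (32,)).cuda()

loss = torch.nn.functional.cross_entropy(model(inputs), targets)
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()  # gradients are scaled to avoid FP16 underflow
optimizer.step()
```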
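
And a small example of Keras as the high-level API in TensorFlow 2.0; the layer sizes and the random training data are arbitrary placeholders.

```python
# tf.keras model definition and training in TensorFlow 2.0 on synthetic data.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x = np.random.rand(256, 20).astype("float32")
y = np.random.randint(0, 10, size=(256,))
model.fit(x, y, epochs=2, batch_size=32)
```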

For more details, including updated and deprecated packages, see the What’s New topic in the IBM Knowledge Center.

Key features

  • WML CE is distributed as prebuilt containers or on demand through the Conda provisioning process.
    • All of the Conda packages are available in a Conda channel
    • There is no install package to download; instead, connect to the Conda channel and install your packages from there
    • Package dependencies are automatically resolved
    • Delivery of packages is open and continuous
  • You can now run more than one framework at the same time in the same environment. For example, you can run TensorFlow and PyTorch at the same time, as in the sketch after this list.
  • Supported hardware:
    • IBM Power System AC922 with NVIDIA Tesla V100 GPUs
    • IBM Power System S822LC with NVIDIA Tesla P100 GPUs
  • Supported operating systems:
    • Red Hat Enterprise Linux 7.6
    • Ubuntu 18.04
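
As a quick illustration of running multiple frameworks in one environment, the following assumes both TensorFlow and PyTorch have been installed from the WML CE Conda channel into the same Conda environment.

```python
# Import TensorFlow and PyTorch side by side and report their versions and
# GPU visibility from within a single Conda environment.
import tensorflow as tf
import torch

print("TensorFlow", tf.__version__)
print("PyTorch   ", torch.__version__)
print("TensorFlow sees a GPU:", tf.test.is_gpu_available())
print("PyTorch sees a GPU:   ", torch.cuda.is_available())
```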

For the full release notes and README, including software packages and prerequisites, start with the WML CE planning topic in the IBM Knowledge Center.

How to get WML CE 1.6.1

There are several ways for you to get WML CE 1.6.1.

Learn more

Previous PowerAI releases

We recommend that you install the most current release of WML CE. However, if you have an earlier version of PowerAI installed, you can find release information for it in the IBM Knowledge Center.

Requesting enhancements for WML CE

The IBM Request for Enhancement (RFE) tool is now available for you to submit formal enhancement requests to the WML CE development team. One of the benefits of using the RFE tool is that other clients can vote on submitted requirements, which helps IBM to prioritize requests.

Get started

To get started, go to ibm.biz/powerai-rfe.

The RFE pages for WML CE are part of IBM Developer and require that you sign in with an IBM ID to submit or vote on a request. Make sure that your IBM ID profile includes your current company and email address so that we can contact you if we have questions.

Search first

Once on the RFE page, click on the “Search” tab to view existing requests before you submit a new request. It is much more useful to vote for a previously submitted request than to submit a duplicate request.