Friday, May 19, 2017

IBM PowerAI on OpenPOWER system eases and speeds access to AI, Deep and Machine Learning

By Rich Ptak

Image courtesy of IBM, Inc.
IBM launched the latest additions to its PowerAI Platform[1] for AI, Deep and Machine Learning with an analyst briefing session (which we attended), along with presentations at NVIDIA's GPU Technology Conference (GTC)[2] in Silicon Valley.

PowerAI is an integrated software distribution of open-source deep learning frameworks (TensorFlow, Caffe, Torch, etc. – see graphic at right) running on an IBM® Power System™ server. It targets data scientists, both experienced and those just getting started, who face serious entry roadblocks due to the volume and variety of raw data they work with and existing modeling processes. IBM's new release addresses these roadblocks. Here's what we learned, along with our opinions about the product.
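To give a sense of the starting point, here is a minimal sketch (not specific to PowerAI, and assuming a GPU-enabled TensorFlow install such as the one the distribution packages) of the first check a data scientist might run: confirming that the framework can see the server's GPUs.

    # Minimal sketch, not PowerAI-specific: after installing a GPU-enabled
    # TensorFlow build, list the compute devices the framework can see to
    # confirm the GPUs are available for training.
    from tensorflow.python.client import device_lib

    devices = device_lib.list_local_devices()   # all local CPUs and GPUs
    gpus = [d.name for d in devices if d.device_type == "GPU"]

    print("Visible GPUs:", gpus if gpus else "none found")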

The difficulty to be overcome

Three significant tasks have frustrated data scientists working in Deep and Machine Learning. The first was the effort required for data extraction, transformation and loading (ETL): massive time and effort spent transforming and loading diverse data types into data lakes and data stores that could feed existing analytic and modeling tools. The second was the time- and effort-intensive manual process of refining, training and tuning models for optimal performance. The third was the manual work of training, revising and optimizing standardized, industry-focused models to fit the specific operational model the data scientist was building.

PowerAI was initially announced in November 2016. IBM set out to make PowerAI the leading open-source, end-to-end cognitive application platform for data scientists and developers. The three goals set at PowerAI's inception were that it should:
  1. Be fast and easy to install and deploy (in hours) using common frameworks;
  2. Deliver optimal performance, with frameworks and libraries designed for tuning to achieve peak throughput;
  3. Be tuned for unique performance and scaling by exploiting the latest and emerging hardware architectures and technologies, e.g. GPU, CPU, interconnect, etc.

Thus, simplification, ease of use and adaptability were key design goals. The plan is to use automation and integration to deliver the set of tools data scientists and experts need for basic tasks such as data transformation, model building and optimization. Becoming a leader in deep learning required more than focused product activity by one vendor. IBM recognized the need for an outstanding system platform, an integrated open-source stack and a broad open-source ecosystem of partners, suppliers and innovators. IBM also participated in efforts to pioneer and support many of the best practices used in deep learning today.

The hardware underpinning the software stack is the IBM Power Systems S822LC for High Performance Computing (HPC), the OpenPOWER system announced last fall at the first OpenPOWER European Summit[3] in Barcelona, Spain. Paired with the NVIDIA-pioneered CPU-to-GPU NVLink interconnect, the server is currently a best-in-breed system for deep learning projects.

The OpenPOWER platform has proven popular with developers and users around the world. We describe the system and chip specifics in earlier articles: see our discussion of the system in "The newest IBM Power Systems – more of everything for the hottest environments![4]" and of the chip at the heart of it all in "Acceleration, Collaboration, Innovation - IBM's roadmap for its POWER architecture[5]". Also key to building the success of PowerAI are existing partnerships, with more to come, with industry leaders such as Julia, DL4J (DeepLearning4J), Apache Spark, Anaconda, OpenBLAS, ATLAS, NumPy, Docker, etc.

Additions to PowerAI

Figure 1: The PowerAI Software Stack (Image courtesy of IBM, Inc.)
Figure 1 illustrates the PowerAI software stack. The new additions include data preparation and transformation (ETL) tools that automate and speed deep learning; cluster orchestration, virtualization and distribution capabilities built on Apache Spark; and tools that speed and automate the development process for data scientists. Together, these add up to faster training times (for model building and validation) by distributing deep learning processing across a cluster.
The result is a multi-tenant, enterprise-ready Deep Learning platform that includes:
  1. AI Vision – custom application development tooling for use with computer vision workloads;
  2. Data ETL accomplished using Apache Spark;
  3. DL (Deep Learning) Insight (automated model tuning) – automatically tunes model hyper-parameters against input data using Spark-based distributed computing, with intuitive GUI-based developer tools that provide continuous feedback to speed the creation and optimization of deep learning models;
  4. Distributed Deep Learning – HPC-cluster-enabled distributed deep learning frameworks that automatically share processing tasks to speed results, accelerating model training by auto-distributing TensorFlow and Caffe workloads with Spark and HPC technology (a generic sketch of this distribution pattern follows this list).
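To make the tuning-and-distribution pattern above concrete, here is an illustrative sketch only. It is not IBM's DL Insight or Distributed Deep Learning API; it simply shows, in generic PySpark, the pattern of fanning hyper-parameter trials out across a Spark cluster and keeping the best result. The search space and the run_trial function are hypothetical placeholders.

    # Illustrative sketch only: NOT IBM's DL Insight / Distributed Deep Learning
    # API, just the generic pattern of running hyper-parameter trials in
    # parallel on an Apache Spark cluster and keeping the best configuration.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hyperparameter-sweep-sketch").getOrCreate()
    sc = spark.sparkContext

    # Hypothetical search space: learning rates and batch sizes to evaluate.
    trials = [(lr, bs) for lr in (0.1, 0.01, 0.001) for bs in (64, 128, 256)]

    def run_trial(params):
        """Train a small model with the given hyper-parameters and return its
        validation score; a real job would invoke TensorFlow or Caffe here."""
        lr, batch_size = params
        score = 1.0 / (1.0 + lr * batch_size)   # placeholder for a real metric
        return (score, params)

    # One Spark task per trial; the driver keeps the highest-scoring configuration.
    best_score, best_params = (sc.parallelize(trials, len(trials))
                                 .map(run_trial)
                                 .max(key=lambda t: t[0]))

    print("Best parameters:", best_params, "score:", best_score)
    spark.stop()

In PowerAI, the platform itself handles this kind of distribution and the per-trial feedback, so the data scientist does not hand-write the driver code shown here.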

In summary, PowerAI uses automation, integration and AI methodology to speed and simplify the whole process of model building and testing. It provides Apache Spark-based data extraction, transformation and preparation tools for data scientists with extensive experience in Deep Learning, and it provides automated, distributed model tuning and testing that speeds the overall process by eliminating tedious manual comparison and analysis. Experienced and entry-level data scientists alike will benefit significantly from these tools, which simplify data preparation, model building, testing and tuning.

IBM offers PowerAI on-premises today, and it will eventually be available as an IBM Cloud service. IBM expects that, for a variety of compliance, security and capacity reasons, most users will opt for the on-premises solution. Basic PowerAI capabilities are available for free; enterprise extensions are fee-based, with support/consulting services available.

The Final Word

With this release, IBM has convincingly sped up and simplified major tasks associated with data ETL and the manual processes of model training, tuning and optimization. The results benefit both experienced and entry-level data scientists working in Deep and Machine Learning.

From the beginning, IBM has worked to build and maintain ties with the larger open source community. They continue to expand the size of the community, cooperating with major players to integrate new technologies and capabilities.  

With the introduction of the PowerAI Platform and OpenPOWER server, IBM stands at the forefront in providing an integrated toolkit and platform as a comprehensive entryway to AI development for data scientists in mid-sized to full-enterprise organizations. At the heart of the needed ecosystem are the open-source deep learning community and associated open-source communities such as the one surrounding OpenPOWER. We encourage you to look more deeply into what PowerAI offers, both now and in the future.

