|Image courtesy of IBM, Inc.|
The difficulty to be overcome
PowerAI was initially announced in November 2016. IBM set out to make PowerAI the leading open-source, end-to-end cognitive application platform for data scientists and developers. The three goals set at PowerAI’s inception were that it should:
- Be fast and easy to install and deploy (in hours) using common frameworks;
- Deliver optimal performance, shipping with frameworks and libraries tuned to achieve peak throughput;
- Scale on the latest and emerging hardware architectures and technologies, e.g. GPU, CPU, and interconnect.
Additions to PowerAI
|Figure 1: The PowerAI Software Stack. Image courtesy of IBM, Inc.|
- AI Vision – a custom application development tool for computer vision workloads;
- Data ETL (extract, transform, load) accomplished using Apache Spark;
- DL (Deep Learning) Insight (automated model tuning) – automatically tunes hyperparameters for models from input data using Spark-based distributed computing; intuitive GUI-based developer tools provide continuous feedback to speed the creation and optimization of deep learning models;
- Distributed Deep Learning – HPC-cluster-enabled distributed deep learning frameworks that automatically share processing tasks to speed results, accelerating model training through auto-distribution of TensorFlow and Caffe workloads using Spark and HPC technology.
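The hyperparameter tuning that DL Insight automates can be illustrated with a minimal sketch. This is not DL Insight's actual API (which is not shown in the source); it is a plain-Python grid search over two hypothetical hyperparameters, with a toy stand-in for the train-and-validate step:

```python
from itertools import product

def validation_loss(learning_rate, batch_size):
    """Toy stand-in for training a model and measuring validation loss.
    A real tuner would launch training runs; here the loss is a simple
    function with a known minimum at learning_rate=0.1, batch_size=64."""
    return (learning_rate - 0.1) ** 2 + (batch_size - 64) ** 2 / 4096

# Candidate hyperparameter values to sweep (illustrative only).
learning_rates = [0.01, 0.1, 0.5]
batch_sizes = [32, 64, 128]

# Exhaustively evaluate every combination and keep the best one.
best = min(product(learning_rates, batch_sizes),
           key=lambda combo: validation_loss(*combo))
print(best)  # (0.1, 64)
```

A system like DL Insight adds value over this naive loop by distributing the candidate evaluations across a Spark cluster and surfacing intermediate results in a GUI as the search progresses.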
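The auto-distribution described above rests on data parallelism: each worker computes gradients on its own shard of the data, and the gradients are averaged before a synchronized model update. A minimal single-process sketch of that pattern (illustrative only, not IBM's implementation) for a one-parameter least-squares model:

```python
# Data-parallel training sketch: fit y = w * x by averaging per-shard
# gradients -- the pattern distributed DL frameworks automate at scale.

def gradient(w, shard):
    """Mean gradient of squared error 0.5*(w*x - y)^2 over one data shard."""
    return sum((w * x - y) * x for x, y in shard) / len(shard)

# Toy dataset generated from y = 3x, split across two simulated workers.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]

w = 0.0
for step in range(200):
    # Each "worker" computes a gradient on its own shard; the results
    # are then averaged (an all-reduce in real distributed systems).
    grads = [gradient(w, shard) for shard in shards]
    avg_grad = sum(grads) / len(grads)
    w -= 0.05 * avg_grad  # synchronized update applied everywhere

print(round(w, 3))  # converges toward 3.0
```

Because every worker applies the same averaged gradient, all replicas stay in sync while each touches only a fraction of the data per step, which is what lets frameworks like TensorFlow and Caffe scale training across an HPC cluster.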