Artificial intelligence (AI) will be the most disruptive class of technologies over the next decade, fueled by near-endless amounts of data and unprecedented advances in deep learning. IBM recently announced Watson Studio, which expands on the Watson Data Science Experience. The offering is a SaaS solution delivered on the IBM Cloud, designed to make it easier to develop, train, and manage models and to deploy AI-powered applications.
The new IBM Watson Studio extends IBM's deep learning capabilities, including TensorFlow scoring that lets you access pre-trained models from Watson services such as Watson Visual Recognition. It also lets you bring in and analyze unstructured data, further automates and provides insight into model management, and continues to offer a choice of data science tools, including a strengthened drag-and-drop interface for building analytics models with SPSS Modeler.
For visual modelers, IBM provides two tools:
- SPSS Modeler: an intuitive interface that is easy for everyone to learn and use, from business users to data scientists. Uncover valuable insights quickly, for rapid time-to-value, using drag-and-drop capabilities. There are thousands of Modeler users out there, and the best news for them is that they can import their existing SPSS streams into Watson Studio, where they will work and render as expected!
- Neural Network Modeler: an intuitive drag-and-drop, no-code interface for designing neural network structures. It speeds up the design process by avoiding the need to write and debug code by hand. Neural networks can be exported as TensorFlow, Keras, PyTorch, or Caffe code, as well as in JSON format for sharing within blogs and code posted to GitHub.
Deep Learning as a Service
Training of deep neural networks, known as deep learning, is currently highly complex and computationally intensive. It requires a highly tuned system with the right combination of software, drivers, compute, memory, network, and storage resources. To realize the full potential of this rising trend, the technology must be made more accessible to developers and data scientists so they can focus on what they do best: preparing and refining data, training neural network models over large datasets with automation, and creating cutting-edge models.
With the launch of Deep Learning as a Service within Watson Studio, organizations can overcome the common barriers to deep learning deployment: skills, standardization, and complexity. It embraces a wide array of popular open source frameworks like TensorFlow, Caffe, PyTorch and others.
Deep Learning as a Service has unique features, such as Neural Network Modeler, to lower the barrier to entry for all users, not just a few experts. The enhancements live within Watson Studio, the cloud-native, end-to-end environment for data scientists, developers, business analysts and SMEs to build and train AI models that work with structured, semi-structured and unstructured data — while maintaining an organization’s existing policy/access rules around the data.
Deep Learning as a Service now includes a unique Neural Network Modeler: an intuitive drag-and-drop interface that lets a non-programmer speed up model building by visually selecting, configuring, and designing a neural network, with code auto-generated for the most popular deep learning frameworks.
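To make "designing a network structure and exporting it" concrete, here is a minimal sketch of the idea, not IBM's actual export schema: a network described as a list of layer specs (the kind of structure a visual tool could serialize to JSON) plus a helper that derives the trainable parameter count from that description. All field names and layer sizes are hypothetical.

```python
import json

# Hypothetical layer specs, as a visual modeler might serialize them to JSON.
# Field names and sizes are illustrative only, not the Neural Network
# Modeler's real export format.
layers = [
    {"type": "dense", "units": 128, "activation": "relu"},
    {"type": "dense", "units": 64, "activation": "relu"},
    {"type": "dense", "units": 10, "activation": "softmax"},
]

def param_count(input_dim, layers):
    """Trainable parameters (weights + biases) for a stack of dense layers."""
    total, prev = 0, input_dim
    for layer in layers:
        total += prev * layer["units"] + layer["units"]
        prev = layer["units"]
    return total

print(json.dumps(layers, indent=2))   # shareable JSON description
print(param_count(784, layers))       # → 109386 for a 784-input classifier
```

Because the description is plain data, it can be round-tripped through JSON for sharing and then translated into framework-specific code.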
Automating Processes to Reduce Complexity
We’ve also abstracted away the complex, time-intensive, and costly parameter optimization and training process. Deep Learning as a Service is an experiment-centric model training environment, meaning users don’t get bogged down planning and managing training runs themselves. Instead, the entire training life cycle is managed automatically, and the results can be viewed in real time and revisited later. Each training run is automatically started, monitored, and stopped upon completion, saving users time and money because they pay only for the resources they use.
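The start/monitor/stop life cycle described above can be sketched in a few lines of plain Python. This is an illustrative model of the concept, not the Watson Studio API: the service drives the run to completion and bills only for the work actually executed.

```python
# Illustrative sketch (not the Watson Studio API) of an experiment-centric
# service managing a training run's life cycle so the user does not have to.
class TrainingRun:
    def __init__(self, epochs):
        self.epochs = epochs
        self.completed = 0
        self.status = "pending"

    def start(self):
        self.status = "running"

    def step(self):
        # One unit of training work; a real service would poll cluster state.
        self.completed += 1
        if self.completed >= self.epochs:
            self.status = "completed"

def manage(run):
    """Start, monitor, and stop the run automatically; return billed units."""
    run.start()
    billed = 0
    while run.status == "running":
        run.step()
        billed += 1   # pay only for steps actually executed
    return billed

run = TrainingRun(epochs=5)
print(manage(run), run.status)  # → 5 completed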
The new feature also dramatically simplifies the often-arduous process of hyperparameter selection. Instead of selecting hyperparameters based on intuition, hyperparameter optimization provides an objective, automated method of exploring a complex problem space. This results in a higher likelihood of identifying a better model than most data scientists could find using traditional methods like grid search. Users therefore spend less time on experiments with little valuable output and more time developing even more sophisticated and powerful neural networks.
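The contrast between exhaustive grid search and an automated search strategy can be shown on a toy objective. This is a hedged sketch, not IBM's optimizer: the objective function below is a stand-in for the validation loss a real service would obtain by training a model, and random search stands in for more sophisticated methods.

```python
import itertools
import random

# Toy "validation loss": best near lr=0.01, batch=64. In practice this
# value would come from actually training and evaluating a model.
def objective(lr, batch):
    return (lr - 0.01) ** 2 + ((batch - 64) / 64) ** 2

def grid_search():
    # Exhaustively evaluates a small, hand-picked grid (9 combinations).
    lrs = [0.001, 0.01, 0.1]
    batches = [32, 64, 128]
    return min(itertools.product(lrs, batches), key=lambda p: objective(*p))

def random_search(trials=50, seed=0):
    # Samples the continuous space instead of a fixed grid; often finds
    # good regions a coarse grid misses, with a controllable budget.
    rng = random.Random(seed)
    candidates = [(10 ** rng.uniform(-4, -1), rng.choice(range(16, 257)))
                  for _ in range(trials)]
    return min(candidates, key=lambda p: objective(*p))

print(grid_search())  # → (0.01, 64)
print(random_search())
```

Grid search can only ever return a point on its grid, while a sampling strategy explores between grid points, which is one reason automated hyperparameter optimization tends to beat intuition-plus-grid workflows.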
In addition, to accelerate experimentation for large training jobs, distribution across multiple machines and GPUs is critical in handling the large amount of training data for complex neural networks. The Deep Learning capability in Watson Studio is built upon IBM’s distributed deep learning technology and the latest open source framework technologies, handling compute across many servers, each with multiple GPUs.
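The core arithmetic behind synchronous data parallelism, the common pattern for distributing training across GPUs and servers, is simple to sketch: each worker computes a gradient on its shard of the batch, the gradients are averaged (an all-reduce on a real cluster), and the shared weights are updated once. This pure-Python example uses a one-parameter linear model as a stand-in for a neural network; it is a minimal illustration, not IBM's distributed deep learning implementation.

```python
# Minimal sketch of synchronous data-parallel training on a 1-D linear
# model y = w * x. Each "worker" gets a shard of the batch.
def local_gradient(w, shard):
    # Gradient of mean squared error on this worker's shard.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_step(w, batch, workers=4, lr=0.01):
    shards = [batch[i::workers] for i in range(workers)]
    grads = [local_gradient(w, s) for s in shards if s]
    avg = sum(grads) / len(grads)   # the all-reduce step on a real cluster
    return w - lr * avg

# Fit y = 3x from a tiny synthetic batch.
batch = [(x, 3 * x) for x in range(1, 9)]
w = 0.0
for _ in range(50):
    w = data_parallel_step(w, batch)
print(round(w, 3))  # converges toward 3.0
```

Because every worker applies the same averaged gradient, the result matches single-machine training on the full batch while the expensive gradient computation is split across workers.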
Furthermore, to develop a vibrant community around this deep learning fabric, IBM is open sourcing Fabric for Deep Learning (FfDL, pronounced "fiddle"), the core of Deep Learning as a Service. Leveraging the power of Kubernetes, FfDL provides a scalable, resilient, and fault-tolerant deep learning framework. The platform uses a distribution and orchestration layer that facilitates learning from large amounts of data in a reasonable amount of time across compute nodes.
New Plans for Watson Studio
The new IBM Watson Studio offering is available on a middle-tier Standard Plan at $99/user/month, aimed at teams that need flexible compute and a range of data science and machine learning tools at an affordable price. Lite and Enterprise plans are also available. More information about the new IBM Watson Studio can be found in the Watson section of the IBM Cloud Catalog.