AI/ML on Red Hat OpenShift 

Red Hat accelerates AI/ML workflows and the delivery of AI-powered intelligent applications with self-managed Red Hat OpenShift or our AI/ML cloud service

What is an ML lifecycle?

A multi-phase process that harnesses large volumes and varieties of data, abundant compute, and open source machine learning tools to build intelligent applications.

At a high level, there are four steps in the lifecycle (a brief code sketch follows the list):

  1. Gather and prepare data to make sure the input data is complete and of high quality
  2. Develop the model, including training, testing, and selecting the model with the highest prediction accuracy
  3. Integrate the model into the application development process, and serve it for inferencing
  4. Monitor and manage the model to measure business performance and address potential production data drift
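
To make these steps concrete, below is a minimal sketch of the lifecycle in Python with scikit-learn and pandas. The input file, features, and candidate models are illustrative assumptions, not part of any Red Hat product or reference workflow.

```python
# Minimal, illustrative walk through the four lifecycle steps.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# 1. Gather and prepare data: drop incomplete rows so the input is complete and clean.
df = pd.read_csv("transactions.csv").dropna()            # hypothetical input file
X, y = df.drop(columns=["label"]), df["label"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# 2. Develop the model: train several candidates and keep the most accurate one.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200),
}
scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
best = max(scores, key=scores.get)

# 3. Integrate the model: persist it so an application or inference service can load it.
joblib.dump(candidates[best], "model.joblib")

# 4. Monitor and manage: record a baseline so production accuracy can later be
#    compared against it to detect data drift and trigger retraining.
print(f"selected {best} with test accuracy {scores[best]:.3f}")
```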

Why deploy AI/ML (Artificial Intelligence & Machine Learning) workloads on Red Hat OpenShift?

Red Hat OpenShift Data Science

Red Hat® OpenShift® Data Science is a managed cloud service for data scientists and developers of intelligent applications. It provides a fully supported sandbox in which to rapidly develop, train, and test machine learning (ML) models in the public cloud before deploying in production.
 

Key challenges facing data scientists

Data scientists are primarily responsible for ML modeling, ensuring that the selected model continues to provide the highest prediction accuracy.

The key challenges data scientists face are:

  1. Selecting & deploying the right ML tools (ex. Apache Spark, Jupyter notebookTensorFlow, PyTorch, etc.)
  2. Complexities and time required to train, test, select, and retrain the ML model that provides the highest prediction accuracy
  3. Slow execution of modeling and inferencing tasks because of lack of hardware acceleration
  4. Repeated dependency on IT operations to provision and manage infrastructure
  5. Collaborating with data engineers and software developers to ensure input data hygiene, and successful ML model deployment in app dev processes

Why use containers and Kubernetes for your machine learning initiatives?

Containers and Kubernetes are key to accelerating the ML lifecycle, as these technologies give data scientists the much-needed agility, flexibility, portability, and scalability to train, test, and deploy ML models.

Red Hat® OpenShift® is the industry's leading container and Kubernetes hybrid cloud platform. It provides all of these benefits, and through integrated DevOps capabilities (e.g., OpenShift Pipelines, OpenShift GitOps, and Red Hat Quay) and integration with hardware accelerators, it enables better collaboration between data scientists and software developers and accelerates the rollout of intelligent applications across the hybrid cloud (data center, edge, and public clouds).

 


Benefits of Red Hat OpenShift for ML initiatives

Self-service, consistent cloud experience for data scientists across the hybrid cloud

  • Empower data scientists with the flexibility and portability to use the containerized ML tools of their choice to quickly build, scale, reproduce, and share ML models.
  • Use the most relevant ML tools via Red Hat certified Kubernetes Operators, with both the self-managed platform and our AI cloud service option.
  • Eliminate dependency on IT to provision infrastructure for iterative, compute-intensive ML modeling tasks (see the sketch after this list).
  • Eliminate “lock-in” concerns with any particular cloud provider and its menu of ML tools.
  • Tight integration with CI/CD tools allows ML models to be deployed quickly and iteratively, as needed.
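
To show what this self-service experience can look like in practice, here is a minimal sketch that submits a containerized training job directly to the cluster through the standard Kubernetes API (which OpenShift exposes) using the kubernetes Python client. The job name, image, namespace, and resource sizes are hypothetical placeholders rather than a prescribed Red Hat configuration.

```python
# Sketch: a data scientist launches a containerized training job without
# waiting on IT to provision infrastructure. Assumes you are already logged
# in to the cluster (e.g., via `oc login`) and the target namespace exists.
from kubernetes import client, config

config.load_kube_config()  # reuse the local kubeconfig written by `oc login`

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="train-fraud-model"),  # hypothetical name
    spec=client.V1JobSpec(
        backoff_limit=1,
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="train",
                        image="quay.io/example/fraud-train:latest",  # placeholder image
                        command=["python", "train.py"],
                        resources=client.V1ResourceRequirements(
                            requests={"cpu": "2", "memory": "4Gi"},
                            limits={"cpu": "4", "memory": "8Gi"},
                        ),
                    )
                ],
            )
        ),
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="data-science", body=job)
```

Because the training code ships as a container image, the same job definition runs unchanged on any OpenShift cluster, whether in the data center, at the edge, or in a public cloud.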

Accelerate compute-intensive ML modeling & inferencing jobs

Integration with popular hardware accelerators such as NVIDIA GPUs, through the Red Hat certified GPU operator, means that OpenShift can seamlessly meet the heavy compute requirements of ML modeling, helping select the model with the highest prediction accuracy, and of inferencing jobs as the model encounters new data in production.
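
As a hedged illustration, the sketch below shows a PyTorch training step that runs on a GPU when the pod has one attached (for example, by setting an nvidia.com/gpu resource limit, which the GPU operator makes schedulable) and falls back to CPU otherwise. The model and data are synthetic placeholders.

```python
# Inside a training container on OpenShift: if the pod was granted a GPU,
# PyTorch sees it through the CUDA runtime and places the work there.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 2).to(device)          # toy placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

# One illustrative training step on synthetic data.
features = torch.randn(64, 128, device=device)
labels = torch.randint(0, 2, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(features), labels)
loss.backward()
optimizer.step()
print(f"trained one step on {device}, loss={loss.item():.4f}")
```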

Streamline development of intelligent applications

Extending OpenShift DevOps automation capabilities to the ML lifecycle enables collaboration between data scientists, software developers, and IT operations so that ML models can be quickly integrated into the development of intelligent applications. This helps boost productivity and simplifies lifecycle management for ML-powered intelligent applications.

  • Building model container images with OpenShift Builds.
  • Continuous, iterative development of ML model-powered intelligent applications with OpenShift Pipelines.
  • Continuous deployment automation for ML model-powered intelligent applications with OpenShift GitOps.
  • An image repository to version model container images and microservices with Red Hat Quay.

Key use cases for machine learning on Red Hat OpenShift

OpenShift is helping organizations across various industries accelerate business- and mission-critical initiatives by developing intelligent applications in the hybrid cloud. Example use cases include fraud detection, data-driven diagnostics and treatment, connected cars, autonomous driving, oil and gas exploration, automated insurance quotes, and claims processing.

Red Hat Decision Manager to build intelligent applications

Red Hat Decision Manager is a cloud-native business rules and decisioning platform that allows ML models to be integrated with decision models. These models can then be served and made available for inference as microservices on OpenShift. Integration with monitoring tools like Prometheus and Grafana enables monitoring and management of the (business) performance of ML models in production.
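
To illustrate the serving-and-monitoring pattern described above, here is a minimal sketch of a trained model exposed as a microservice with Prometheus metrics. The Flask app, endpoint paths, metric names, and model file are illustrative assumptions, not the Red Hat Decision Manager integration itself.

```python
# Sketch: serve a persisted model as a containerized microservice and expose
# request metrics that Prometheus can scrape and Grafana can chart.
import joblib
from flask import Flask, jsonify, request
from prometheus_client import CONTENT_TYPE_LATEST, Counter, Histogram, generate_latest

app = Flask(__name__)
model = joblib.load("model.joblib")  # model persisted during the training phase

PREDICTIONS = Counter("predictions_total", "Total prediction requests served")
LATENCY = Histogram("prediction_latency_seconds", "Time spent scoring a request")

@app.route("/predict", methods=["POST"])
def predict():
    PREDICTIONS.inc()
    with LATENCY.time():
        features = request.get_json()["features"]   # expects a JSON list of features
        result = model.predict([features])[0]
    return jsonify({"prediction": int(result)})

@app.route("/metrics")
def metrics():
    # Scraped by Prometheus; Grafana dashboards can then track request volume
    # and latency alongside business-level model performance.
    return generate_latest(), 200, {"Content-Type": CONTENT_TYPE_LATEST}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```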

For additional information on Red Hat Decision Manager, please consult the Red Hat Decision Manager product page, or visit the Red Hat Developer site.

Red Hat Data Services for data management in the ML lifecycle

Red Hat Data Services was built to address petabyte-scale storage requirements across the ML lifecycle, from data ingestion and preparation through ML modeling to the inferencing phase. Included in the Red Hat Data Services portfolio is Red Hat Ceph Storage, an open source software-defined storage system that provides comprehensive support for S3 object, block, and file storage and delivers massive scalability on industry-standard commodity hardware.

For example, you can present scalable Ceph storage to containerized Jupyter notebooks on OpenShift via S3 or persistent volumes.
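
As a small sketch of the S3 path, a containerized notebook could read a training dataset from Ceph's S3-compatible object gateway with boto3; the endpoint, credentials, bucket, and object names below are hypothetical and would come from your cluster's configuration.

```python
# Sketch: load training data from Ceph RADOS Gateway (S3-compatible) into pandas.
import io
import os

import boto3
import pandas as pd

s3 = boto3.client(
    "s3",
    endpoint_url=os.environ["S3_ENDPOINT_URL"],            # Ceph RGW endpoint
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)

obj = s3.get_object(Bucket="training-data", Key="transactions.csv")  # placeholders
df = pd.read_csv(io.BytesIO(obj["Body"].read()))
print(df.shape)
```

Alternatively, the same data can be mounted into the notebook pod as a persistent volume backed by Ceph, whichever better fits the workflow.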

Open Data Hub Project to build a complete ML platform

The Open Data Hub project is a functional architecture based on OpenShift, Red Hat Ceph Storage, Red Hat AMQ Streams, and several upstream open source projects that helps build an open ML platform with the necessary ML tooling.

For additional information on the Open Data Hub project, read the blogs and visit the project site to get started.