Creating agentic AI to deploy ARO cluster using Terraform with Red Hat OpenShift AI on ARO and Azure OpenAI

Learn how to create an artificial intelligence (AI) that can deploy Microsoft Azure Red Hat® OpenShift® clusters using Terraform. Explore prompt-based infrastructure and how it can be used to maintain clusters in your own environment.

Prerequisites for creating agentic AI to deploy ARO clusters using Terraform

5 mins

Agentic AI may seem like an enterprise-sized concept, but for the job we’re trying to accomplish, it can be a powerful tool. By using it in tandem with Terraform, Azure OpenAI, and Red Hat® OpenShift® AI (RHOAI), agentic AI can become another asset in your organization’s overall infrastructure expansion and health.

Terraform is an automation tool, sometimes referred to as an Infrastructure as Code (IaC) tool, that lets us provision infrastructure using declarative configuration files. The agentic AI in this guide will provision clusters based on our MOBB Terraform repository for ARO. It runs on Red Hat OpenShift AI (RHOAI), our platform for managing the AI/ML project lifecycle, and uses the GPT-4o mini model via Azure AI Foundry.

In short, the objective of this guide is to introduce you to prompt-based infrastructure, or "Text-to-Terraform." The agentic AI you create will be able to deploy (and destroy) Azure Red Hat OpenShift clusters based on user prompts specifying, for example, whether the cluster is private or public, the region, the worker node type and count, and the cluster version.
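As a sketch of the idea, the parameters above (public/private, region, worker node type and count, version) can be captured in a simple structure that the agent fills in from the user's prompt and then renders as Terraform input variables. The class name, variable names, and defaults below are illustrative, not taken from the MOBB Terraform repository:

```python
from dataclasses import dataclass

@dataclass
class ClusterSpec:
    # Illustrative defaults; the real defaults live in the Terraform repo.
    private: bool = False
    region: str = "eastus"
    worker_vm_size: str = "Standard_D4s_v3"
    worker_count: int = 3
    version: str = "4.17.27"

    def to_tfvars(self) -> str:
        """Render the spec as a terraform.tfvars fragment."""
        visibility = "Private" if self.private else "Public"
        return "\n".join([
            f'location = "{self.region}"',
            f'api_server_profile = "{visibility}"',
            f'worker_vm_size = "{self.worker_vm_size}"',
            f"worker_node_count = {self.worker_count}",
            f'aro_version = "{self.version}"',
        ])

# A prompt like "deploy a private cluster in westeurope with 4 workers"
# would be parsed by the model into a spec such as:
spec = ClusterSpec(private=True, region="westeurope", worker_count=4)
print(spec.to_tfvars())
```

Any field the user's prompt does not mention falls back to the default, mirroring how the guide distinguishes its defaults from the repository's.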

The relevant sections briefly specify the prompt parameters and highlight the differences between the default parameters in this guide and those in the Terraform repository.

Note that since a real deployment can be costly, this guide uses a simulated test with a mock toggle: set it to True for mock results or False for a real cluster deployment.
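One way such a toggle can be wired up (a minimal sketch, not the guide's actual notebook code): when the mock flag is True, the deploy function returns a canned result, and only when it is False does it shell out to Terraform:

```python
import subprocess

MOCK = True  # set to False for a real `terraform apply`

def deploy_cluster(workdir: str = ".") -> str:
    """Run (or simulate) a Terraform deployment of an ARO cluster."""
    if MOCK:
        # Simulated result; no cloud resources are created or billed.
        return "MOCK: terraform apply completed, cluster ready"
    # Real deployment: costly, and an ARO cluster takes on the order
    # of 30-45 minutes to provision.
    result = subprocess.run(
        ["terraform", "apply", "-auto-approve"],
        cwd=workdir, capture_output=True, text=True, check=True,
    )
    return result.stdout

print(deploy_cluster())
```

Keeping the flag at True lets you exercise the whole prompt-to-plan pipeline end to end before spending anything on real infrastructure.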

What will you learn?

  • Ensuring your environment is ready for the Agentic AI implementation

Prerequisites

There are three major items needed before proceeding with this guide:

An Azure Red Hat OpenShift (ARO) cluster (version 4.16 or later)

You can deploy it manually or using Terraform. This guide in particular was tested on ARO 4.17.27 with the Standard_D16s_v3 instance size for both the control plane and the worker nodes.

Note that you do not need a GPU for this guide.

Azure OpenAI model

You can use any model you like for the parser. However, since we are running the notebook on an ARO cluster, we leverage the Azure OpenAI service, in this case with GPT-4o mini as a lightweight and cost-efficient alternative. Refer to the Bonus section in this tutorial to deploy it, and be sure to create a GPT-4o mini deployment instead of the GPT-4 deployment shown in that tutorial.
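For orientation, a minimal sketch of calling a GPT-4o mini deployment as the parser via the `openai` Python SDK is shown below. The system prompt, environment variable names, and deployment name are placeholder assumptions, not values from this guide:

```python
import os

# Hypothetical system prompt steering the model to emit only cluster parameters.
PARSER_SYSTEM_PROMPT = (
    "Extract ARO cluster parameters (visibility, region, worker VM size, "
    "worker count, version) from the user's request and reply as JSON."
)

def build_parser_messages(user_prompt: str) -> list[dict]:
    """Build the chat messages sent to the GPT-4o mini deployment."""
    return [
        {"role": "system", "content": PARSER_SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]

# Only call the service when credentials are configured in the environment.
if os.environ.get("AZURE_OPENAI_API_KEY"):
    from openai import AzureOpenAI  # pip install openai

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # use your own deployment name here
        messages=build_parser_messages(
            "Deploy a private ARO 4.17 cluster in westus with 3 workers"
        ),
    )
    print(response.choices[0].message.content)
```

Because GPT-4o mini only parses short prompts here rather than generating long outputs, it keeps per-request costs low, which is why the guide prefers it over GPT-4.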

RHOAI operator

You can install this operator using the console or the CLI, per Section 3 of the respective tutorials. Once the operator is installed, be sure to create a DataScienceCluster instance, wait a few minutes for the changes to take effect, and then launch the RHOAI dashboard for the next step. This tutorial was tested on RHOAI version 2.19.1.


This learning path is for operations teams or system administrators

Developers may want to check out Getting started with Red Hat OpenShift AI on developers.redhat.com. 

© 2025 Red Hat