Using OpenShift Lightspeed with ARO
This content is authored by Red Hat experts, but has not yet been tested on every supported configuration.
This guide walks through setting up OpenShift Lightspeed backed by Azure AI Foundry services for the LLM.
Prerequisites
- An ARO cluster is already installed.
- Permissions to use/register Microsoft Cognitive Services
Command line tools used in this guide:
- az cli
- jq
Set up environment
Create environment variables.
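A minimal sketch of the variables used in the rest of this guide; all of the names and the region below are assumptions, so substitute your own values.

```shell
# All values below are assumptions; substitute your own.
export AZR_RESOURCE_GROUP=openshift-lightspeed   # Azure resource group
export AZR_RESOURCE_LOCATION=eastus              # Azure region
export AZR_COGNITIVE_ACCOUNT=lightspeed-ai       # Cognitive Services account name
export AZR_DEPLOYMENT_NAME=gpt-4o-mini           # model deployment name
```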
Install the OpenShift Lightspeed Operator from OperatorHub in the OpenShift console.
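If you prefer the CLI, the console installation can be approximated with a Subscription. The channel and package name below are assumptions; verify them against the operator's OperatorHub entry before applying.

```shell
oc create namespace openshift-lightspeed
cat << EOF | oc apply -f -
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: openshift-lightspeed
  namespace: openshift-lightspeed
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: lightspeed-operator
  namespace: openshift-lightspeed
spec:
  channel: stable            # assumed channel; check OperatorHub
  name: lightspeed-operator  # assumed package name; check OperatorHub
  source: redhat-operators
  sourceNamespace: openshift-marketplace
EOF
```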
Prepare the LLM for serving OpenShift Lightspeed
Create the Azure Cognitive Service.
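A sketch of the account creation with the az CLI, assuming the environment variables defined earlier; `--kind OpenAI` requests an Azure OpenAI account, and the S0 SKU is an assumption.

```shell
# Create the resource group (skip if it already exists).
az group create \
  --name "$AZR_RESOURCE_GROUP" \
  --location "$AZR_RESOURCE_LOCATION"

# Create the Azure OpenAI (Cognitive Services) account.
az cognitiveservices account create \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --location "$AZR_RESOURCE_LOCATION" \
  --kind OpenAI \
  --sku S0
```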
Create the deployment. In this case we will use the model gpt-4o-mini.
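A sketch of the model deployment; the model version and SKU below are assumptions, so check which versions are available in your region first.

```shell
# Deploy gpt-4o-mini; the version and SKU are assumptions for illustration.
az cognitiveservices account deployment create \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --deployment-name "$AZR_DEPLOYMENT_NAME" \
  --model-name gpt-4o-mini \
  --model-version "2024-07-18" \
  --model-format OpenAI \
  --sku-name Standard \
  --sku-capacity 1
```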
Validate the deployment exists.
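For example, show the deployment just created:

```shell
az cognitiveservices account deployment show \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --deployment-name "$AZR_DEPLOYMENT_NAME" \
  -o table
```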
Retrieve the endpoint URL and API key.
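Both values can be read from the account with the az CLI:

```shell
# Endpoint URL of the Cognitive Services account.
AZURE_ENDPOINT=$(az cognitiveservices account show \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --query properties.endpoint -o tsv)

# Primary API key.
AZURE_API_KEY=$(az cognitiveservices account keys list \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --query key1 -o tsv)
```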
Create an app registration (service principal) to authenticate against the AI service.
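A sketch using `az ad sp create-for-rbac`, scoped to the Cognitive Services account; the service-principal name and role assignment are assumptions for illustration.

```shell
# Resource ID of the Cognitive Services account, used as the role scope.
ACCOUNT_ID=$(az cognitiveservices account show \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --query id -o tsv)

# Create the service principal (name and role are assumptions).
SP=$(az ad sp create-for-rbac --name lightspeed-sp \
  --role "Cognitive Services User" \
  --scopes "$ACCOUNT_ID")

# Extract the credentials with jq.
CLIENT_ID=$(echo "$SP" | jq -r .appId)
CLIENT_SECRET=$(echo "$SP" | jq -r .password)
TENANT_ID=$(echo "$SP" | jq -r .tenant)
```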
Prepare the credentials, since they must be base64-encoded before being stored in the secret.
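The standard `base64` tool handles the encoding; `printf` avoids the trailing newline that `echo` would add, which would corrupt the encoded value. The placeholder value is hypothetical.

```shell
# Hypothetical placeholder; substitute the real credential value.
CLIENT_ID="00000000-0000-0000-0000-000000000000"

# printf (not echo) so no trailing newline is encoded.
CLIENT_ID_B64=$(printf '%s' "$CLIENT_ID" | base64)
echo "$CLIENT_ID_B64"
```

Repeat the same encoding for each value that goes into the secret.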
OpenShift Lightspeed Configuration
Create the secret with the credentials and the OLSConfig.
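A sketch of both resources. The secret key names and the OLSConfig fields below are assumptions based on the operator's Azure OpenAI provider type; verify them against the OpenShift Lightspeed documentation before applying.

```shell
# Secret key names are assumptions; check the operator docs.
oc create secret generic azure-api-keys -n openshift-lightspeed \
  --from-literal=client_id="$CLIENT_ID" \
  --from-literal=client_secret="$CLIENT_SECRET" \
  --from-literal=tenant_id="$TENANT_ID"

# OLSConfig field names are assumptions; check the operator docs.
cat << EOF | oc apply -f -
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
    - name: azure_openai
      type: azure_openai
      url: ${AZURE_ENDPOINT}
      credentialsSecretRef:
        name: azure-api-keys
      deploymentName: ${AZR_DEPLOYMENT_NAME}
      models:
      - name: gpt-4o-mini
  ols:
    defaultProvider: azure_openai
    defaultModel: gpt-4o-mini
EOF
```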
Wait a few minutes until the Lightspeed server pod is running.
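The pod status can be watched with oc; the deployment name below is an assumption.

```shell
oc get pods -n openshift-lightspeed

# Or block until the server deployment is ready (name is an assumption):
oc rollout status deployment/lightspeed-app-server -n openshift-lightspeed
```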
Test
To verify the setup, try querying OpenShift Lightspeed in the ARO console.
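The console chat widget is the simplest smoke test. Alternatively, the API can be queried directly; the service name, port, and path below are assumptions.

```shell
# Service name, port, and API path are assumptions; verify before use.
oc -n openshift-lightspeed port-forward svc/lightspeed-app-server 8443:8443 &
TOKEN=$(oc whoami -t)
curl -sk https://localhost:8443/v1/query \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"query": "How do I expose a service with a route?"}'
```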

Cleanup
Delete OpenShift configurations
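For example, assuming the resource names used earlier in this guide:

```shell
# Names assume the resources created earlier in this guide.
oc delete olsconfig cluster
oc delete secret azure-api-keys -n openshift-lightspeed
```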
Uninstall the OpenShift Lightspeed Operator from OperatorHub in the OpenShift console.
Delete the Cognitive Services account and the service principal.
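A sketch of the Azure-side cleanup; the purge step releases the account name immediately, since Azure OpenAI accounts are soft-deleted by default.

```shell
az cognitiveservices account delete \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP"

# Purge the soft-deleted account so the name is released immediately.
az cognitiveservices account purge \
  --name "$AZR_COGNITIVE_ACCOUNT" \
  --resource-group "$AZR_RESOURCE_GROUP" \
  --location "$AZR_RESOURCE_LOCATION"

# Remove the service principal created earlier.
az ad sp delete --id "$CLIENT_ID"
```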