Building an LLM Cost and Performance Dashboard with Red Hat OpenShift AI on ROSA and Amazon Bedrock

1. Introduction

As LLM usage increases in the enterprise, few realize that every LLM API call carries two hidden costs: time and money. While data scientists might debate accuracy, infrastructure engineers need to know whether that 2-second response time will scale and whether $0.015 per thousand tokens will blow their quarterly budget, among other concerns. In this guide, we will build a simple cost and performance dashboard for Amazon Bedrock models using Red Hat OpenShift AI (RHOAI), our platform for managing the AI/ML project lifecycle, running on a Red Hat OpenShift Service on AWS (ROSA) cluster.
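To make those two costs concrete before we start building, here is a minimal Python sketch that times a single Bedrock call and estimates its cost from the token counts returned by the Converse API. The model ID and the per-1K-token prices are placeholder assumptions; substitute the values for the model and region you actually plan to track.

```python
# Minimal sketch: time one Amazon Bedrock call and estimate its cost.
# Assumes AWS credentials with Bedrock model access are already configured.
import time

import boto3

# Placeholder prices (USD per 1,000 tokens) -- look up the current numbers
# for your model and region on the Amazon Bedrock pricing page.
PRICE_PER_1K_INPUT_TOKENS = 0.003
PRICE_PER_1K_OUTPUT_TOKENS = 0.015

# Example model ID; replace with the model you intend to monitor.
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

client = boto3.client("bedrock-runtime", region_name="us-east-1")

start = time.perf_counter()
response = client.converse(
    modelId=MODEL_ID,
    messages=[{"role": "user", "content": [{"text": "Give me one sentence about ROSA."}]}],
)
latency_s = time.perf_counter() - start

usage = response["usage"]  # input/output token counts reported by Bedrock
cost = (
    usage["inputTokens"] / 1000 * PRICE_PER_1K_INPUT_TOKENS
    + usage["outputTokens"] / 1000 * PRICE_PER_1K_OUTPUT_TOKENS
)
print(f"latency: {latency_s:.2f}s  tokens: {usage['totalTokens']}  est. cost: ${cost:.5f}")
```

These per-call measurements, latency and estimated token cost, are the raw material that a dashboard like the one in this guide aggregates per model over time.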

