This is a guest post written by Munishpal Makhija, VMware - Senior Member of Technical Staff, Dean Lewis, VMware - Cloud Solutions Architect, and Christian Heidenreich, Red Hat - Product Manager.

In this blog post, we will cover how to configure Red Hat OpenShift Logging to forward logs from the ClusterLogging instance to an external third-party system, in this case, VMware vRealize Log Insight Cloud.

Architecture

The Red Hat OpenShift Logging solution must be configured to access the logs and forward them to third-party logging tools. Red Hat OpenShift Logging provides functionality to collect, forward, store, and provide a user interface into log data for debugging, event corroboration, and similar purposes. Under the hood, it bundles the following technologies:

  • FluentD - provides log collection and forwarding capabilities
  • Elasticsearch - provides storage capabilities, keeping logs for a specific period of time
  • Kibana - provides exploration capabilities to search for specific logs

However, to ship the logs to an external system, you will only need to configure the FluentD service.

To forward the logs from the internal trusted services, we will use the new Log Forwarding API, which is GA in OpenShift 4.6 and later.

This setup will provide us with the architecture below. We will deploy the trusted namespace “openshift-logging” and use the Operator to provide a Log Forwarding API configuration that sends logs to a third-party service.

For vRealize Log Insight Cloud, we will run a standalone FluentD instance inside the cluster to forward to the cloud service. However, running the FluentD instance external to the cluster is also a supported architecture.

The log types are one of the following:

  • Application - Container logs generated by user applications running in the cluster, except infrastructure container applications.
  • Infrastructure - Container logs from pods that run in the openshift*, kube*, or default projects and journal logs sourced from the node file system.
  • Audit - Logs generated by the node audit system (auditd) and the audit logs from the Kubernetes API server and the OpenShift API server.

vRealize Log Insight Cloud Overview

VMware vRealize® Log Insight Cloud™ offers IT teams unified visibility across private, hybrid, and native public clouds by adding structure to unstructured log data, providing intuitive dashboards, and leveraging machine learning for faster troubleshooting. For more details, click here.

Prerequisites

  • VMware vRealize Log Insight Cloud instance setup with Administrator access
  • Red Hat OpenShift Cluster v4.6 or later deployed (Install Guide)
    • with outbound connectivity for containers
  • Download the configuration file bundle from this VMware Code Repository

https://code.vmware.com/samples/7515/how-to-configure-red-hat-openshift-to-forward-logs-to-vmware-vrealize-log-insight-cloud

As per the above diagram, we will create a namespace and deploy a FluentD service inside the cluster. This will handle the logs forwarded from the OpenShift Logging instance and send them to the Log Insight Cloud instance.

Deploy the Standalone FluentD Instance to Forward Logs to vRealize Log Insight Cloud

Creating a vRealize Log Insight Cloud API Key

First, we will create an API key for sending data to our cloud instance. In the vRealize Log Insight Cloud console:

  1. Expand Configuration on the left-hand navigation pane.
  2. Select “API Keys.”
  3. Click the “New API Key” button.

Give your API key a suitable name and click “Create.”


You will be given your API URL and Key in a dialog box; you will also be able to see the Key afterward in the list.

The API URL differs by region. The provided fluent.conf includes the URL for the U.S. region, so be sure to update the fluent.conf file for your region.
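For orientation, the forwarding section of fluent.conf looks something like the sketch below. It uses the fluent-plugin-vmware-log-intelligence output plugin; treat the endpoint URL, header names, and buffer settings as illustrative assumptions, and use the file from the downloaded bundle as your source of truth:

```
<match **>
  @type vmware_log_intelligence
  # U.S.-region ingestion endpoint (assumed); replace with the URL for your region
  endpoint_url https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream
  verify_ssl true
  <headers>
    Content-Type application/json
    # Paste the API key created above in place of <YOUR_API_KEY>
    Authorization Bearer <YOUR_API_KEY>
  </headers>
  <buffer>
    @type memory
    flush_interval 3s
  </buffer>
</match>
```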


Move into your git repo folder from the prerequisites listed above and edit the fluent.conf file to add in your API key.

At this point, you will also want to add in your Cluster-ID. This can be a name of your choosing for the cluster, or you can get the OpenShift Cluster-ID using the following command:

oc get clusterversion -o jsonpath='{.items[].spec.clusterID}{"\n"}'
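The Cluster-ID (and the environment tag searched for later in this post) are typically attached to every record with a record_transformer filter in fluent.conf. The field names below are illustrative assumptions; align them with the fields used in the bundled file:

```
<filter **>
  @type record_transformer
  <record>
    # Illustrative field names; match them to the bundled fluent.conf
    cluster_id my-openshift-cluster
    environment openshift
  </record>
</filter>
```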

Deploy FluentD Standalone Instance to the OpenShift Cluster

Log onto your OpenShift cluster via your terminal, and deploy the namespace:

oc apply -f openshift_vrealize_loginsight_cloud/01-vrli-fluentd-ns.yaml

This will create a namespace (a.k.a. an OpenShift project) called “vmware-system-vrlic.”

Create a configmap for the FluentD container service so that it can receive logs from the Cluster Logging instance and forward them using the FluentD vmware_log_intelligence plugin.

oc create configmap vrlicloud-fluent-config --from-file=openshift_vrealize_loginsight_cloud/fluent.conf -n vmware-system-vrlic

Create the FluentD deployment:

oc create -f openshift_vrealize_loginsight_cloud/02-vrli-fluentd-deployment.yaml

Once deployed, take note of the Cluster-IP assigned to the service. We will use this in the Log Forwarding API configuration:

oc get svc -n vmware-system-vrlic
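If you prefer to script this step, the Cluster-IP can be captured with a jsonpath query and turned into the forward URL used later in the ClusterLogForwarder. The live lookup is commented out because it depends on your cluster (and the service name shown is an assumption); the IP below is just a placeholder:

```shell
# On a live cluster, fetch the ClusterIP of the standalone FluentD service
# (service name assumed; confirm it with `oc get svc -n vmware-system-vrlic`):
#   CLUSTER_IP=$(oc get svc vrli-fluentd -n vmware-system-vrlic -o jsonpath='{.spec.clusterIP}')
CLUSTER_IP="172.30.118.7"   # placeholder value for illustration
echo "tcp://${CLUSTER_IP}:24224"
```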

Installing and Configuring the ClusterLogging Instance

On your OpenShift cluster, we need to configure the ClusterLogging instance, which will deploy the required services into a privileged namespace where they have the correct security contexts to access the system.

In this blog, we will continue the setup using the OpenShift Web Console; alternatively, to use the CLI, please follow the OpenShift documentation.

In the Web Console, go to the OperatorHub page.

Filter using the keyword “logging” and install the Red Hat OpenShift Logging Operator.

Install the Logging Operator using the default settings. This will create the privileged “openshift-logging” namespace if it does not already exist.

Go to the Installed Operators page and click on “Cluster Logging” Operator:

We will now create a Cluster Logging instance; in this blog post, we will only deploy the FluentD collector. To deploy the full OpenShift Logging stack, please see the documentation.

Click the Red Hat OpenShift Logging tab, and then the blue “Create ClusterLogging” button:

Change your view to YAML, and edit the YAML file to what is shown below:

apiVersion: logging.openshift.io/v1
kind: ClusterLogging
metadata:
  namespace: openshift-logging
  name: instance
spec:
  collection:
    logs:
      fluentd: {}
      type: fluentd
  managementState: Managed

Click “Create,” and this will deploy the FluentD pods.

Create the Log Forwarding API Configuration

On your Cluster Logging Operator page, click the “Cluster Log Forwarder” tab, then the blue “Create ClusterLogForwarder” button.

Change to YAML view and provide the configuration as below, substituting your Cluster-IP of the Standalone FluentD deployment we created earlier:

apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  outputs:
    - name: fluentd-server-insecure
      type: fluentdForward
      url: 'tcp://<CLUSTER-IP>:24224'
  pipelines:
    - inputRefs:
        - application
        - infrastructure
        - audit
      name: to-vrlic
      outputRefs:
        - fluentd-server-insecure

The spec is broken down into two sections. The Outputs configuration defines where you are going to send your logs; you can define multiple outputs.

The second part is your Pipeline configuration, which defines which log types you are interested in forwarding (inputRefs) and which Output configurations will be used (outputRefs). You can find further examples within the OpenShift documentation here.
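As a sketch of how multiple outputs and pipelines can be combined, the following hypothetical variant (the second output name and address are invented for this example) keeps everything flowing to the standalone FluentD instance while additionally sending only application logs to a second fluentdForward destination:

```yaml
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  outputs:
    - name: fluentd-server-insecure
      type: fluentdForward
      url: 'tcp://<CLUSTER-IP>:24224'
    - name: app-logs-archive          # hypothetical second output
      type: fluentdForward
      url: 'tcp://<OTHER-IP>:24224'
  pipelines:
    - inputRefs:                      # all three log types to vRLI Cloud
        - application
        - infrastructure
        - audit
      name: to-vrlic
      outputRefs:
        - fluentd-server-insecure
    - inputRefs:                      # application logs only
        - application
      name: app-archive
      outputRefs:
        - app-logs-archive
```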

Click “Create,” and soon you should have Logs from your OpenShift environment hitting your vRealize Log Insight Cloud Instance:

You can also filter the search by the fields set in your fluent.conf file. Below, I have searched for environment type “openshift,” and in the log you can see the Cluster-ID that I set manually, as well as the log type being identified as Kubernetes.

You can get a sample dashboard for OpenShift from here.

By using vRealize Log Insight Cloud with Red Hat OpenShift, you can consume a powerful log analytics engine that is easy to integrate into your platform. Getting started is quick and simple, and customers of VMware Cloud on AWS already have access to vRealize Log Insight Cloud today, which can be used for the solution discussed in this blog post.

For More Information

Getting Started with vRealize Log Insight Cloud

For a free trial, you can click here or reach out to your account team. 

To learn more about vRealize Log Insight Cloud, you can view the videos below and visit the product page here.