
Collecting one-time event data with Deephaven and Prometheus

Jake Mulford
Integrating Deephaven and Prometheus Part 2: Ingesting Webhooks

In our previous Prometheus post, we discussed how to pull data from Prometheus via its REST API into Deephaven. This allows us to display trends of data from Prometheus over time. This works very well for collecting our metrics.

In this post, we consider how to similarly track our pre-defined Prometheus alerts.


We could add our alert definitions to the data-pulling logic, but then we would be duplicating those definitions. That's not ideal. Worse, our precision would depend on how often we pull data, so our alert timestamps may be inaccurate.

By looking at Prometheus's API documentation, we can find the alerts API. At first glance, this may appear to be a good solution. However, the alerts API only lists currently active alerts and says nothing about resolutions. Because it does not report what happened between data pulls, an alert could fire and resolve entirely between two pulls, and we would never see it.
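To make the limitation concrete, here is a minimal sketch of what polling gives us. The payload shape follows Prometheus's /api/v1/alerts response format; the sample values and the helper name are ours:

```python
# A trimmed example of what Prometheus's /api/v1/alerts endpoint returns.
# Only alerts active at poll time appear -- an alert that fired and
# resolved between two polls never shows up in either response.
sample_response = {
    "status": "success",
    "data": {
        "alerts": [
            {
                "labels": {
                    "alertname": "HighCPU",
                    "instance": "localhost:9090",
                    "job": "prometheus",
                },
                "state": "firing",
                "activeAt": "2021-09-01T12:00:00.000Z",
                "value": "97.5",
            }
        ]
    },
}

def active_alert_names(response):
    """Return the names of the alerts active at poll time (helper name is ours)."""
    return [a["labels"]["alertname"] for a in response["data"]["alerts"]]

print(active_alert_names(sample_response))  # ['HighCPU']
```

Nothing in this response tells us when an alert resolved, which is exactly the gap webhooks fill.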

Pulling data just doesn't seem to be the right solution for this problem. So what can we do instead? It turns out that Prometheus supports webhook alerts, allowing Prometheus to send HTTP notifications to our server. These can be used to get real-time notifications regarding our pre-defined Prometheus alerts. We can deserialize the webhook's data, and store it within Deephaven without needing to duplicate our alert definitions and without needing to pull data!

Come back later for a follow-up on how we can join Prometheus data for our metrics and alerts!

Webhook data ingestion

As described in the previous Prometheus post, Deephaven's DynamicTableWriter class is one option for real-time data ingestion. We know how to pull data from Prometheus's REST API, and now we will go over how to do this with data that comes in through a webhook.

To ingest the Prometheus webhook data, we create a web server route that can receive the webhook. When the webhook event is received, the server deserializes the data and writes it to Deephaven.

In our case, we use a Python Flask web server to handle the webhook, and we use Deephaven's Python client to write the data to a Deephaven table. The Prometheus alert webhook schema provides a lot of data we can grab; for this example, we only care about the alert objects defined in the alerts list of the webhook.

```python
from flask import Flask, request
from pydeephaven import Session

app = Flask(__name__)
session = Session()  # connects to the Deephaven server (localhost:10000 by default)

@app.route('/', methods=['POST'])
def receive_alert():
    request_json = request.json

    # For every alert, build the method call to update the alerts table
    for alert in request_json["alerts"]:
        status = alert["status"]
        date_time_string = None

        # Dates come in the format yyyy-mm-ddThh:mm:ss.sssZ; we need to
        # convert to yyyy-mm-ddThh:mm:ss.sss TZ for Deephaven
        if status == "firing":
            date_time_string = alert["startsAt"][:-1] + " UTC"
        elif status == "resolved":
            date_time_string = alert["endsAt"][:-1] + " UTC"

        job = alert["labels"]["job"]
        instance = alert["labels"]["instance"]
        alert_identifier = alert["labels"]["alertname"]

        # Execute the alert table update in Deephaven
        session.run_script(
            f'update_alerts_table("{date_time_string}", "{job}", '
            f'"{instance}", "{alert_identifier}", "{status}")'
        )
    return "Request received"
```
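The date handling above is worth a closer look: Prometheus sends RFC 3339 timestamps like 2021-09-01T12:00:00.000Z, and stripping the trailing Z and appending " UTC" produces a string Deephaven can parse. A standalone sketch of that conversion (the sample timestamp is made up):

```python
def to_deephaven_datetime(rfc3339_string):
    # Drop the trailing "Z" and tag the timezone explicitly for Deephaven.
    return rfc3339_string[:-1] + " UTC"

print(to_deephaven_datetime("2021-09-01T12:00:00.000Z"))
# 2021-09-01T12:00:00.000 UTC
```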

This code grabs the webhook payload (request_json = request.json) and loops through the alerts list (for alert in request_json["alerts"]:). For every alert, it grabs the status, time, job, instance, and identifier, formats the timestamp, and writes the row to Deephaven using the session.run_script() method.
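Note that this relies on a function named update_alerts_table already being defined in the Deephaven server session, for example a script built around DynamicTableWriter as covered in the previous post. A plain-Python stand-in showing the signature the Flask route expects; this is a hypothetical sketch where a list plays the role of the Deephaven table:

```python
# Hypothetical stand-in: the real update_alerts_table lives in the Deephaven
# session and writes a row via DynamicTableWriter; here a list plays that role.
alert_rows = []

def update_alerts_table(date_time_string, job, instance, alert_identifier, status):
    alert_rows.append((date_time_string, job, instance, alert_identifier, status))

update_alerts_table("2021-09-01T12:00:00.000 UTC", "prometheus",
                    "localhost:9090", "HighCPU", "firing")
print(len(alert_rows))  # 1
```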

This allows us to accept the webhooks from Prometheus, and push the data to our table in Deephaven! With the proper configuration of Prometheus AlertManager rules, we can use this server to receive notifications when our alerts are firing, and when they are resolved.
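For reference, pointing Alertmanager at the Flask server takes only a small receiver entry in alertmanager.yml. This is a minimal sketch; the host and port are assumptions based on Flask's defaults:

```yaml
route:
  receiver: "deephaven-webhook"

receivers:
  - name: "deephaven-webhook"
    webhook_configs:
      # send_resolved: true makes Alertmanager also notify on resolution
      - url: "http://localhost:5000/"
        send_resolved: true
```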

Sample app

The Prometheus alerts sample app demonstrates how to ingest webhook data from Prometheus and store it in Deephaven. There are many ways to gather data via webhooks, and this project demonstrates just one of them. Deephaven's real-time data analysis capabilities can then be applied to the ingested data, extending what you can do with your data and Deephaven.

This project is available to everyone, so feel free to run this locally and modify the alert rules, data tables, or any other configuration to see different things you can accomplish using Deephaven.