Filter Logs

In this recipe, we'll walk through how to filter noisy logs from your data pipeline.

Guide setup

This guide assumes you have created a Datable.io account.

In this example, we will assume the log data in our current transformation step has gone through a standardization process to align it with the OpenTelemetry specification.
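For reference, a standardized record might look something like the sketch below. This is an illustrative example only: the field names follow the OpenTelemetry log data model, and the service name, attributes, and body are made up; your actual records may carry additional fields.

```javascript
// Hypothetical example of a record after OpenTelemetry standardization.
// All concrete values here are invented for illustration.
const record = {
  resource: { "service.name": "checkout-service" },
  attributes: { "http.status_code": 500 },
  severityText: "ERROR",
  body: "payment gateway timeout",
};

// Metadata accompanying the record in a transform step.
const metadata = { timestamp: Date.now(), datatype: "logs" };
```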

Sample Code

First, we create a new pipeline, or a new transformation step if we're adding to an existing pipeline.

You will see the following pre-populated code in your transform step:

javascript
/***
* You have access to the following inputs:
*  - `metadata`:  { timestamp, datatype }
*    -> datatype is a string, and can be 'logs' or 'traces'
*  - `record`:    { resource, body, ... }
*/

// These are the key attributes of an OpenTelemetry-formatted record
const { attributes, resource, body } = record
const { timestamp, datatype } = metadata

// Here we only allow records tagged as 'logs' to pass through.
if (datatype !== 'logs') return null

Filter debug logs

Debug logs can create a lot of noise in your telemetry, providing no value while adding to storage and compute bills.

To remove debug logs from our data stream, we use the following code:

javascript
if (datatype !== 'logs') return null

if (record.severityText === "DEBUG") return null

return record;

And just like that, we've filtered out our debug logs! Check out our recipe on sampling logs to see more sophisticated filtering logic in action.
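As a taste of what more sophisticated logic can look like, here is a sketch (an illustration, not part of this recipe) that drops anything below WARN severity using the numeric `severityNumber` field from the OpenTelemetry log data model, where the WARN range begins at 13:

```javascript
// Sketch: keep only logs at WARN severity or above.
// Per the OpenTelemetry log data model, severityNumber ranges are:
// DEBUG 5-8, INFO 9-12, WARN 13-16, ERROR 17-20, FATAL 21-24.
function filterBelowWarn(metadata, record) {
  if (metadata.datatype !== "logs") return null;
  // Treat a missing severityNumber as 0, i.e. filter it out.
  if ((record.severityNumber ?? 0) < 13) return null;
  return record;
}
```

Inside a transform step you would express the same checks as top-level `return` statements, as in the snippets above.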