Quick Start

Datable supports a wide range of data sources and destinations. For this quick start, we will ingest data from a Datadog agent, write a simple transformation, and forward the transformed data to Datadog. If you don't have a Datadog agent to configure, don't worry - Datable comes with demo data. Just skip the first step.

Forward data from a Datadog agent

Datadog data can be imported into Datable by modifying the Datadog agent configuration, either in the configuration file or via environment variables.

Environment variables

yaml
DD_LOGS_CONFIG_DD_URL: YOUR_DATABLE_HOST.dtbl.io:5301
DD_APM_DD_URL: YOUR_DATABLE_HOST.dtbl.io:5301

YAML configuration

To configure these settings in the Datadog agent configuration file, add/modify the following values, ensuring that any parent keys are also uncommented and not duplicated.

yaml
logs_config:
	logs_dd_url: YOUR_DATABLE_HOST.dtbl.io:5301
apm_config:
	apm_dd_url: YOUR_DATABLE_HOST.dtbl.io:5301

Create a pipeline

  1. Navigate to the "Pipelines" page via the left-hand sidebar. This is the directed acyclic graph (DAG) builder interface, where you can add transformation and export steps to a pipeline.
  2. Click the "New pipeline" button.

Demo data is automatically added to the pipeline. Now, you can begin writing functions to transform your data.

Write a transformation

  1. Click the "Code" node with the "logs" marker to enter the transformation step code editor.

The code editor pre-populates the following code:

javascript

/***
* You have access to the following inputs:
*  - `metadata`:  { timestamp, datatype }
*    -> datatype is a string, and can be 'logs' or 'traces'
*  - `record`:    { resource, body, ... }
*/

// These are the key attributes of an OpenTelemetry-formatted record
const { attributes, resource, body } = record
const { timestamp, datatype } = metadata

// Here we only allow records tagged as 'logs' to pass through;
// by returning null, we effectively filter out non-log data.
if (datatype !== 'logs') return null

return record

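For reference, the records flowing into this step follow the OpenTelemetry log shape described in the comments above. A record and its metadata might look roughly like the sketch below; the field values are made up for illustration and are not actual demo data.

javascript
// Illustrative shape only -- the values below are made up.
const exampleRecord = {
  resource: { 'service.name': 'checkout-service' },
  attributes: { 'http.status_code': 200 },
  severityText: 'INFO',
  body: 'GET /cart completed in 42ms',
}

const exampleMetadata = {
  timestamp: 1718041200000,
  datatype: 'logs',
}

The severityText field is the one we will filter on next.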

From here, let's add a line of code to remove "INFO" level logs.

javascript
const { attributes, resource, body } = record
const { timestamp, datatype } = metadata

if (datatype !== 'logs') return null
if (record.severityText === 'INFO') return null
return record

INFO-level logs will now be dropped before heading downstream. Click "Save" to save your changes.
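Optionally, you can make the severity check more defensive: severityText is a free-form string in the OpenTelemetry log model, so the exact casing depends on what your sources emit. The sketch below, which assumes only the same record and metadata inputs shown earlier, also catches lowercase or missing severity labels:

javascript
const { datatype } = metadata

if (datatype !== 'logs') return null

// Normalize the severity label before comparing, so 'info', 'Info',
// and 'INFO' are all dropped. severityText may also be missing,
// in which case we fall back to an empty string and keep the record.
const severity = (record.severityText || '').toUpperCase()
if (severity === 'INFO') return null

return record

For the demo data, the simple equality check above is all you need.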

Add a destination

  1. Click the "Add step" button that appears after the log transformation step.
  2. Click "Send to" in the right-hand nav.
  3. From here, you can select your destination of choice and fill out the requisite information. For demo data, click "HTTP".
  4. For demo data, enter the following URL in the "Logs URL" field: www.example.com

Click "Save" to save this step change.

Save the pipeline

Click the "Save pipeline" button at the top of the DAG editor. Add your pipeline name and click "Save and enable".

And that's it! Your data is flowing to your destination of choice, without INFO-level logs.

While this example is simple, Datable supports any JavaScript transformations, regardless of complexity.
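As one illustration, a step that scrubs email addresses out of log bodies and tags the records it touched might look roughly like the following. It is only a sketch: the regex is deliberately simple, and the "redacted" attribute name is made up for this example rather than being any Datable convention.

javascript
const { attributes, body } = record
const { datatype } = metadata

// Pass non-log data through untouched.
if (datatype !== 'logs') return record

// Replace anything that looks like an email address in the log body.
// This pattern is intentionally simple and purely illustrative.
const emailPattern = /[\w.+-]+@[\w-]+\.[\w.-]+/g
if (typeof body === 'string') {
  const scrubbed = body.replace(emailPattern, '[REDACTED]')
  if (scrubbed !== body) {
    record.body = scrubbed
    // Tag the record so downstream steps can tell it was modified.
    record.attributes = { ...attributes, redacted: true }
  }
}

return record

Every step follows the same contract as the demo code: return a (possibly modified) record to pass it on, or null to drop it.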