Filebeat 
Datable integrates with Filebeat to collect, parse, and forward log data from your servers and applications. Filebeat is a lightweight shipper for forwarding and centralizing log data.
How it works 
Filebeat monitors specified log files or locations, collects log events, and forwards them to Datable using the Elasticsearch output protocol. The integration supports various log formats and can handle multiline events, JSON parsing, and field processing.
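For orientation, a Datable-bound output section typically looks like the sketch below. This is a minimal example, not the authoritative config: the actual host, port, and API key header are generated for you during onboarding (Step 2 below), and the placeholder values here mirror the examples later on this page.
```yaml
# Minimal sketch of a Datable-bound output using the Elasticsearch
# output protocol. Host, port, and API key are placeholders; use the
# values generated during onboarding.
output.elasticsearch:
  hosts: ["https://YOUR_DATABLE_HOST:5201"]
  headers:
    datable-api-key: ${DATABLE_API_KEY}
```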
Prerequisites 
- Filebeat 7.x or 8.x installed on your systems
- Network connectivity to Datable endpoints
- Access to modify Filebeat configuration
Setup Instructions 
Step 1: Install Filebeat 
Choose your platform and install Filebeat.
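For example, on a Debian or Ubuntu host a direct package install looks like this (the version is illustrative; use whichever supported 7.x or 8.x release you need):
```bash
# Download and install the Filebeat package (example version only;
# substitute the 7.x or 8.x release you intend to run).
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.12.2-amd64.deb
sudo dpkg -i filebeat-8.12.2-amd64.deb
```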
Step 2: Configure your source in Datable 
- Navigate to the Sources page in Datable
- Select Filebeat from available sources
- Enter a source name
- Follow the instructions to generate a config for your Filebeat instance
Step 3: Update Filebeat configuration and restart 
- Open your filebeat.yml configuration file
- Update your configuration based on the snippet provided in the onboarding instructions; you may need to merge the new settings with your existing configuration
- Restart Filebeat to apply the changes.
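On a systemd-based Linux install, for example:
```bash
# Validate the merged configuration before restarting.
sudo filebeat test config
# Restart the service so the new settings take effect.
sudo systemctl restart filebeat
```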
Input Types 
Filestream Input 
Modern input for reading log files:
```yaml
- type: filestream
  enabled: true
  paths:
    - /var/log/*.log
  prospector.scanner.exclude_files: ['.gz$']
  parsers:
    - multiline:
        pattern: '^\['
        negate: true
        match: after
```
Container Input
For Docker container logs:
```yaml
- type: container
  paths:
    - '/var/lib/docker/containers/*/*.log'
  processors:
    - add_docker_metadata:
        host: "unix:///var/run/docker.sock"TCP/UDP Input 
For receiving logs over network:
```yaml
- type: tcp
  enabled: true
  host: "localhost:9000"
  max_message_size: 10MiB
```
Common Use Cases
Application Logs 
```yaml
- type: filestream
  enabled: true
  paths:
    - /var/log/app/*.log
  fields:
    app_name: myapp
    environment: production
  processors:
    - decode_json_fields:
        fields: ["message"]
        target: ""
        overwrite_keys: true
```
Nginx Logs
```yaml
- type: filestream
  enabled: true
  paths:
    - /var/log/nginx/access.log
  processors:
    - dissect:
        tokenizer: '%{clientip} - - [%{timestamp}] "%{verb} %{request} HTTP/%{httpversion}" %{response} %{size}'
        field: "message"
        target_prefix: "nginx"
```
Kubernetes Logs
```yaml
- type: container
  paths:
    - /var/log/containers/*.log
  processors:
    - add_kubernetes_metadata:
        host: ${NODE_NAME}
        matchers:
        - logs_path:
            logs_path: "/var/log/containers/"Troubleshooting 
Filebeat Not Starting 
- Check configuration syntax: filebeat test config
- Verify file permissions
- Review Filebeat logs: /var/log/filebeat/filebeat.log
No Data in Datable 
- Verify API key is correct
- Check network connectivity: curl -k https://YOUR_DATABLE_HOST:5201
- Ensure log files exist and are readable
- Review Filebeat metrics
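For the connectivity and metrics checks, two useful commands, assuming a standard Linux install (the stats endpoint requires http.enabled from the Monitoring section below):
```bash
# Verify Filebeat can reach its configured output.
sudo filebeat test output
# Inspect publish counters; requires http.enabled: true (see Monitoring).
curl -s http://localhost:5066/stats
```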
High Memory Usage 
- Adjust memory limits:
```yaml
queue.mem:
  events: 2048
  flush.min_events: 512
  flush.timeout: 5s
```
Duplicate Events
- Check for multiple Filebeat instances (see the commands below)
- Review registry file: /var/lib/filebeat/registry
- Ensure unique paths in inputs
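To check for duplicate sources, assuming a standard Linux package install:
```bash
# More than one running Filebeat process usually means duplicate shipping.
pgrep -af filebeat
# Inspect the registry directory for stale or overlapping state.
sudo ls -lR /var/lib/filebeat/registry
```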
Performance Tuning 
Batch Settings 
```yaml
output.elasticsearch:
  bulk_max_size: 2048
  worker: 2
  compression_level: 3
```
Memory Management
```yaml
max_procs: 2
queue.mem:
  events: 4096
  flush.min_events: 1024
```
File Handle Limits
```yaml
filebeat.inputs:
  - type: filestream
    close_inactive: 5m
    close_removed: true
    clean_removed: true
    harvester_limit: 100
```
Security Configuration
Secrets Management 
Use environment variables:
```yaml
output.elasticsearch:
  headers:
    datable-api-key: ${DATABLE_API_KEY:default_value}
```
Or use the keystore:
```bash
filebeat keystore create
filebeat keystore add DATABLE_API_KEY
```
Monitoring
Enable Monitoring 
```yaml
monitoring.enabled: true
monitoring.elasticsearch:
  hosts: ["{{host}}:5201"]Metrics Endpoint 
```yaml
http.enabled: true
http.host: localhost
  http.port: 5066
```
Best Practices
- Use Filestream Input: Prefer filestream over the deprecated log input
- Enable Compression: Reduces network bandwidth
- Set Resource Limits: Prevent resource exhaustion
- Use Processors Wisely: Balance between client and server processing
- Monitor Registry Size: Clean up old entries periodically
- Implement Dead Letter Queue: Handle failed events
- Use Field Mappings: Ensure consistent data types
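As a sketch of the field-mapping point above, Filebeat's convert processor can coerce fields to consistent types before they leave the shipper. The field names below are illustrative, not part of this integration:
```yaml
# Sketch: coerce a numeric string to an integer so downstream mappings
# stay consistent. Field names here are illustrative.
processors:
  - convert:
      fields:
        - {from: "nginx.response", to: "http.response.status_code", type: "integer"}
      ignore_missing: true
      fail_on_error: false
```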
Support 
For additional support with the Filebeat integration, please contact the Datable support team or refer to the Filebeat documentation.