Datadog Integration

Forward Keycloak authentication events and logs to Datadog for modern observability, security monitoring, and unified infrastructure visibility.

Overview

This guide shows you how to configure Skycloak to send security events to Datadog using the HTTP Logs API. Datadog is ideal for organizations wanting unified observability across application, infrastructure, and security data.

What You’ll Achieve

  • Real-time authentication event streaming to Datadog
  • Unified security and application monitoring
  • Custom dashboards and monitors
  • Machine learning-powered anomaly detection

Prerequisites

Before starting, ensure you have:

  1. Skycloak Requirements

    • Enterprise plan with SIEM Integration add-on
    • Workspace Owner or Admin permissions
  2. Datadog Requirements

    • Active Datadog account (Pro or Enterprise plan recommended)
    • API key with Logs Write permissions
    • Access to Datadog Logs Explorer

Step 1: Get Datadog API Key

Create or Get Existing API Key

  1. Log in to Datadog
  2. Navigate to Organization Settings (bottom left) → API Keys
  3. Option A - Use existing key with logs permissions
  4. Option B - Create new key:
    • Click New Key
    • Name: Skycloak Keycloak Events
    • Click Create Key
    • Copy and save the API key (won’t be shown again)

Determine Datadog Site

Your Datadog site determines the API endpoint URL:

Site           Region         API Endpoint
-------------  -------------  -------------------------------------------
US1 (default)  US             https://http-intake.logs.datadoghq.com
US3            US             https://http-intake.logs.us3.datadoghq.com
US5            US             https://http-intake.logs.us5.datadoghq.com
EU1            Europe         https://http-intake.logs.datadoghq.eu
AP1            Asia Pacific   https://http-intake.logs.ap1.datadoghq.com

Find your site in Datadog: Check the URL (e.g., https://app.datadoghq.com = US1, https://app.datadoghq.eu = EU1)
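The site-to-endpoint mapping above can be encoded in a small helper for scripting or sanity checks (a sketch; the site names and hosts come straight from the table):

```python
# Map each Datadog site to its HTTP logs intake endpoint (from the table above).
DATADOG_INTAKE = {
    "US1": "https://http-intake.logs.datadoghq.com",
    "US3": "https://http-intake.logs.us3.datadoghq.com",
    "US5": "https://http-intake.logs.us5.datadoghq.com",
    "EU1": "https://http-intake.logs.datadoghq.eu",
    "AP1": "https://http-intake.logs.ap1.datadoghq.com",
}

def logs_intake_url(site: str, api_key: str) -> str:
    """Build the v2 logs intake URL for a site, with the API key as a query parameter."""
    base = DATADOG_INTAKE[site.upper()]
    return f"{base}/api/v2/logs?dd-api-key={api_key}"
```

For example, `logs_intake_url("EU1", "abc123")` returns the EU1 webhook URL used in Step 2 below.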

Step 2: Configure Skycloak SIEM Destination

  1. Log in to your Skycloak dashboard
  2. Navigate to SIEM Integration in the left sidebar
  3. Click Add Destination
  4. Configure the destination:

Basic Information

  • Name: Datadog Production
  • Destination Type: HTTP Webhook

Connection Details

Webhook URL: Use your Datadog site endpoint with API key

https://http-intake.logs.datadoghq.com/api/v2/logs?dd-api-key=YOUR_API_KEY

Replace:

  • datadoghq.com with your site domain (e.g., datadoghq.eu for EU1)
  • YOUR_API_KEY with your actual API key

Authentication Type: None (API key is in URL)

Custom Headers: Add this JSON:

{
  "Content-Type": "application/json",
  "DD-EVP-ORIGIN": "skycloak",
  "DD-EVP-ORIGIN-VERSION": "1.0"
}
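To make the configuration concrete, here is a sketch of what a delivery to the intake endpoint looks like. The event fields shown are illustrative only; the attributes Skycloak actually sends are listed under "Verify Log Attributes" below:

```python
import json

# The custom headers from the configuration above.
HEADERS = {
    "Content-Type": "application/json",
    "DD-EVP-ORIGIN": "skycloak",
    "DD-EVP-ORIGIN-VERSION": "1.0",
}

# Illustrative event shape only -- field names here are assumptions for the sketch.
sample_batch = [
    {
        "message": "LOGIN user=jane.doe",
        "service": "authentication",
        "ddsource": "keycloak",
        "event_type": "LOGIN",
        "username": "jane.doe",
        "ip_address": "203.0.113.7",
        "realm": "production",
    }
]

# Datadog's v2 intake accepts a JSON array of log events in the request body.
body = json.dumps(sample_batch)
```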

Event Filtering

Select events to forward:

  • LOGIN - Successful logins
  • LOGIN_ERROR - Failed login attempts
  • LOGOUT - User logouts
  • UPDATE_PASSWORD - Password changes
  • UPDATE_EMAIL - Email changes
  • REGISTER - New user registrations
  • CODE_TO_TOKEN - Token exchanges
  • REFRESH_TOKEN - Token refreshes

Include Server Logs: Enable for detailed application logs (increases ingestion costs)

Batching Configuration

  • Batch Size: 500 (Datadog accepts up to 5MB per request)
  • Batch Interval: 60 seconds (recommended for near-real-time)
  5. Click Complete Setup
  6. Toggle Enable to start forwarding events
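The batching behavior configured above can be sketched as follows. This assumes the common pattern of flushing when either the size or the interval threshold is reached; Skycloak's exact semantics may differ:

```python
import time

class EventBatcher:
    """Buffer events; flush when the batch fills or the interval elapses.

    A sketch of typical batching semantics, not Skycloak's implementation.
    """

    def __init__(self, send, batch_size=500, interval_s=60.0, clock=time.monotonic):
        self.send = send              # callable that delivers a list of events
        self.batch_size = batch_size
        self.interval_s = interval_s
        self.clock = clock
        self.buffer = []
        self.last_flush = clock()

    def add(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size or \
                self.clock() - self.last_flush >= self.interval_s:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []
        self.last_flush = self.clock()
```

With the defaults above, a quiet realm still ships its events at least once a minute, while a busy one never exceeds 500 events per request.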

Step 3: Verify Data Flow

Check Logs Explorer

  1. Navigate to Logs → Explorer

  2. Use this search query:

    source:keycloak service:authentication
  3. Events should appear within 2-3 minutes

Verify Log Attributes

Click on any log entry and verify these attributes are present:

  • event_type - Event name (LOGIN, LOGIN_ERROR, etc.)
  • username - User identifier
  • ip_address - Source IP
  • realm - Keycloak realm
  • timestamp - Event time
  • source - Should show “keycloak”
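If you are spot-checking programmatically, the checklist above amounts to a small validation function (a sketch; it mirrors the attribute names listed here):

```python
# Attributes every forwarded Keycloak event should carry (from the checklist above).
REQUIRED_ATTRIBUTES = {"event_type", "username", "ip_address", "realm", "timestamp", "source"}

def missing_attributes(log_entry: dict) -> set:
    """Return checklist attributes that are absent or have the wrong value."""
    missing = REQUIRED_ATTRIBUTES - log_entry.keys()
    # "source" must be present *and* equal "keycloak".
    if "source" in log_entry and log_entry["source"] != "keycloak":
        missing.add("source")
    return missing
```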

Test with Login Event

  1. Log in to one of your Keycloak realms
  2. Wait 2-3 minutes for batch processing
  3. Search in Logs Explorer:
    source:keycloak event_type:LOGIN

Step 4: Create Log Pipelines

Set Up Keycloak Log Pipeline

  1. Navigate to Logs → Pipelines
  2. Click New Pipeline
  3. Configure:
    • Name: Keycloak Events
    • Filter: source:keycloak

Add Processors

Processor 1: Grok Parser (if needed for custom formatting)

rule %{data:event_type} %{data:username} from %{ip:ip_address}

Processor 2: Date Remapper

  • Name: Remap timestamp
  • Attribute: timestamp
  • Check “Preserve source attribute”

Processor 3: Status Remapper

  • Name: Map event type to status
  • Attribute: event_type
  • Mapping:
    • LOGIN_ERROR → error
    • LOGOUT_ERROR → error
    • LOGIN → info
    • LOGOUT → info
    • Default → info

Processor 4: Category Processor

  • Name: Categorize events
  • Target: category
  • Rules:
    • Contains LOGIN → authentication
    • Contains PASSWORD → account_management
    • Contains EMAIL → account_management
    • Contains REGISTER → user_registration
  4. Click Save
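The status and category mappings above boil down to two small lookup functions. This is a sketch of the remapping logic for reference, not Datadog's implementation:

```python
def map_status(event_type: str) -> str:
    """Status Remapper: error events map to 'error', everything else to 'info'."""
    return "error" if event_type in {"LOGIN_ERROR", "LOGOUT_ERROR"} else "info"

def map_category(event_type: str) -> str:
    """Category Processor: the first matching substring rule wins."""
    rules = [
        ("LOGIN", "authentication"),
        ("PASSWORD", "account_management"),
        ("EMAIL", "account_management"),
        ("REGISTER", "user_registration"),
    ]
    for needle, category in rules:
        if needle in event_type:
            return category
    return "uncategorized"
```

Note the rule ordering matters: LOGIN_ERROR contains "LOGIN", so it falls into the authentication category before any other rule is tried.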

Step 5: Create Monitors

Example: Failed Login Monitor

  1. Navigate to Monitors → New Monitor
  2. Select Logs
  3. Configure:

Define the search query:

source:keycloak event_type:LOGIN_ERROR

Set alert conditions:

  • Threshold: Above 10 (adjust based on your baseline)
  • Time window: 5 minutes
  • Group by: username, ip_address

Configure notifications:

  • Title: High Failed Login Attempts - Keycloak {{username.name}}
  • Message:
    Multiple failed login attempts detected:
    - Username: {{username.name}}
    - IP Address: {{ip_address.name}}
    - Count: {{value}}
    - Time: {{last_triggered_at}}
    
    @slack-security-team @pagerduty-security
  4. Click Create

Example: New User Registration Monitor

Alert on new user registrations:

Search query:

source:keycloak event_type:REGISTER

Alert conditions:

  • Threshold: Above or equal to 1
  • Evaluation window: 5 minutes
  • Group by: realm

Example: Account Modification Monitor

Alert on privileged account changes:

Search query:

source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL) username:*admin*

Alert conditions:

  • Threshold: Above or equal to 1
  • Evaluation window: 1 minute

Step 6: Create Dashboards

Example: Authentication Overview Dashboard

  1. Navigate to Dashboards → New Dashboard
  2. Name: Keycloak Authentication Monitoring
  3. Add widgets:

Widget 1: Login Success Rate (Query Value)

Datadog's log search does not support pipe-style aggregations, so build the rate from two count queries and a widget formula:

  • Query a: source:keycloak event_type:LOGIN (count)
  • Query b: source:keycloak event_type:LOGIN_ERROR (count)
  • Formula: (a / (a + b)) * 100
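The success-rate arithmetic itself is simply:

```python
def login_success_rate(logins: int, login_errors: int) -> float:
    """Percentage of successful logins: (logins / (logins + errors)) * 100."""
    total = logins + login_errors
    return 100.0 * logins / total if total else 0.0
```

For example, 950 successful logins against 50 failures gives a 95.0% success rate.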

Widget 2: Failed Logins Timeline (Timeseries)

  • Query: source:keycloak event_type:LOGIN_ERROR
  • Measure: count of logs over time

Widget 3: Top Failed Login Users (Top List)

  • Query: source:keycloak event_type:LOGIN_ERROR
  • Group by: username, ranked by count

Widget 4: Login Activity by Realm (Pie Chart)

  • Query: source:keycloak event_type:LOGIN
  • Group by: realm

Widget 5: Geographic Login Distribution (Geomap)

  • Query: source:keycloak event_type:LOGIN
  • Group by: @network.client.geoip.country.name

Widget 6: Recent Account Changes (Log Stream)

  • Query: source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL OR REGISTER OR DELETE_ACCOUNT)
  4. Click Save

Useful Datadog Queries

Datadog's Logs Explorer combines a filter query with aggregations configured in the UI (group-bys, measures, and sorting) rather than pipe syntax. Each example below gives the filter query, with the aggregation to apply noted underneath.

Recent Failed Logins

source:keycloak event_type:LOGIN_ERROR

Sort by timestamp descending (the default list view shows the most recent events first).

User Login History

source:keycloak event_type:LOGIN username:john.doe

Login Activity by Time

source:keycloak event_type:LOGIN

Switch to the Timeseries view and chart the count with a 1-hour rollup.

Suspicious Multiple IP Logins

source:keycloak event_type:LOGIN

In Log Analytics, group by username with the measure "unique count of ip_address"; investigate any user with more than 3 distinct IPs.

Account Modification Summary

source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL)

Group by event_type and username, sorted by count descending.

Failed Login Rate by Realm

source:keycloak event_type:(LOGIN OR LOGIN_ERROR)

Group by realm and event_type. To chart an error rate, use two count queries (one filtered to LOGIN_ERROR, one to both) with the formula (errors / total) * 100.
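The suspicious multiple-IP check can also be run offline against exported events. This sketch assumes a simple list-of-dicts export with the attribute names used throughout this guide:

```python
from collections import defaultdict

def users_with_many_ips(events, threshold=3):
    """Return users who logged in from more than `threshold` distinct IPs.

    `events` is assumed to be an iterable of dicts with event_type,
    username, and ip_address keys (an assumed export format).
    """
    ips_by_user = defaultdict(set)
    for e in events:
        if e.get("event_type") == "LOGIN":
            ips_by_user[e["username"]].add(e["ip_address"])
    return {user: len(ips) for user, ips in ips_by_user.items() if len(ips) > threshold}
```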

Log Facets

Create Custom Facets

To make Keycloak attributes searchable:

  1. Navigate to Logs → Explorer
  2. Find a Keycloak log entry
  3. Click on attribute (e.g., realm, client_id)
  4. Click Create facet for @realm
  5. Configure:
    • Type: String (most common)
    • Facet name: Keycloak Realm
    • Description: Keycloak realm name
    • Group: Select or create “Authentication”

Recommended facets:

  • realm → String facet
  • event_type → String facet
  • username → String facet
  • ip_address → String facet
  • client_id → String facet (from details object)

Integration with APM

Correlate Logs with Traces

If you’re using Datadog APM for your applications, logs and traces correlate automatically when they share a trace ID (via trace ID injection into your application logs) or common attributes such as usr.id under unified service tagging. Once correlated, a Keycloak authentication event in the Logs Explorer links directly to the related application trace, so you can follow a login attempt through your services. Note that the Logs search language has no join operator; correlation happens through these shared attributes.

Troubleshooting

No Logs Appearing in Datadog

  1. Check Skycloak destination status:

    • Navigate to SIEM Integration page
    • Verify “Healthy” status
    • Check “Events Sent” metric is increasing
  2. Verify API key:

    • Ensure API key is correct and not expired
    • Check key has Logs Write permission
    • Test key manually:
    curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
      -H "DD-API-KEY: YOUR_API_KEY" \
      -H "Content-Type: application/json" \
      -d '[{"message":"test","service":"test"}]'
  3. Check Datadog site:

    • Ensure webhook URL matches your Datadog site
    • US: .com, EU: .eu, etc.
  4. Review the Datadog status page:

    • Check https://status.datadoghq.com for ongoing Logs intake incidents

Logs Not Appearing in Search

  • Check filters: Remove all filters and search for source:keycloak
  • Check time range: Extend to “Past 1 Hour”
  • Check log retention: Verify logs are within retention period
  • Verify index: Logs should be in default index

High Ingestion Costs

  1. Filter events: Only forward critical event types
  2. Reduce batch frequency: Increase batch interval to 300 seconds
  3. Disable server logs: Turn off “Include Server Logs” if not needed
  4. Set up log sampling: Configure sampling rules in Datadog
  5. Use exclusion filters: Exclude noisy events you don’t need

Best Practices

  1. Tag Your Logs: Use consistent tagging strategy (environment, team, service)
  2. Set Up Indexes: Create dedicated index for Keycloak logs if high volume
  3. Configure Retention: Balance retention needs with cost (default: 15 days)
  4. Use Log Archives: Archive to S3 for long-term compliance needs
  5. Create Saved Views: Save common search queries for quick access
  6. Set Up SLOs: Create Service Level Objectives for authentication success rate
  7. Enable Log Rehydration: For investigating historical incidents

Next Steps

  • Create additional monitors for your specific security requirements
  • Build comprehensive dashboards for your security team
  • Set up log-based metrics for long-term trending
  • Configure log archives for compliance
  • Integrate with Datadog Security Monitoring for threat detection
  • Explore Datadog Workflow Automation for incident response

Support

For assistance: