Datadog Integration Guide
Forward Keycloak authentication events and logs to Datadog for modern observability, security monitoring, and unified infrastructure visibility.
Overview
This guide shows you how to configure Skycloak to send security events to Datadog using the HTTP Logs API. Datadog is ideal for organizations wanting unified observability across application, infrastructure, and security data.
What You’ll Achieve
- Real-time authentication event streaming to Datadog
- Unified security and application monitoring
- Custom dashboards and monitors
- Machine learning-powered anomaly detection
Prerequisites
Before starting, ensure you have:
Skycloak Requirements
- Enterprise plan with SIEM Integration add-on
- Workspace Owner or Admin permissions
Datadog Requirements
- Active Datadog account (Pro or Enterprise plan recommended)
- API key with Logs Write permissions
- Access to Datadog Logs Explorer
Step 1: Get Datadog API Key
Create or Get Existing API Key
- Log in to Datadog
- Navigate to Organization Settings (bottom left) → API Keys
- Option A: Use an existing key with logs permissions
- Option B: Create a new key:
  - Click New Key
  - Name: Skycloak Keycloak Events
  - Click Create Key
- Copy and save the API key (it won’t be shown again)
Determine Datadog Site
Your Datadog site determines the API endpoint URL:
| Site | Region | API Endpoint |
|---|---|---|
| US1 (default) | US | https://http-intake.logs.datadoghq.com |
| US3 | US | https://http-intake.logs.us3.datadoghq.com |
| US5 | US | https://http-intake.logs.us5.datadoghq.com |
| EU1 | Europe | https://http-intake.logs.datadoghq.eu |
| AP1 | Asia Pacific | https://http-intake.logs.ap1.datadoghq.com |
Find your site in Datadog: Check the URL (e.g., https://app.datadoghq.com = US1, https://app.datadoghq.eu = EU1)
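If you want to sanity-check the key before configuring anything, Datadog exposes a key-validation endpoint. This is an optional check, not part of the Skycloak setup, and note that the control-plane API domain (api.datadoghq.com, api.datadoghq.eu, etc.) is different from the http-intake logs domains listed in the table above:

```bash
# Optional sanity check: confirm the API key is valid for your Datadog site.
# Swap the host for your site's API domain (e.g., api.datadoghq.eu for EU1).
curl -s -X GET "https://api.datadoghq.com/api/v1/validate" \
  -H "DD-API-KEY: YOUR_API_KEY"
```

A `{"valid": true}` response confirms the key works against that site.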
Step 2: Configure Skycloak SIEM Destination
- Log in to your Skycloak dashboard
- Navigate to SIEM Integration in the left sidebar
- Click Add Destination
- Configure the destination:
Basic Information
- Name: Datadog Production
- Destination Type: HTTP Webhook
Connection Details
Webhook URL: Use your Datadog site endpoint with your API key:
https://http-intake.logs.datadoghq.com/api/v2/logs?dd-api-key=YOUR_API_KEY
Replace:
- datadoghq.com with your site domain (e.g., datadoghq.eu for EU1)
- YOUR_API_KEY with your actual API key
Authentication Type: None (the API key is in the URL)
Custom Headers: Add this JSON:
{
  "Content-Type": "application/json",
  "DD-EVP-ORIGIN": "skycloak",
  "DD-EVP-ORIGIN-VERSION": "1.0"
}
Event Filtering
Select the events to forward:
- LOGIN - Successful logins
- LOGIN_ERROR - Failed login attempts
- LOGOUT - User logouts
- UPDATE_PASSWORD - Password changes
- UPDATE_EMAIL - Email changes
- REGISTER - New user registrations
- CODE_TO_TOKEN - Token exchanges
- REFRESH_TOKEN - Token refreshes
Include Server Logs: Enable for detailed application logs (increases ingestion costs)
Batching Configuration
- Batch Size: 500 (Datadog accepts up to 5 MB per request)
- Batch Interval: 60 seconds (recommended for near-real-time delivery)
- Click Complete Setup
- Toggle Enable to start forwarding events
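Before toggling the destination on, you can hand-send a test event to the same endpoint with the same headers to confirm the URL and key are correct. This is a sketch only: the attribute names mirror the fields described in this guide (event_type, username, ip_address, realm), but the exact payload Skycloak forwards may differ.

```bash
# Simulate a forwarded event against the configured intake URL and headers.
# ddsource/service drive the source:keycloak service:authentication search used later.
curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs?dd-api-key=YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -H "DD-EVP-ORIGIN: skycloak" \
  -H "DD-EVP-ORIGIN-VERSION: 1.0" \
  -d '[{
        "ddsource": "keycloak",
        "service": "authentication",
        "message": "Test event from Skycloak webhook setup",
        "event_type": "LOGIN",
        "username": "test.user",
        "ip_address": "203.0.113.10",
        "realm": "production"
      }]'
```

A 202 Accepted response means the endpoint and key are usable; the test entry should appear in Logs Explorer within a couple of minutes.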
Step 3: Verify Data Flow
Check Logs Explorer
- Navigate to Logs → Explorer
- Use this search query: source:keycloak service:authentication
- Events should appear within 2-3 minutes
Verify Log Attributes
Click on any log entry and verify these attributes are present:
- event_type - Event name (LOGIN, LOGIN_ERROR, etc.)
- username - User identifier
- ip_address - Source IP
- realm - Keycloak realm
- timestamp - Event time
- source - Should show “keycloak”
Test with Login Event
- Log in to one of your Keycloak realms
- Wait 2-3 minutes for batch processing
- Search in Logs Explorer:
source:keycloak event_type:LOGIN
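If you prefer to verify ingestion from the command line, the Logs Search API can run the same query. Note that this needs a Datadog application key in addition to the API key, which is not otherwise required by this integration, and depending on how the attribute is faceted you may need the @ prefix (@event_type:LOGIN):

```bash
# Search the last 15 minutes of Keycloak logs via the Logs Search API.
curl -s -X POST "https://api.datadoghq.com/api/v2/logs/events/search" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "filter": {
          "query": "source:keycloak @event_type:LOGIN",
          "from": "now-15m",
          "to": "now"
        },
        "sort": "-timestamp",
        "page": { "limit": 25 }
      }'
```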
Step 4: Create Log Pipelines
Set Up Keycloak Log Pipeline
- Navigate to Logs → Pipelines
- Click New Pipeline
- Configure:
  - Name: Keycloak Events
  - Filter: source:keycloak
Add Processors
Processor 1: Grok Parser (if needed for custom formatting)
rule %{data:event_type} %{data:username} from %{ip:ip_address}
Processor 2: Date Remapper
- Name: Remap timestamp
- Attribute: timestamp
- Check “Preserve source attribute”
Processor 3: Status Remapper
- Name: Map event type to status
- Attribute: event_type
- Mapping:
  - LOGIN_ERROR → error
  - LOGOUT_ERROR → error
  - LOGIN → info
  - LOGOUT → info
  - Default → info
Processor 4: Category Processor
- Name: Categorize events
- Target: category
- Rules:
  - Contains LOGIN → authentication
  - Contains PASSWORD → account_management
  - Contains EMAIL → account_management
  - Contains REGISTER → user_registration
- Click Save
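If you manage configuration as code, the same pipeline can be created through the Logs Pipelines API instead of the UI. A minimal sketch with just the date remapper and category processor (an application key is required, and the processor options are trimmed to the essentials):

```bash
# Create the "Keycloak Events" pipeline with a date remapper and a category processor.
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "Keycloak Events",
        "is_enabled": true,
        "filter": { "query": "source:keycloak" },
        "processors": [
          {
            "type": "date-remapper",
            "name": "Remap timestamp",
            "is_enabled": true,
            "sources": ["timestamp"]
          },
          {
            "type": "category-processor",
            "name": "Categorize events",
            "is_enabled": true,
            "target": "category",
            "categories": [
              { "name": "authentication",     "filter": { "query": "@event_type:LOGIN*" } },
              { "name": "account_management", "filter": { "query": "@event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL)" } },
              { "name": "user_registration",  "filter": { "query": "@event_type:REGISTER" } }
            ]
          }
        ]
      }'
```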
Step 5: Create Monitors
Example: Failed Login Monitor
- Navigate to Monitors → New Monitor
- Select Logs
- Configure:
Define the search query:
source:keycloak event_type:LOGIN_ERROR
Set alert conditions:
- Threshold: Above 10 (adjust based on your baseline)
- Time window: 5 minutes
- Group by: username, ip_address
Configure notifications:
- Title: High Failed Login Attempts - Keycloak {{username.name}}
- Message:
  Multiple failed login attempts detected:
  - Username: {{username.name}}
  - IP Address: {{ip_address.name}}
  - Count: {{value}}
  - Time: {{last_triggered_at}}
  @slack-security-team @pagerduty-security
- Click Create
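The same monitor can also be created with the Monitors API. A minimal sketch assuming the default log index and the threshold above; the grouping and notification handles match the UI steps:

```bash
# Log alert: more than 10 LOGIN_ERROR events per username/IP in 5 minutes.
curl -X POST "https://api.datadoghq.com/api/v1/monitor" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "type": "log alert",
        "name": "High Failed Login Attempts - Keycloak",
        "query": "logs(\"source:keycloak @event_type:LOGIN_ERROR\").index(\"*\").rollup(\"count\").by(\"@username,@ip_address\").last(\"5m\") > 10",
        "message": "Multiple failed login attempts detected for {{username.name}} from {{ip_address.name}}. @slack-security-team @pagerduty-security",
        "options": { "thresholds": { "critical": 10 } }
      }'
```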
Example: New User Registration Monitor
Alert on new user registrations:
Search query:
source:keycloak event_type:REGISTER
Alert conditions:
- Threshold: Above or equal to 1
- Evaluation window: 5 minutes
- Group by: realm
Example: Account Modification Monitor
Alert on privileged account changes:
Search query:
source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL) username:*admin*
Alert conditions:
- Threshold: Above or equal to 1
- Evaluation window: 1 minute
Step 6: Create Dashboards
Example: Authentication Overview Dashboard
- Navigate to Dashboards → New Dashboard
- Name: Keycloak Authentication Monitoring
- Add widgets:
Widget 1: Login Success Rate (Query Value)
source:keycloak event_type:(LOGIN OR LOGIN_ERROR)
| stats count by event_type
| eval success_rate = (count{event_type:LOGIN} / (count{event_type:LOGIN} + count{event_type:LOGIN_ERROR})) * 100
Widget 2: Failed Logins Timeline (Timeseries)
source:keycloak event_type:LOGIN_ERROR
| timeseries count by event_type
Widget 3: Top Failed Login Users (Top List)
source:keycloak event_type:LOGIN_ERROR
| top username by count
Widget 4: Login Activity by Realm (Pie Chart)
source:keycloak event_type:LOGIN
| stats count by realm
Widget 5: Geographic Login Distribution (Geomap)
source:keycloak event_type:LOGIN
| stats count by @network.client.geoip.country.name
Widget 6: Recent Account Changes (Log Stream)
source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL OR REGISTER OR DELETE_ACCOUNT)
- Click Save
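Dashboards can also be provisioned through the Dashboards API. A minimal sketch that creates the dashboard with just the log stream widget from Widget 6; the remaining widgets can be added in the UI or extended in the same payload:

```bash
# Create an ordered dashboard containing a single log stream widget.
curl -X POST "https://api.datadoghq.com/api/v1/dashboard" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "title": "Keycloak Authentication Monitoring",
        "layout_type": "ordered",
        "widgets": [
          {
            "definition": {
              "type": "log_stream",
              "title": "Recent Account Changes",
              "indexes": [],
              "query": "source:keycloak @event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL OR REGISTER OR DELETE_ACCOUNT)"
            }
          }
        ]
      }'
```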
Useful Datadog Queries
Recent Failed Logins
source:keycloak event_type:LOGIN_ERROR
| sort -@timestamp
| limit 100
User Login History
source:keycloak event_type:LOGIN username:john.doe
| sort -@timestamp
Login Activity by Time
source:keycloak event_type:LOGIN
| timeseries count by 1h
Suspicious Multiple IP Logins
source:keycloak event_type:LOGIN
| stats dc(ip_address) as unique_ips by username
| where unique_ips > 3
Account Modification Summary
source:keycloak event_type:(UPDATE_PASSWORD OR UPDATE_EMAIL)
| stats count by event_type, username
| sort -count
Failed Login Rate by Realm
source:keycloak event_type:(LOGIN OR LOGIN_ERROR)
| stats count by realm, event_type
| eval error_rate = (count{event_type:LOGIN_ERROR} / (count{event_type:LOGIN} + count{event_type:LOGIN_ERROR})) * 100
| sort -error_rate
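The same kinds of aggregations can be run programmatically with the Logs Aggregate API, which is handy for ad-hoc reports and scripts. A sketch that counts failed logins per username over the last 24 hours (application key required; the @ prefix assumes the attribute is available as a facet):

```bash
# Count LOGIN_ERROR events per username over the last 24 hours.
curl -s -X POST "https://api.datadoghq.com/api/v2/logs/analytics/aggregate" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "filter": {
          "query": "source:keycloak @event_type:LOGIN_ERROR",
          "from": "now-24h",
          "to": "now"
        },
        "compute": [ { "aggregation": "count" } ],
        "group_by": [ { "facet": "@username", "limit": 10 } ]
      }'
```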
Log Facets
Create Custom Facets
To make Keycloak attributes searchable:
- Navigate to Logs → Explorer
- Find a Keycloak log entry
- Click on an attribute (e.g., realm, client_id)
- Click Create facet for @realm
- Configure:
  - Type: String (most common)
  - Facet name: Keycloak Realm
  - Description: Keycloak realm name
  - Group: Select or create “Authentication”
Recommended facets:
- realm → String facet
- event_type → String facet
- username → String facet
- ip_address → String facet
- client_id → String facet (from the details object)
Integration with APM
Correlate Logs with Traces
If you’re using Datadog APM for your applications:
source:keycloak event_type:LOGIN
| join service:your-app-service on [email protected]
This correlates Keycloak authentication events with your application traces.
Troubleshooting
No Logs Appearing in Datadog
- Check Skycloak destination status:
  - Navigate to the SIEM Integration page
  - Verify “Healthy” status
  - Check that the “Events Sent” metric is increasing
- Verify API key:
  - Ensure the API key is correct and not expired
  - Check that the key has Logs Write permission
  - Test the key manually:
    curl -X POST "https://http-intake.logs.datadoghq.com/api/v2/logs" \
      -H "DD-API-KEY: YOUR_API_KEY" \
      -H "Content-Type: application/json" \
      -d '[{"message":"test","service":"test"}]'
- Check Datadog site:
  - Ensure the webhook URL matches your Datadog site
  - US: .com, EU: .eu, etc.
- Review the Datadog status page:
  - Navigate to https://status.datadoghq.com
  - Check for any API or Logs ingestion issues
Logs Not Appearing in Search
- Check filters: Remove all filters and search for source:keycloak
- Check time range: Extend to “Past 1 Hour”
- Check log retention: Verify logs are within the retention period
- Verify index: Logs should be in the default index (you can list your indexes via the API, as shown below)
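To check index routing from the command line, list your log indexes and their filters. If an index filter or exclusion rule does not match source:keycloak, the events may be landing in (or excluded from) an index you are not searching. An application key is required in addition to the API key:

```bash
# List all log indexes with their filters and exclusion filters.
curl -s -X GET "https://api.datadoghq.com/api/v1/logs/config/indexes" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY"
```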
High Ingestion Costs
- Filter events: Only forward critical event types
- Reduce batch frequency: Increase batch interval to 300 seconds
- Disable server logs: Turn off “Include Server Logs” if not needed
- Set up log sampling: Configure sampling rules in Datadog
- Use exclusion filters: Exclude noisy events you don’t need (a sketch follows below)
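As a sketch of that last point, an exclusion filter can be added to an index through the Logs Indexes API. This example assumes your Keycloak logs land in an index named main and drops REFRESH_TOKEN events. Note that updating an index replaces its full definition, so start from the current definition returned by the GET call shown earlier and include any filters you already have:

```bash
# Add an exclusion filter that drops noisy REFRESH_TOKEN events from the "main" index.
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/indexes/main" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "filter": { "query": "*" },
        "exclusion_filters": [
          {
            "name": "Drop Keycloak token refreshes",
            "is_enabled": true,
            "filter": { "query": "source:keycloak @event_type:REFRESH_TOKEN", "sample_rate": 1.0 }
          }
        ]
      }'
```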
Best Practices
- Tag Your Logs: Use consistent tagging strategy (environment, team, service)
- Set Up Indexes: Create dedicated index for Keycloak logs if high volume
- Configure Retention: Balance retention needs with cost (default: 15 days)
- Use Log Archives: Archive to S3 for long-term compliance needs
- Create Saved Views: Save common search queries for quick access
- Set Up SLOs: Create Service Level Objectives for authentication success rate
- Enable Log Rehydration: For investigating historical incidents
Next Steps
- Create additional monitors for your specific security requirements
- Build comprehensive dashboards for your security team
- Set up log-based metrics for long-term trending (a sketch follows this list)
- Configure log archives for compliance
- Integrate with Datadog Security Monitoring for threat detection
- Explore Datadog Workflow Automation for incident response
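As a sketch for the log-based metrics item above, a count of successful logins per realm can be turned into a metric via the Log-based Metrics API. The metric name keycloak.logins.count is just an example, and an application key is required:

```bash
# Generate a metric "keycloak.logins.count", tagged by realm, from LOGIN events.
curl -X POST "https://api.datadoghq.com/api/v2/logs/config/metrics" \
  -H "DD-API-KEY: YOUR_API_KEY" \
  -H "DD-APPLICATION-KEY: YOUR_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "data": {
          "type": "logs_metrics",
          "id": "keycloak.logins.count",
          "attributes": {
            "compute": { "aggregation_type": "count" },
            "filter": { "query": "source:keycloak @event_type:LOGIN" },
            "group_by": [ { "path": "@realm", "tag_name": "realm" } ]
          }
        }
      }'
```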
Support
For assistance: