Azure Sentinel Integration Guide
Forward Keycloak authentication events and logs to Microsoft Azure Sentinel for cloud-native security monitoring, AI-powered threat detection, and seamless Microsoft ecosystem integration.
Overview
This guide walks you through configuring Skycloak to send security events to Azure Sentinel using HTTP webhooks and the Azure Monitor HTTP Data Collector API. Azure Sentinel is ideal for organizations using Microsoft 365, Azure AD, and Azure cloud services.
What You’ll Achieve
- Real-time authentication event streaming to Azure Sentinel
- Integration with Azure’s AI-powered threat detection
- Unified security monitoring across Microsoft services
- KQL-based queries and custom workbooks
Prerequisites
Before starting, ensure you have:
Skycloak Requirements
- Enterprise plan with SIEM Integration add-on
- Workspace Owner or Admin permissions
Azure Requirements
- Active Azure subscription
- Azure Sentinel workspace (Log Analytics workspace with Sentinel enabled)
- Permission to create Data Connectors and access workspace keys
Step 1: Set Up Azure Sentinel Workspace
Create Log Analytics Workspace (if needed)
- Sign in to the Azure Portal
- Search for Log Analytics workspaces
- Click Create → Log Analytics workspace
- Configure:
  - Subscription: Select your subscription
  - Resource Group: Create a new group or select an existing one
  - Name: skycloak-sentinel-workspace
  - Region: Choose the region closest to your Skycloak deployment
  - Pricing Tier: Pay-as-you-go (default)
- Click Review + Create → Create
Enable Azure Sentinel
- Search for Microsoft Sentinel
- Click Create → Add Microsoft Sentinel to a workspace
- Select your Log Analytics workspace
- Click Add
Step 2: Create Custom Data Connector
Get Workspace Credentials
- Navigate to your Log Analytics workspace
- Click Agents (under Settings)
- Copy and save:
  - Workspace ID: Long alphanumeric string (e.g., a1b2c3d4-...)
  - Primary Key: Base64-encoded key
Determine Data Collector API Endpoint
The endpoint format is:
https://<WorkspaceID>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
Replace <WorkspaceID> with your Workspace ID.
Example:
https://a1b2c3d4-e5f6-7890-abcd-ef1234567890.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
Step 3: Configure Skycloak SIEM Destination
- Log in to your Skycloak dashboard
- Navigate to SIEM Integration in the left sidebar
- Click Add Destination
- Configure the destination:
Basic Information
- Name: Azure Sentinel Production
- Destination Type: HTTP Webhook
Connection Details
- Webhook URL: Your Data Collector API endpoint (from Step 2)
- Authentication Type: Bearer Token
- Auth Token: Your Log Analytics Primary Key (from Step 2)
- Custom Headers: Add this JSON:
{ "Log-Type": "KeycloakEvents", "time-generated-field": "timestamp" }
Important:
- Log-Type defines the custom log table name (it becomes KeycloakEvents_CL in Sentinel)
- time-generated-field tells Azure which field contains the event timestamp (an example payload is sketched at the end of this step)
Event Filtering
Select events to forward:
- LOGIN - Successful logins
- LOGIN_ERROR - Failed login attempts
- LOGOUT - User logouts
- UPDATE_PASSWORD - Password changes
- UPDATE_EMAIL - Email changes
- REGISTER - New user registrations
- CODE_TO_TOKEN - Authorization code exchanges
- REFRESH_TOKEN - Token refreshes
Include Server Logs: Enable for detailed application logs (increases costs)
Batching Configuration
- Batch Size: 500 (Azure accepts up to 30 MB per request)
- Batch Interval: 120 seconds
- Click Complete Setup
- Toggle Enable to start forwarding events
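To make the header and batching settings concrete, the sketch below (Python, standard library only) shows the kind of JSON array a destination configured this way would POST to the Data Collector API. The exact payload Skycloak emits is not reproduced in this guide, so the field names are illustrative; they are chosen to match the columns the later queries rely on (event_type, username, ip_address, realm, details, and the timestamp referenced by time-generated-field):
import json

# Illustrative event batch - field names are assumptions matching the columns
# used later in this guide, not a definitive Skycloak payload specification.
events = [
    {
        "timestamp": "2024-01-15T10:00:00Z",   # picked up via time-generated-field
        "event_type": "LOGIN",
        "username": "john.doe",
        "ip_address": "203.0.113.10",
        "realm": "production",
        "details": {"client_id": "web-app"},   # nested objects end up in details_s as a JSON string
    },
    {
        "timestamp": "2024-01-15T10:01:30Z",
        "event_type": "LOGIN_ERROR",
        "username": "jane.roe",
        "ip_address": "198.51.100.7",
        "realm": "production",
        "details": {"error": "invalid_user_credentials"},
    },
]
body = json.dumps(events)  # the request body is always a JSON array, even for a single event
print(body)
Because the Log-Type header is KeycloakEvents, records like these land in the KeycloakEvents_CL table, with columns such as event_type_s, username_s, and ip_address_s (see the type-suffix note in Step 4).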
Step 4: Verify Data Flow
Check Azure Monitor Logs
- Navigate to your Log Analytics workspace
- Click Logs (under General)
- Run this KQL query:
KeycloakEvents_CL
| take 10
Note: It may take 5-15 minutes for the custom table to appear after first data ingestion.
Verify Table Schema
Check the automatically created table schema:
KeycloakEvents_CL
| getschema
Test with Recent Login Event
- Log in to one of your Keycloak realms
- Wait 3-5 minutes for batch processing and ingestion
- Run this query:
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| order by TimeGenerated desc
| take 1
Note: Azure automatically adds _s suffix to string fields, _d to numbers, _b to booleans.
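As a quick illustration of that rule, here is a tiny Python sketch (a mnemonic, not an Azure SDK helper) that maps a field's Python type to the column name you should expect in KeycloakEvents_CL:
# Mnemonic for the Data Collector API's automatic column suffixes:
# strings -> _s, numeric values -> _d, booleans -> _b.
def sentinel_column_name(field: str, value) -> str:
    if isinstance(value, bool):          # check bool first: in Python, bool is a subclass of int
        return f"{field}_b"
    if isinstance(value, (int, float)):
        return f"{field}_d"
    return f"{field}_s"

print(sentinel_column_name("event_type", "LOGIN"))  # event_type_s
print(sentinel_column_name("error_count", 3))       # error_count_d
print(sentinel_column_name("remember_me", True))    # remember_me_b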
Step 5: Create Analytics Rules
Example: Detect Brute Force Attacks
- Navigate to Microsoft Sentinel → Analytics
- Click Create → Scheduled query rule
- Configure:
General:
- Name: Keycloak Brute Force Attempt
- Description: Detects multiple failed login attempts from the same IP
- Severity: High
- MITRE ATT&CK: Credential Access → Brute Force (T1110)
Rule Query:
KeycloakEvents_CL
| where event_type_s == "LOGIN_ERROR"
| where TimeGenerated > ago(5m)
| summarize FailedAttempts=count() by ip_address_s, username_s, bin(TimeGenerated, 5m)
| where FailedAttempts >= 5
| project TimeGenerated, username_s, ip_address_s, FailedAttempts
Query Scheduling:
- Run query every: 5 minutes
- Lookup data from the last: 5 minutes
Alert Threshold: Greater than 0 results
Entity Mapping:
- Account: username_s
- IP: ip_address_s
- Click Review and Create → Create
Example: Detect Suspicious Login Locations
Alert on logins from unusual geographic locations:
let UserLoginLocations = KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where TimeGenerated > ago(30d)
| extend Location = geo_info_from_ip_address(ip_address_s)
| summarize KnownCountries=make_set(Location.country) by username_s;
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where TimeGenerated > ago(5m)
| extend Location = geo_info_from_ip_address(ip_address_s)
| join kind=inner UserLoginLocations on username_s
| where not(set_has_element(KnownCountries, tostring(Location.country)))
| project TimeGenerated, username_s, ip_address_s, NewCountry=Location.country
Example: Privileged Account Changes
Alert on password changes for admin accounts:
KeycloakEvents_CL
| where event_type_s == "UPDATE_PASSWORD"
| where username_s contains "admin" or username_s contains "root"
| project TimeGenerated, username_s, ip_address_s, realm_s
Step 6: Create Custom Workbooks
Example: Authentication Monitoring Workbook
- Navigate to Microsoft Sentinel → Workbooks
- Click Add workbook → Edit
- Add visualizations:
Query 1: Login Success Rate (Donut Chart)
KeycloakEvents_CL
| where event_type_s in ("LOGIN", "LOGIN_ERROR")
| where TimeGenerated > ago(24h)
| summarize Count=count() by event_type_s
| render piechart
Query 2: Failed Logins Timeline (Line Chart)
KeycloakEvents_CL
| where event_type_s == "LOGIN_ERROR"
| where TimeGenerated > ago(7d)
| summarize FailedLogins=count() by bin(TimeGenerated, 1h)
| render timechart
Query 3: Top Failed Login Users (Bar Chart)
KeycloakEvents_CL
| where event_type_s == "LOGIN_ERROR"
| where TimeGenerated > ago(24h)
| summarize Failures=count() by username_s
| top 10 by Failures desc
| render barchart
Query 4: Login Activity Map
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where TimeGenerated > ago(24h)
| extend Location = geo_info_from_ip_address(ip_address_s)
| extend Country = tostring(Location.country), Latitude = toreal(Location.latitude), Longitude = toreal(Location.longitude)
| summarize Logins=count() by Country, Latitude, Longitude
| render scatterchart with (kind=map)
- Click Done Editing → Save As
- Name: Keycloak Authentication Monitoring
Useful KQL Queries
Recent Failed Logins
KeycloakEvents_CL
| where event_type_s == "LOGIN_ERROR"
| where TimeGenerated > ago(1h)
| project TimeGenerated, username_s, ip_address_s, realm_s
| order by TimeGenerated desc
User Login History
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where username_s == "john.doe"
| where TimeGenerated > ago(30d)
| project TimeGenerated, ip_address_s, realm_s, client_id = tostring(parse_json(details_s).client_id)
| order by TimeGenerated desc
Login Activity by Realm
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where TimeGenerated > ago(7d)
| summarize LoginCount=count() by realm_s
| order by LoginCount desc
Account Changes Summary
KeycloakEvents_CL
| where event_type_s in ("UPDATE_PASSWORD", "UPDATE_EMAIL", "REGISTER", "DELETE_ACCOUNT")
| where TimeGenerated > ago(24h)
| summarize Count=count() by event_type_s, username_s
| order by Count desc
Concurrent Login Detection
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| where TimeGenerated > ago(1h)
| summarize UniqueIPs=dcount(ip_address_s) by username_s, bin(TimeGenerated, 5m)
| where UniqueIPs > 2
| project TimeGenerated, username_s, UniqueIPs
Integration with Microsoft Services
Azure AD Integration
Correlate Keycloak events with Azure AD sign-ins:
KeycloakEvents_CL
| where event_type_s == "LOGIN"
| join kind=inner (
SigninLogs
| where TimeGenerated > ago(1h)
) on $left.username_s == $right.UserPrincipalName
| project KeycloakLogin=TimeGenerated, AzureADLogin=TimeGenerated1, username_s, ip_address_s
Microsoft 365 Correlation
Find users with suspicious activity in both Keycloak and M365:
let SuspiciousKeycloakUsers = KeycloakEvents_CL
| where event_type_s == "LOGIN_ERROR"
| where TimeGenerated > ago(1h)
| summarize FailedAttempts=count() by username_s
| where FailedAttempts > 5;
OfficeActivity
| where TimeGenerated > ago(1h)
| join kind=inner SuspiciousKeycloakUsers on $left.UserId == $right.username_s
| project TimeGenerated, Operation, UserId, ClientIP
Troubleshooting
No Data Appearing in Sentinel
- Check Skycloak destination status:
  - Navigate to the SIEM Integration page
  - Verify “Healthy” status
  - Check that the “Events Sent” metric is increasing
- Verify Workspace ID and Key:
  - Ensure the Workspace ID matches your Log Analytics workspace
  - Verify the Primary Key is correct (not the Secondary Key)
  - Check for trailing spaces in credentials
- Test the HTTP endpoint manually:
  # Use curl to test (replace placeholders):
  curl -X POST \
    -H "Content-Type: application/json" \
    -H "Log-Type: KeycloakEvents" \
    -H "x-ms-date: <RFC1123 date>" \
    -H "Authorization: SharedKey <WorkspaceID>:<Signature>" \
    "https://<WorkspaceID>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" \
    -d '[{"test": "data", "timestamp": "2024-01-15T10:00:00Z"}]'
  Note: the x-ms-date header is required, and <Signature> is not the Primary Key itself; it is a Base64-encoded HMAC-SHA256 signature of the request, computed with the key. The Python sketch after this list shows one way to build it.
- Check Azure Monitor Logs ingestion:
  - Navigate to your workspace → Usage and estimated costs
  - Verify data is being ingested
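Because building the SharedKey signature by hand is error-prone, here is a minimal Python sketch of a test sender that mirrors the Data Collector API's documented signing scheme. It uses only the standard library; workspace_id and shared_key are placeholders for the credentials from Step 2:
import base64
import hashlib
import hmac
import json
import urllib.request
from email.utils import formatdate

# Placeholders - replace with the values from Step 2.
workspace_id = "<WorkspaceID>"
shared_key = "<PrimaryKey>"   # Base64-encoded Log Analytics key
log_type = "KeycloakEvents"   # becomes KeycloakEvents_CL

body = json.dumps([{"test": "data", "timestamp": "2024-01-15T10:00:00Z"}]).encode("utf-8")
rfc1123_date = formatdate(usegmt=True)  # e.g. "Mon, 15 Jan 2024 10:00:00 GMT"

# String-to-sign as defined by the HTTP Data Collector API, signed with the decoded key.
string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs"
signature = base64.b64encode(
    hmac.new(base64.b64decode(shared_key), string_to_sign.encode("utf-8"), hashlib.sha256).digest()
).decode()

request = urllib.request.Request(
    url=f"https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
    data=body,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Log-Type": log_type,
        "x-ms-date": rfc1123_date,
        "time-generated-field": "timestamp",
        "Authorization": f"SharedKey {workspace_id}:{signature}",
    },
)
with urllib.request.urlopen(request) as response:
    print(response.status)  # 200 means the test record was accepted
If the script returns 200 but no table appears, wait for the first-ingestion delay described below before retesting.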
Custom Table Not Appearing
- Wait 5-15 minutes: Azure takes time to create custom tables on first ingestion
- Check Log-Type header: Must be in Custom Headers configuration
- Verify data format: Azure expects JSON array with timestamp field
Query Performance Issues
- Use time filters: Always filter by TimeGenerated for better performance:
  KeycloakEvents_CL
  | where TimeGenerated > ago(1h) // Always add this
- Limit result sets: Use take or top to limit results
- Optimize joins: Join on indexed fields when possible
High Ingestion Costs
- Filter events: Only forward critical event types
- Reduce batch frequency: Increase batch interval to 300-600 seconds
- Disable server logs: Turn off “Include Logs” if not needed
- Set retention policies: Configure data retention to manage costs
  - Navigate to workspace → Usage and estimated costs → Data Retention
Best Practices
- Use Managed Identity: For production, use Azure Managed Identity instead of shared keys (requires Azure-hosted connector)
- Enable Diagnostic Logging: Turn on Azure Monitor diagnostic logs for troubleshooting
- Create Automation Playbooks: Use Azure Logic Apps to automate incident response
- Set Up Notifications: Configure email/SMS alerts for critical incidents
- Regular Rule Tuning: Review and adjust analytics rules based on false positives
- Data Retention: Balance retention needs with cost (default: 90 days)
- Use Watchlists: Create watchlists for VIP users, known good IPs, etc.
Next Steps
- Create custom analytics rules for your security requirements
- Build comprehensive workbooks for monitoring
- Set up automation playbooks for incident response
- Integrate with Azure Security Center for unified security posture
- Configure data retention policies based on compliance needs
- Explore Microsoft Sentinel content hub for pre-built solutions
Support
For assistance: