# One Parser to Rule Them All
Instead of creating a custom parser per file:

1. Create a single “catch-all” custom log type for your app logs.
   - Name it something like `CustomAppLogs`.
   - Point all S3 feeds to this one log type.
2. Write a flexible parser that can handle multiple log formats.
   - Use regex patterns or JSON parsing if logs are structured differently.
   - Output the same UDM fields across all variations.
Example pseudo-strategy:

```
if line contains "ERROR":
    parse timestamp, userID, error_code
elif line contains "INFO":
    parse timestamp, userID, action
else:
    parse whatever fields exist
```
3. Configure feeds in Chronicle:
   - Point all S3 buckets/folders to this one log type.
   - Make sure file deletion is off while testing.
4. Test the parser:
   - Paste a few sample logs into the parser editor.
   - Make sure all the different log patterns map correctly to UDM fields.
5. Ingest logs:
   - The feed starts pulling logs automatically.
   - The parser runs on ingestion.
6. Verify results:
   - Go to Logs Explorer → filter by `CustomAppLogs`.
   - Check the UDM fields and adjust the parser if something looks off.
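The pseudo-strategy in step 2 can be sketched as a small Python dispatcher. The regexes, field names, and sample line formats here are illustrative assumptions, not your real log schema; it shows the shape of "one parser, many formats", not a Chronicle parser itself:

```python
import json
import re

# Hypothetical patterns mirroring the ERROR/INFO formats described above;
# adjust them to match your actual app logs.
ERROR_RE = re.compile(r'(?P<timestamp>\S+)\s+ERROR\s+(?P<error_code>\w+):\s+(?P<message>.*)')
INFO_RE = re.compile(r'(?P<timestamp>\S+)\s+INFO\s+user=(?P<user>\w+)\s+action=(?P<action>\w+)')

def parse_line(line: str) -> dict:
    """Normalize one raw log line into a flat dict of UDM-style fields."""
    line = line.strip()
    if line.startswith("{"):  # structured JSON logs
        data = json.loads(line)
        return {
            "event.event_time": data.get("timestamp"),
            "principal.user.userid": data.get("user"),
            "target.action": data.get("action"),
            "metadata.log_format": "JSON",
        }
    m = ERROR_RE.match(line)
    if m:
        return {**m.groupdict(), "metadata.log_format": "TEXT_ERROR"}
    m = INFO_RE.match(line)
    if m:
        return {**m.groupdict(), "metadata.log_format": "TEXT_INFO"}
    # Fallback: keep the whole line so nothing is silently dropped
    return {"target.details": line, "metadata.log_format": "FALLBACK"}
```

Whatever branch fires, every output dict carries the same `metadata.log_format` key, which is what makes the downstream UDM mapping uniform.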
## Why this works

- You reduce parser maintenance: no need to write one parser per file.
- One parser handles multiple log patterns.
- Cleaner feeds make troubleshooting easier.
@RHYUGEN Thanks for the response.
Just to check whether one of my parsers is working, I tried to ingest the logs by creating a feed. But the raw logs are not auto-populating in the parser editor. I tried pasting them manually and that works. When I test the parser code and click Validate, it says that no raw logs are found.
I'm new to the SecOps Chronicle tool. First of all, could someone clearly explain how custom parsers and feeds work? I need to understand the flow.
```yaml
metadata:
  name: custom_app_logs_parser
  display_name: Custom App Logs Parser
  version: 1.0
  description: >
    A unified parser for multiple app log formats (INFO, ERROR, JSON).
    Designed to normalize diverse log inputs into UDM fields under one log type.
  log_type: custom_app_logs
  author: rhyugen
  last_updated: 2025-10-10

parser:
  # Top-level parsing logic
  expressions:
    # Detect JSON logs
    - name: json_parser
      filter: 'startswith($raw_log, "{")'
      extract_json:
        json_string: $raw_log
      map:
        event.event_time: json.timestamp
        principal.user.userid: json.user
        target.action: json.action
        target.resource.name: json.resource
        severity: json.severity
        metadata.log_format: "JSON"

    # Detect ERROR logs
    - name: error_parser
      filter: 'contains($raw_log, "ERROR")'
      regex:
        pattern: '(?P<timestamp>\S+)\s+ERROR\s+(?P<error_code>\w+):\s+(?P<message>.*)'
      map:
        event.event_time: timestamp
        event.type: "ERROR"
        target.resource.name: error_code
        target.details: message
        severity: "HIGH"
        metadata.log_format: "TEXT_ERROR"

    # Detect INFO logs
    - name: info_parser
      filter: 'contains($raw_log, "INFO")'
      regex:
        pattern: '(?P<timestamp>\S+)\s+INFO\s+user=(?P<user>\w+)\s+action=(?P<action>\w+)'
      map:
        event.event_time: timestamp
        principal.user.userid: user
        target.action: action
        event.type: "INFO"
        severity: "LOW"
        metadata.log_format: "TEXT_INFO"

    # Fallback parser (catch-all)
    - name: fallback_parser
      filter: 'true'
      regex:
        pattern: '(?P<timestamp>\S+)\s+(?P<message>.*)'
      map:
        event.event_time: timestamp
        target.details: message
        event.type: "UNKNOWN"
        severity: "MEDIUM"
        metadata.log_format: "FALLBACK"

# UDM Mapping Configuration
udm:
  version: 1.0
  fields:
    - event.event_time
    - event.type
    - principal.user.userid
    - target.action
    - target.details
    - target.resource.name
    - severity
    - metadata.log_format
```