
Hi,

I'm trying to set up Google SecOps (Chronicle) in my client's environment. They are using legacy Chronicle and are not ready to upgrade to the latest version.

I'm ingesting logs from .log files in an AWS S3 bucket, and I planned to create a feed for each file. All my log files come from a custom application and I couldn't find a matching supported log type, so I created a custom log type for each file. After creating the feeds, I wanted to create a parser to map the raw data to UDM fields. I tested my parser code by manually entering a raw log in the parser editor, and it was converted to UDM fields, so my parser code is working as expected.

 

Now, I clicked the Validate option in the parser editor to save my parser. It checked for raw logs and validation failed. So I created feeds using the custom log type. The feed is also working: I configured it to delete files after transferring the logs, and the file in the AWS bucket was deleted. I went back to validate my parser, but it didn't auto-populate the raw logs in the parser editor and validation still failed.

Could someone help me understand what's happening here, and how to ingest the logs and parse them with my custom parser?
 

Thanks.

# One Parser to Rule Them All

 

Instead of creating a custom parser per file:

 

1. Create a single “catch-all” custom log type for your app logs.
   - Name it something like CustomAppLogs.
   - Point all S3 feeds to this log type.

2. Write a flexible parser that can handle multiple log formats.
   - Use regex patterns or JSON parsing if logs are structured differently.
   - Output the same UDM fields across all variations.

Example pseudo-strategy:

    if line contains "ERROR":
        parse timestamp, userID, error_code
    elif line contains "INFO":
        parse timestamp, userID, action
    else:
        parse whatever fields exist
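As a quick local sanity check before wiring this dispatch into a parser, the strategy above can be sketched in Python. The log formats, regexes, and field names here are assumptions about what a custom app might emit; adjust them to your actual log lines:

```python
import re

# Hypothetical sample patterns; replace with your app's actual formats.
ERROR_RE = re.compile(r'(?P<timestamp>\S+)\s+ERROR\s+(?P<error_code>\w+):\s+(?P<message>.*)')
INFO_RE = re.compile(r'(?P<timestamp>\S+)\s+INFO\s+user=(?P<user>\w+)\s+action=(?P<action>\w+)')

def parse_line(line: str) -> dict:
    """Route one raw log line to the matching pattern, with a catch-all fallback."""
    if "ERROR" in line:
        m = ERROR_RE.match(line)
        if m:
            return {"format": "TEXT_ERROR", **m.groupdict()}
    elif "INFO" in line:
        m = INFO_RE.match(line)
        if m:
            return {"format": "TEXT_INFO", **m.groupdict()}
    # Fallback: keep whatever fields exist so no event is dropped.
    ts, _, rest = line.partition(" ")
    return {"format": "FALLBACK", "timestamp": ts, "message": rest}
```

Testing this dispatch locally against a handful of real lines is a cheap way to confirm every variation lands in a branch before you touch the parser editor.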

 

 

3. Configure feeds in Chronicle:
   - Point all S3 buckets / folders to this one log type.
   - Make sure file deletion is off while testing, so you can re-run ingestion against the same files.

4. Test the parser:
   - Use a few sample logs in the parser editor.
   - Make sure all the different log patterns map correctly to UDM fields.

5. Ingest logs:
   - The feed starts pulling logs automatically.
   - The parser runs on ingestion.

6. Verify results:
   - Go to Logs Explorer → filter by CustomAppLogs.
   - Check the UDM fields and adjust the parser if something looks off.
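To verify in UDM search rather than Logs Explorer, you can filter on the log type label. The exact name is an assumption here; it must match whatever you registered the custom log type as:

```
metadata.log_type = "CUSTOM_APP_LOGS"
```

If events come back with that filter but key fields are empty, the feed is working and the gap is in the parser's field mapping.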

 

Why this works:

- You reduce parser maintenance: no need for one parser per file.
- One parser handles multiple log patterns.
- Cleaner feeds make troubleshooting easier.


@RHYUGEN Thanks for the response.

Just to check whether one of my parsers is working, I tried to ingest the logs by creating a feed, but I'm not getting the raw logs auto-populated in the parser editor. Pasting a raw log manually works and the parser code converts it, but when I click Validate it says that no raw logs are found.

I'm new to the SecOps Chronicle tool. Could someone clearly explain how custom parsers and feeds work together? I need to understand the flow.


metadata:
  name: custom_app_logs_parser
  display_name: Custom App Logs Parser
  version: 1.0
  description: >
    A unified parser for multiple app log formats (INFO, ERROR, JSON).
    Designed to normalize diverse log inputs into UDM fields under one log type.
  log_type: custom_app_logs
  author: rhyugen
  last_updated: 2025-10-10

parser:
  # Top-level parsing logic
  expressions:
    # Detect JSON logs
    - name: json_parser
      filter: 'startswith($raw_log, "{")'
      extract_json:
        json_string: $raw_log
      map:
        event.event_time: json.timestamp
        principal.user.userid: json.user
        target.action: json.action
        target.resource.name: json.resource
        severity: json.severity
        metadata.log_format: "JSON"

    # Detect ERROR logs
    - name: error_parser
      filter: 'contains($raw_log, "ERROR")'
      regex:
        pattern: '(?P<timestamp>\S+)\s+ERROR\s+(?P<error_code>\w+):\s+(?P<message>.*)'
      map:
        event.event_time: timestamp
        event.type: "ERROR"
        target.resource.name: error_code
        target.details: message
        severity: "HIGH"
        metadata.log_format: "TEXT_ERROR"

    # Detect INFO logs
    - name: info_parser
      filter: 'contains($raw_log, "INFO")'
      regex:
        pattern: '(?P<timestamp>\S+)\s+INFO\s+user=(?P<user>\w+)\s+action=(?P<action>\w+)'
      map:
        event.event_time: timestamp
        principal.user.userid: user
        target.action: action
        event.type: "INFO"
        severity: "LOW"
        metadata.log_format: "TEXT_INFO"

    # Fallback parser (catch-all)
    - name: fallback_parser
      filter: 'true'
      regex:
        pattern: '(?P<timestamp>\S+)\s+(?P<message>.*)'
      map:
        event.event_time: timestamp
        target.details: message
        event.type: "UNKNOWN"
        severity: "MEDIUM"
        metadata.log_format: "FALLBACK"

# UDM Mapping Configuration
udm:
  version: 1.0
  fields:
    - event.event_time
    - event.type
    - principal.user.userid
    - target.action
    - target.details
    - target.resource.name
    - severity
    - metadata.log_format
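The config above is a design sketch, not the Chronicle parser engine, so the mapping logic can be sanity-checked locally. The snippet below simulates two of the branches (helper names and sample lines are hypothetical) and confirms they emit the same shared UDM field names, which is the whole point of the one-parser approach:

```python
import json
import re

# Shared UDM fields that every branch of the parser is expected to populate.
SHARED_UDM_FIELDS = {"event.event_time", "metadata.log_format"}

def map_json(line):
    """Simulate the json_parser branch's map section."""
    j = json.loads(line)
    return {
        "event.event_time": j.get("timestamp"),
        "principal.user.userid": j.get("user"),
        "target.action": j.get("action"),
        "metadata.log_format": "JSON",
    }

def map_error(line):
    """Simulate the error_parser branch; returns None when the regex misses."""
    m = re.match(r'(?P<timestamp>\S+)\s+ERROR\s+(?P<error_code>\w+):\s+(?P<message>.*)', line)
    if not m:
        return None
    return {
        "event.event_time": m["timestamp"],
        "target.resource.name": m["error_code"],
        "target.details": m["message"],
        "metadata.log_format": "TEXT_ERROR",
    }

events = [
    map_json('{"timestamp": "2025-10-10T12:00:00Z", "user": "alice", "action": "login"}'),
    map_error('2025-10-10T12:00:05Z ERROR E42: disk full'),
]
# Every branch should at least cover the shared fields.
assert all(SHARED_UDM_FIELDS <= e.keys() for e in events)
```

Running checks like this against a sample of each format catches mapping drift between branches before you paste raw logs into the parser editor.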