Question

Granular breakdown of GCP_CLOUDAUDIT volume to optimize ingestion costs

  • November 20, 2025
  • 1 reply
  • 27 views

florinmircea

Hello community,

We are currently ingesting a high volume (20 TB+/month) of GCP_CLOUDAUDIT logs into Google SecOps, specifically 'Data Access' logs (a customer requirement).

I need to optimize this ingestion to reduce costs, but I lack visibility into which specific GCP Projects or Resources (e.g., specific GCS Buckets or BigQuery Tables) are generating the bulk of this volume. The default 'Data Ingestion & Health' dashboard aggregates by Log Type (e.g., 'GCP_CLOUDAUDIT') but does not break it down by the granular source project or resource name found in the payload.

I’m trying to create a dashboard that visualizes GCP_CLOUDAUDIT volume grouped by, for example:

  1. GCP Project ID (extracted from metadata.product_log_id or resource.name)
  2. Resource Name (e.g., the specific noisy storage bucket)

Has anyone built a dashboard like this, or a similar breakdown, that could point me in the right direction?
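(For context, the closest I have gotten is a UDM search/dashboard query along these lines. The exact field paths, e.g. `principal.cloud.project.id` and `target.resource.name`, are assumptions on my part; they depend on how the GCP_CLOUDAUDIT parser maps the payload, so they would need to be verified against actual parsed events.)

```
// Sketch: count GCP_CLOUDAUDIT events grouped by project and resource.
// Field paths are assumptions — verify them against your parsed UDM events.
metadata.log_type = "GCP_CLOUDAUDIT"
match:
  principal.cloud.project.id, target.resource.name
outcome:
  $event_count = count(metadata.id)
order:
  $event_count desc
```

Event count is only a proxy for ingested bytes, but it should at least surface the noisiest projects and buckets even if it doesn't give exact byte volumes per resource.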

 

Thank you in advance.

1 reply

kentphelps
Staff
  • December 1, 2025