
Hi folks,

I’m using the following Python code to retrieve ingested log types from the last 90 days. The code works fine in existing SIEM environments, but for roughly the last two months, running it against newly deployed SIEMs fails with the following error:


Impossible to retrieve log types from BigQuery: 404 Not found: Table chronicle-clientname:datalake.events was not found in location EU

All SIEMs are deployed in the same region, and the issue only seems to affect the new environments.

Code snippet

def get_log_type(self):
    """
    Retrieves log types that have been ingested into the SIEM system.

    Returns:
        list: A list containing log types that were active in the past 90 days.
        False: If an exception occurs.
    """
    try:
        # Retrieve BigQuery credentials
        BQ_SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
        bq_credentials = service_account.Credentials.from_service_account_info(
            self.bigquery_api_json, scopes=BQ_SCOPES
        )
    except Exception:
        logger.exception("Impossible to retrieve BigQuery credentials")
        return False

    try:
        # Create BigQuery client
        bq_client = bigquery.Client(
            credentials=bq_credentials, project=bq_credentials.project_id
        )
        ## for query check testsiem1
        query = """
            SELECT
              metadata.log_type AS events_metadata__log_type
            FROM `datalake.events` AS events
            WHERE
              TIMESTAMP_MICROS(
                IFNULL(metadata.event_timestamp.seconds, 0) * 1000000
                + CAST(IFNULL(metadata.event_timestamp.nanos, 0) / 1000 AS INT64)
              ) >= TIMESTAMP(DATETIME_ADD(
                DATETIME(TIMESTAMP_TRUNC(TIMESTAMP_TRUNC(CURRENT_TIMESTAMP(), DAY), MONTH)),
                INTERVAL -2 MONTH
              ))
              AND TIMESTAMP_MICROS(
                IFNULL(metadata.event_timestamp.seconds, 0) * 1000000
                + CAST(IFNULL(metadata.event_timestamp.nanos, 0) / 1000 AS INT64)
              ) < TIMESTAMP(DATETIME_ADD(
                DATETIME(TIMESTAMP(DATETIME_ADD(
                  DATETIME(TIMESTAMP_TRUNC(TIMESTAMP_TRUNC(CURRENT_TIMESTAMP(), DAY), MONTH)),
                  INTERVAL -2 MONTH
                ))),
                INTERVAL 3 MONTH
              ))
            GROUP BY 1
            ORDER BY 1
            LIMIT 500
        """

        query_job = bq_client.query(query)
        q_results = query_job.result()
        batch_log_entries = [
            row.events_metadata__log_type
            for row in q_results
            if row.events_metadata__log_type
        ]

        logger.debug(f"+++ Log types found: {len(batch_log_entries)}")
        return batch_log_entries
    except Exception as e:
        logger.exception(
            f"Impossible to retrieve log types from BigQuery: {str(e)}"
        )
        return False

Things I tried

  • Verified that the same code works in older SIEM environments.

  • Checked that all environments are deployed in the same region.

  • Confirmed that the service account has access to BigQuery.

  • Reviewed the dataset reference (datalake.events) in the query.
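One more check I could run, in case it helps: fully qualify the table reference with the project ID instead of relying on the BigQuery client's default project, so the query is unambiguous about which project's dataset it targets. A minimal sketch — the project ID below is a placeholder, not the real one:

```python
# Sketch: build the query against a fully qualified table reference rather
# than the bare `datalake.events`. The project ID is a placeholder.
def qualified_events_table(project_id: str) -> str:
    """Return a fully qualified reference to the Chronicle data lake table."""
    return f"`{project_id}.datalake.events`"

query = f"""
SELECT metadata.log_type AS events_metadata__log_type
FROM {qualified_events_table("chronicle-clientname")} AS events
LIMIT 1
"""
```

If this fails with the same 404 on a new environment, the table itself is missing from the project rather than the reference being resolved against the wrong default project.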

Question

Has anyone seen this issue before? Could the error be related to dataset location, naming differences, or something else with new Chronicle SIEM deployments?

Any guidance would be appreciated.

Thanks in advance!

As of this spring, Google no longer provides the embedded BigQuery data lake for any license level below Enterprise+. There have been e-mail notifications, and this is also published on the Feature deprecations page: https://cloud.google.com/chronicle/docs/deprecations


If it's an E+ instance, I’d open a support ticket with the issue. For licenses below E+, you can use the recently launched Bring-your-own BigQuery (BYOBQ): https://cloud.google.com/chronicle/docs/reports/export-to-customer-managed-project
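If you go the BYOBQ route, the change to the snippet above would mostly be the table reference: point it at the customer-managed project and dataset that the export writes into. A rough sketch — the project and dataset names below are placeholders, to be replaced with the values from your own export configuration:

```python
# Sketch only: placeholders, not real names. Use the project and dataset
# from your own BYOBQ export configuration.
EXPORT_PROJECT = "my-export-project"   # customer-managed project
EXPORT_DATASET = "my_export_dataset"  # dataset the export writes into

def export_events_table(project: str, dataset: str, table: str = "events") -> str:
    """Build a fully qualified table reference for the exported data."""
    return f"`{project}.{dataset}.{table}`"

table_ref = export_events_table(EXPORT_PROJECT, EXPORT_DATASET)
# table_ref then replaces `datalake.events` in the FROM clause of the query
```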


-mike

