Multi Tenant SIEM instances

  • June 25, 2024
  • 5 replies
  • 282 views

migueltubia

Hi all,

As far as I know, it is possible to use Chronicle SIEM in multi-tenant environments, using labels to "separate" the information for each client. I have a few questions about this approach:

  • Since several clients use the same Chronicle instance, how is the information for each client separated? We understand that it is not a physical separation, but a logical one. Are there any details on this?
  • Do you recommend using the "environment" field for this separation of clients or does it have another function?
  • Also, since rules are executed against all events they match, what would be good practice to delimit the analysis so it is not mixed between clients? Is this done automatically based on some field, or should the logic be added to each rule? We have not seen any documentation on this, but we understand that the rule logic must account for multi-tenancy; it does not seem to be handled internally.

Thanks for your help.

Regards.

M.

5 replies

jpetitg
  • Bronze 1
  • June 26, 2024

Hi,

What you are describing here is the new feature called DataRBAC.

The DataRBAC feature was announced for everyone just last week (18th of June).

Segregation uses scopes to delimit which data a user or group of users can search in Google SecOps.

Those scopes can be log types, namespaces, ingestion labels and/or custom UDM searches to match what is needed.
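For instance, a hedged sketch (the field name and customer value here are assumptions, not taken from this thread): a scope based on a custom UDM search that matches a single tenant's namespace could be as simple as:

```
principal.namespace = "customer_a"
```

Whether the customer tag lives in `principal.namespace`, an ingestion label, or a custom field depends on how your forwarders and feeds stamp events at ingestion time.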

Scoped users will see a popup mentioning that they do not see all the logs on the platform; apart from that, it is pretty transparent. They perform raw log (RLS) or UDM searches and get results only for logs matching their scope.

Your custom detection rules can either be global (leveraging the totality of the logs present in the platform) or bound to a scope, and therefore run only on the data within that scope.

Official documentation on this feature can be found here: https://cloud.google.com/chronicle/docs/administration/datarbac-overview


migueltubia
  • Author
  • Bronze 1
  • June 27, 2024

Hi!

thanks for your response.

As I understand it, with scopes I would need to duplicate all the detection rules, one per customer. This is useful for custom rules, for example.
Some rules are the same for all clients. Is there a way to run these rules without a scope, so they access all the data, but group/separate the results by client? Note that the client identifier comes in a field of the logs.

Thanks again!

M.


AbdElHafez
Staff
  • Staff
  • June 27, 2024

Hi,

  • How are you ingesting the events that your rule is monitoring? Are they all ingested through forwarders?

Thanks.


migueltubia
  • Author
  • Bronze 1
  • July 4, 2024

Hi!

thanks for your response.

We are ingesting from forwarders AND feeds. We are adding a custom field so we know which customer the events come from. We are thinking of implementing a condition in the custom rules so that the correlation will not mix events from different customers...

But we are not sure how it works with curated rules...

Scopes would force us to create duplicates of all the rules and "attach" them to the scopes, but the maintenance would be hell...

Regards.

M.


AbdElHafez
Staff
  • Staff
  • July 10, 2024

I was going to suggest including the custom field you are using as a customer ID in the match variables, so that matches will be specific per customer.
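As a minimal YARA-L sketch of this idea (the field names are assumptions — here the customer ID is assumed to live in `principal.namespace`, and the detection logic is a generic failed-login example, not from this thread), adding the customer identifier to the `match` section keeps each correlation window per customer:

```
rule repeated_auth_failures_per_customer {
  meta:
    description = "Sketch: correlate failed logins without mixing customers"

  events:
    $e.metadata.event_type = "USER_LOGIN"
    $e.security_result.action = "BLOCK"
    // Assumed location of the customer identifier; adjust to wherever
    // your forwarders/feeds stamp the tenant.
    $e.principal.namespace = $customer
    $e.principal.user.userid = $user

  match:
    // Grouping on $customer ensures a match can never span two customers.
    $customer, $user over 15m

  condition:
    #e > 5
}
```

Because `$customer` is a match variable, every detection the rule produces is bound to exactly one customer value, so a single global rule can serve all tenants without duplicating it per scope.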

Thanks,

Hafez