Solved

Google SecOps SOAR Jobs Resource Limits

  • March 14, 2025
  • 2 replies
  • 66 views


Hello everyone,

I’m trying to better understand the resource limits when running jobs in Google SecOps SOAR so I can work out which use cases they’re suited for. Specifically, I’d like to know:

  • What is the maximum RAM and CPU usage a single job can reach?
  • What is the maximum size of data a job can process?

Any insights, documentation references, or personal experiences would be greatly appreciated!

Thanks in advance.

Best answer by SoarAndy


2 replies

SoarAndy
Staff
  • Answer
  • March 14, 2025

I don't have a specific answer: jobs run as cloud functions, not on a local VM, so I can't give exact figures.

That said, jobs are designed for frequent runs (the default interval is every 10 seconds) on small amounts of data.

If you want to run something bigger and less frequently, I suggest using a native Cloud Function, or, worst case, using a CRON Connector to trigger a playbook every day/hour and doing the processing there (though again, playbooks are not designed for heavy data processing; that is not the point of SOAR).

What did you have in mind?


  • Author
  • Bronze 5
  • March 19, 2025



Thank you for the answer @SoarAndy 

I will be processing around 200 MB of data daily, so I think I will just use Cloud Functions for that.
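
For anyone landing here later, a minimal sketch of the chunked-processing pattern that keeps memory bounded when handling a ~200 MB payload inside a memory-limited Cloud Function. This is illustrative only, not a SecOps SOAR or GCP API: the names `CHUNK_SIZE` and `process_stream` are made up, and the per-chunk work here is just a hash as a stand-in for real processing. In an actual Cloud Function the stream would typically come from a Cloud Storage object rather than an in-memory buffer.

```python
# Hypothetical sketch: stream a large payload in fixed-size chunks so peak
# memory stays well below the function's allocation, instead of loading
# all ~200 MB at once.
import hashlib
import io

CHUNK_SIZE = 8 * 1024 * 1024  # read 8 MiB at a time to bound memory use


def process_stream(stream: io.BufferedIOBase) -> str:
    """Consume a byte stream chunk by chunk; here the 'processing' is a hash."""
    digest = hashlib.sha256()
    total = 0
    while chunk := stream.read(CHUNK_SIZE):
        digest.update(chunk)  # stand-in for real per-chunk processing
        total += len(chunk)
    return f"{total} bytes, sha256={digest.hexdigest()[:12]}"


# Simulated daily run over a small in-memory sample; swap the BytesIO for a
# Cloud Storage blob's open('rb') handle in a real function.
summary = process_stream(io.BytesIO(b"x" * (1024 * 1024)))  # 1 MiB sample
print(summary)
```

The point is just that memory scales with `CHUNK_SIZE`, not with the payload, so the same code handles 1 MiB or 200 MB.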