GCP Workflows max timeout too restrictive

Is there a way to set a larger timeout? The current limit of 1800 seconds is too restrictive. I am using Workflows together with GCP Batch (each step is a GCP Batch job), and my jobs need to run for more than 1800 seconds. If Workflows cannot increase the timeout limit, is there an alternative solution within the GCP API that would allow me to schedule several steps of GCP Batch jobs?


Hi @gradientopt,

Welcome to Google Cloud Community!

At the moment, per the documentation on invoking an HTTP endpoint, the maximum timeout is 1800 seconds before an exception is thrown.

I would suggest filing a feature request so that our engineers can take a look at this. Please be advised that we don't have a specific ETA for this, but you can keep track of its progress once the ticket has been created.

I am facing the same difficulty when using GCP Batch. Is there a way to address it? Right now the outcome is that my workflow reports a failure due to the 1800s timeout, but the Batch job itself keeps running normally.
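
One common workaround for this is to avoid a single long-blocking HTTP call entirely: submit the Batch job, then poll its state in a loop, so each individual HTTP call returns well within the 1800-second limit. Below is a minimal Workflows YAML sketch of that polling pattern; the project, location, and job name are placeholders, and it assumes the Batch job has already been created:

```yaml
main:
  params: [args]
  steps:
    - init:
        assign:
          # Placeholder values - replace with your own project/location/job.
          - jobName: "projects/my-project/locations/us-central1/jobs/my-batch-job"
    - pollJob:
        # Each poll is a short HTTP call, well under the 1800s limit.
        call: http.get
        args:
          url: ${"https://batch.googleapis.com/v1/" + jobName}
          auth:
            type: OAuth2
        result: jobStatus
    - checkDone:
        switch:
          - condition: ${jobStatus.body.status.state == "SUCCEEDED"}
            next: done
          - condition: ${jobStatus.body.status.state == "FAILED"}
            raise: ${jobStatus.body}
        next: wait
    - wait:
        # Sleep between polls to stay within quota; tune the interval as needed.
        call: sys.sleep
        args:
          seconds: 60
        next: pollJob
    - done:
        return: ${jobStatus.body}
```

Because the waiting happens via `sys.sleep` inside the workflow rather than inside a blocked HTTP request, the overall execution can outlast the 1800-second per-call limit (workflow executions themselves can run much longer).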

For those interested in increased Cloud Workflows timeouts, please DM me with the following details to help inform the Workflows team on how best to advise:

  • Overall objectives and use case
  • Any system constraints
  • Expected performance/latency requirements
  • Traffic volumes and regions
  • Concurrent operations (if any)
  • Upstream and downstream APIs being used