Cost of training GenAI

Hi,

I can't see a training cost estimator in the Google Pricing Calculator. Accelerators have to be enabled by request in europe-west4 (or us-central1), and the Accelerator pricing is listed at https://cloud.google.com/vertex-ai/pricing.

This post by @rubenszmm also covers costs; have these figures been confirmed?
"Is it TPU V3 64 cores, supposing is the double of 32 cores, will cost 64 USD/hour?" I was told by tech support that I need 32 TPUs enabled as it goes it in increments of 32 (I would've thought for cores not number or TPUs). So will it cost me 64 USD/hour or 32 x 64 USD/hour?

https://www.googlecloudcommunity.com/gc/AI-ML/Custom-Tune-of-LLM-in-Generative-AI-Studio-training-ti... 

Can anyone please clarify how much they've paid for 1 hour of training?


To clarify the pricing for Google Cloud TPUs (Tensor Processing Units): the cost is typically based on the total number of TPU chips enabled (billed per chip-hour in USD), not the number of cores. Pricing varies by product, deployment model, and region.

The breakdown: the Cloud TPU used was a v3-64, which is 64 cores, or 32 TPU chips (2 cores per chip). The TPU v3 on-demand price in europe-west4 is $2.00 per chip-hour, so the calculation is 32 x $2.00 = $64.00 per hour.
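To make the arithmetic concrete, here is a minimal Python sketch of the estimate, assuming the $2.00 per chip-hour on-demand rate quoted above (re-check the rate and slice increments against the pricing page before using it):

# Rough on-demand cost estimate for a TPU v3 slice in europe-west4.
# Assumes the $2.00 per chip-hour rate quoted above; verify the current
# rate at https://cloud.google.com/vertex-ai/pricing before relying on it.

CORES_PER_CHIP = 2               # each TPU v3 chip has 2 cores
PRICE_PER_CHIP_HOUR_USD = 2.00   # on-demand, europe-west4 (as quoted above)

def training_cost_usd(tpu_cores: int, hours: float) -> float:
    """Estimated on-demand cost in USD for a TPU v3 slice with `tpu_cores` cores."""
    chips = tpu_cores // CORES_PER_CHIP
    return chips * PRICE_PER_CHIP_HOUR_USD * hours

# A v3-64 slice (64 cores = 32 chips) for 1 hour:
print(training_cost_usd(tpu_cores=64, hours=1))   # -> 64.0

So one hour on a v3-64 comes to $64 total, not 32 x $64.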

It's important to note that pricing details can change, so it's advisable to verify this information with Google Cloud's official documentation.

Thank you @Poala_Tenorio. How is the number of TPUs decided for the model training? Can it run with 1 TPU or 64 TPUs, etc.? Or can I add as many TPUs as I want, which would make it faster for a large model? Since it's a small training job, I would have imagined 1 TPU is enough.