
Speeding up batch prediction in Vertex AI on Tabular data to generate explanations

https://www.devze.com 2022-12-07 23:31 Source: web
I am trying to run a batch prediction, while generating explanations, on tabular data with 100 features, 1.7 million rows, and a storage size of 1.4 GB.

I am finding that my batch prediction takes 20 minutes when the generate_explanation flag is set to False, or when I run it on only 50 rows of data, but it can take 15+ hours with generate_explanation=True on the full dataset.

I am using this method here: https://cloud.google.com/python/docs/reference/aiplatform/latest/google.cloud.aiplatform.Model#google_cloud_aiplatform_Model_batch_predict

I have tried increasing these parameters from this:

starting_replica_count=20,
max_replica_count=20,
batch_size=64

to this (the job is currently running):

starting_replica_count=40,
max_replica_count=40,
batch_size=128
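For context, a minimal sketch of how these parameters fit into the batch_predict call described above. The job name, project, bucket URIs, and machine type below are hypothetical placeholders, not values from the question; the keyword arguments follow the google.cloud.aiplatform.Model.batch_predict signature linked earlier.

```python
# Sketch of the batch prediction setup under discussion. Only the
# keyword-argument construction runs here; the actual API call (commented
# out) requires GCP credentials and an uploaded Vertex AI model.

def batch_predict_kwargs(with_explanations: bool) -> dict:
    """Collect the keyword arguments varied in this question."""
    return {
        "job_display_name": "tabular-batch-prediction",  # hypothetical name
        "machine_type": "n1-standard-4",                 # hypothetical type
        "starting_replica_count": 40,
        "max_replica_count": 40,
        "batch_size": 128,
        "generate_explanation": with_explanations,
    }

# In a real run (hypothetical project/bucket names):
# from google.cloud import aiplatform
# aiplatform.init(project="my-project", location="us-central1")
# model = aiplatform.Model("projects/my-project/locations/us-central1/models/123")
# job = model.batch_predict(
#     gcs_source="gs://my-bucket/input.csv",
#     gcs_destination_prefix="gs://my-bucket/output/",
#     **batch_predict_kwargs(with_explanations=True),
# )
```

Note that explanation generation (feature attributions) runs per row on top of inference, which is why toggling generate_explanation dominates the runtime far more than the replica or batch-size settings.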

Are there any suggestions on how to improve the speed?

