How to process 20,000 records in Application Integration?

Hi Team,

I want to load a dataset of 20,000 records into BigQuery using Application Integration.

I tried loading it by creating a parent and child integration, but the performance was very poor. With the for-each loop task, I could load only about 1,300 records into BigQuery in 60 minutes (the execution timeout is 60 minutes). I also tried the for-each parallel task, but it failed with a resource-exhausted error. I know that at the moment Application Integration cannot process 20,000 records in a single execution; there is a limit of 8,000 records, but processing those 8,000 records takes approximately 6-7 hours, which is very poor performance. Is there a more efficient way to do this?

Kindly suggest the most efficient way to bulk-load this data into any database.

Thanks. 

Solved
1 ACCEPTED SOLUTION

You might want to try the for-each parallel task combined with a private trigger. You will need to experiment with the parallelism setting to stay within resource limits. Note that all parallel executions under a private trigger consume the same execution's memory threshold.
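The tuning step above can be sketched outside Application Integration: chunk the 20,000 records into batches so that each parallel branch handles one batch, then adjust the batch size (and therefore the branch count) until executions stay under the memory threshold. This is a minimal illustration; the batch size of 500 is an assumed starting point, not a recommended value.

```python
# Sketch: split a large record set into batches for parallel branches.
# BATCH_SIZE is an assumption -- tune it, as the solution suggests,
# to stay within Application Integration's resource limits.

def make_batches(records, batch_size):
    """Split records into consecutive batches of at most batch_size items."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

records = [{"id": n} for n in range(20_000)]   # stand-in for the real rows
batches = make_batches(records, batch_size=500)

print(len(batches))      # 40 work items for the for-each parallel task
print(len(batches[-1]))  # 500 records in the final batch (20,000 divides evenly)
```

Each batch would then be passed to one parallel branch (for example, as the list the for-each parallel task iterates over), so no single branch holds all 20,000 records at once.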
