Application Integration - MySQL connector task

np1
Bronze 3

I am trying to copy data from one MySQL table to another MySQL table, but while fetching data only 25 rows are passed to the second flow (which loads the data into the other table) instead of the full data set.

Below are my configurations,

Flow1: to fetch data from the 1st table

[screenshots of the flow1 connector task configuration]

Flow2: to load data into the 2nd table

After testing, flow1 always ends in the Succeeded state, but only 25 rows are inserted. Can anyone suggest how to fetch the full data?

Solved
1 ACCEPTED SOLUTION

Hi @np1 

Spookily enough, we were just looking at this yesterday!

On your mapping step you can set ListEntitiesPageSize, so you can load in many more rows. Depending on your payload size you may be able to get, say, 500 rows at a time. (There is a connector payload limit of 8 MB according to https://cloud.google.com/application-integration/docs/quotas#data-processing-limits)

If you need more rows you will really need to loop: when you call the first 'page' of data, consume the returned variable called listEntitiesNextPageToken. This then has to be fed into the connector's listEntitiesPageToken input on your next loop iteration to get, say, the 2nd page of data.
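To make the looping concrete, here is a minimal Python sketch of that token-driven pagination pattern. This is an illustration only: list_entities() is a hypothetical stand-in for the Connectors task's List Entities operation (in a real integration the loop would be a While loop around the Connectors task itself), and the fake table merely simulates the MySQL source.

```python
# Minimal sketch of the token-driven pagination loop (assumed names).
# list_entities() is a hypothetical stand-in for the Connectors task's
# List Entities operation; in a real integration this is a Connectors
# task inside a While loop, not a Python function.

FAKE_TABLE = [{"id": i} for i in range(60)]  # pretend MySQL source table

def list_entities(page_size, page_token=None):
    start = int(page_token or 0)
    rows = FAKE_TABLE[start:start + page_size]
    # The connector returns a next-page token while more rows remain.
    end = start + page_size
    next_token = str(end) if end < len(FAKE_TABLE) else None
    return rows, next_token

all_rows, token = [], None
while True:
    # page_size=25 mirrors the connector's default page size; token plays
    # the role of listEntitiesPageToken on each iteration.
    rows, token = list_entities(page_size=25, page_token=token)
    all_rows.extend(rows)  # hand this page to flow2 for insertion
    if token is None:      # no listEntitiesNextPageToken => last page
        break

print(f"fetched {len(all_rows)} rows")  # 60, not just the first 25
```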

More info on it can be found here:

https://cloud.google.com/application-integration/docs/configure-connectors-task#config-prop

I haven't been able to test the latter on Postgres, as it doesn't appear to return a listEntitiesNextPageToken, but hopefully MySQL is better.

 


5 REPLIES


As per your suggestion I have set ListEntitiesPageSize to 1000 and mapped listEntitiesPageToken to listEntitiesNextPageToken, as below:

[screenshot of the updated mapping configuration]

Does this make sense? I am just trying to fetch the data in pages of 1000 rows, but it is not working.

Hi @np1 

Please take a look at this thread about fetching a large payload response from a connector: https://www.googlecloudcommunity.com/gc/Integration-Services/API-Pagination-in-Application-Integrati...

 

Let me know if this answers your question.

Hi @Meenchou,

I have tried this pagination flow but I am getting the below error:

[screenshot of the error message]

 

Hi @np1 

I was having trouble getting the next-page token back, so I raised a case with Google. It looks like you have to map both a page size and a sort column to get the token back (listEntitiesPageSize and listEntitiesSortByColumns, the latter a string array). I managed to do this and get a token back, which I then fed into a 2nd connector call (also with the page size and sort column mapped), and it returned the subsequent 25 rows.

Not sure why you have to specify a page size to get the token back, instead of it just picking up the default of 25!

It also looks like you don't need the sort definition if the table has a primary key (but I haven't tested that).
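For illustration, here is a hedged sketch of the input mapping that reportedly unlocks the token. The parameter names match the Connectors task documentation linked above; the "id" sort column is only an assumed example, so substitute your own column.

```python
# Assumed example of the List Entities input mapping that returns a
# next-page token: both the page size and the sort columns must be set.
first_page_inputs = {
    "listEntitiesPageSize": 25,           # must be mapped explicitly
    "listEntitiesSortByColumns": ["id"],  # string array, e.g. a PK column
}

# Subsequent calls add the token returned as listEntitiesNextPageToken:
next_page_inputs = {
    **first_page_inputs,
    "listEntitiesPageToken": "<token from previous call>",
}
```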

Good luck