Dual-write integration not working consistently

We have dual write (DW) enabled for wave-related data.

The defined process is that when a wave is released, all work header and line data related to the wave is inserted into a custom table, and that custom table is synced to DW using custom entities.

There was existing code to insert data into the custom table, which inserted all work headers and lines related to the wave in a single transaction. Frequently there are more than 1,000 work and line records in a single wave, so because of Microsoft's limit of 1,000 records per transaction, we changed the code to commit each header individually and to commit lines in batches of 900.
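For reference, here is a minimal sketch of that batched-commit pattern. The custom table names (MyWorkStaging, MyWorkLineStaging) and the initFrom* helpers are hypothetical placeholders; the actual names in our code differ.

```xpp
internal final class MyWaveStagingWriter    // hypothetical class name
{
    public void insertWaveStagingData(WHSWaveId _waveId)
    {
        WHSWorkTable      workTable;
        WHSWorkLine       workLine;
        MyWorkStaging     workStaging;      // hypothetical custom header table
        MyWorkLineStaging lineStaging;      // hypothetical custom line table
        int               lineCounter;

        while select workTable
            where workTable.WaveId == _waveId
        {
            // Commit each work header in its own transaction.
            ttsbegin;
            workStaging.clear();
            workStaging.initFromWorkTable(workTable);   // hypothetical helper
            workStaging.insert();
            ttscommit;

            while select workLine
                where workLine.WorkId == workTable.WorkId
            {
                if (lineCounter == 0)
                {
                    ttsbegin;
                }

                lineStaging.clear();
                lineStaging.initFromWorkLine(workLine); // hypothetical helper
                lineStaging.insert();
                lineCounter++;

                // Commit lines in batches of 900 so a single transaction
                // stays under the 1,000-records-per-transaction DW limit.
                if (lineCounter == 900)
                {
                    ttscommit;
                    lineCounter = 0;
                }
            }

            // Flush any partial batch left over for this work header.
            if (lineCounter > 0)
            {
                ttscommit;
                lineCounter = 0;
            }
        }
    }
}
```

One design note on this pattern: because each header and each 900-line batch commits separately, a mid-wave failure can leave the staging tables partially populated, which is worth ruling out as a contributor to the inconsistent counts.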

After this change, we are still facing issues in production, where orders are dropped from DW. The count is inconsistent; for example, it writes 657 records but drops 512 from the entire list, and another time it wrote 785 and dropped the rest.

We were unable to replicate the issue in the UAT environment, but yesterday the issue suddenly occurred for three waves, where none of the records synced. We paused the DW map, released a new wave, and the records showed up under queued records and then synced successfully.

There are no conditions on writing data to the custom tables; the code brings in all data related to the wave from WHSWorkTable and WHSWorkLine. There are no filters or ranges on the entity or the DW map.
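To narrow down which side is dropping rows, one option is a small diagnostic job that compares the source counts against what actually landed in the custom tables for a given wave. If the staging counts match the source but Dataverse is short, the drop is happening on the DW side rather than in the X++ insert code. A sketch, again assuming hypothetical staging tables with a WaveId field:

```xpp
internal final class MyWaveCountChecker     // hypothetical class name
{
    public static void main(Args _args)
    {
        WHSWaveId         waveId = 'USMF-000123';   // wave to check (example value)
        WHSWorkTable      workTable;
        WHSWorkTable      workTableExists;
        WHSWorkLine       workLine;
        MyWorkStaging     workStaging;      // hypothetical custom header table
        MyWorkLineStaging lineStaging;      // hypothetical custom line table
        int64             srcWork, srcLines, stgWork, stgLines;

        // Source counts from the standard WHS tables.
        select count(RecId) from workTable
            where workTable.WaveId == waveId;
        srcWork = workTable.RecId;

        select count(RecId) from workLine
            exists join workTableExists
                where workTableExists.WorkId == workLine.WorkId
                   && workTableExists.WaveId == waveId;
        srcLines = workLine.RecId;

        // Counts from the custom staging tables (hypothetical WaveId field).
        select count(RecId) from workStaging
            where workStaging.WaveId == waveId;
        stgWork = workStaging.RecId;

        select count(RecId) from lineStaging
            where lineStaging.WaveId == waveId;
        stgLines = lineStaging.RecId;

        info(strFmt("Wave %1 - source: %2 work / %3 lines; staging: %4 work / %5 lines",
            waveId, srcWork, srcLines, stgWork, stgLines));
    }
}
```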

How can we find the root cause, or what could be causing this problem? The issue is so infrequent, with no difference in the process, that we have been unable to pin it down.

My first thought is to enable plug-in trace logs in the linked Dataverse environment, examine what happens when that particular entity syncs, and see whether an actual error is generated. If you have a UAT environment, try to recreate an order this has occurred with before, along with the same conditions.

I’d have to dig through some more documentation to see if any other thoughts come to mind as far as what could be happening in the instance. I’ve not used DW around anything to do with Work, so I’m not sure how many joins/references are present. A one-to-many (1:M) relationship issue briefly comes to mind…