
It's been a couple of releases since I've had to do an S2S (Salesforce-to-Salesforce) integration, but I ran into an unexpected issue that hopefully someone can solve more effectively.

I have two orgs, sharing contacts over S2S.

Contacts in each org have an identical schema: standard fields plus custom fields. I've reproduced a base case with just two custom fields: checkbox field A and Number(18,0) field B.

Org 1 publishes field A, and subscribes to field B.

Org 2 subscribes to field A, and publishes field B.

Org 1 initiates all S2S workflow by sharing contacts to Org 2 over S2S. Org 2 has auto-accept on.

Org 2 has a Contact Before Insert trigger that simply uses field A to calculate the value of field B, e.g. if field A is checked, populate field B with 2; if unchecked, with 0. (This is of course a drastic over-simplification of what I really need to do, but it's the base reproducible case.)
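For reference, a minimal sketch of the Org 2 trigger described above (the API names `Field_A__c` and `Field_B__c` are placeholders of my own, not the real field names):

```apex
// Before Insert trigger on Contact in Org 2: derive field B from field A.
// Field_A__c (Checkbox) and Field_B__c (Number 18,0) are assumed names.
trigger ContactBeforeInsert on Contact (before insert) {
    for (Contact c : Trigger.new) {
        // Before-save field assignment: no extra DML needed.
        c.Field_B__c = (c.Field_A__c == true) ? 2 : 0;
    }
}
```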

That all works fine in Org 2 - contacts come across fine with field A, and I see the field results get calculated into field B.

The problem is that the result - field B - does not get auto-shared back to Org 1 until the next contact update. It can be as simple as me editing a non-shared field on that same contact, like "Description", in Org 2, and then I instantly see the previously calculated value of field B get pushed back to Org 1.

I'm assuming this is because the calculation of field B occurs within the Before Insert, so the S2S connection treats the current transaction as one it performed itself (I can see how this logic would make sense to prevent infinite S2S update loops).

I first tried creating a workflow field update that forcibly updated a (new, dummy) shared field whenever field B changed, but that still did not cause the update to flow back, presumably because it runs in the same execution context, which Salesforce deems exempt from re-sharing. I also tried a workflow rule that forwarded the Contact back to the connection queue when the field changed, and that didn't work either.

I then tried a re-update in an After Update trigger: if the shared field was updated, reload and re-update the shared record. That also didn't work.

I did find a solution: a Future method, called from the After Update trigger, that reloads and touches any record whose shared field was changed by the Before Update trigger. This does cause the field results to show up in near-real-time in the originating organization.
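A sketch of that workaround, assuming made-up names (`ContactS2SRepublish`, `Field_B__c`): the after trigger collects the Ids of records whose shared field was recalculated and hands them to an `@future` method, which re-saves them in a fresh transaction. Because that DML runs outside the S2S-initiated transaction, the connection treats it as a local change and publishes it back.

```apex
// Future method: re-save the records in a new transaction so S2S republishes them.
public class ContactS2SRepublish {
    @future
    public static void republish(Set<Id> contactIds) {
        // Reload and touch; the no-op update outside the S2S transaction
        // is what causes field B to be shared back to Org 1.
        update [SELECT Id FROM Contact WHERE Id IN :contactIds];
    }
}

// After trigger on Contact: queue up records whose shared field changed.
trigger ContactAfter on Contact (after insert, after update) {
    // Guard so the future call's own update doesn't re-enqueue itself.
    if (System.isFuture()) return;
    Set<Id> changed = new Set<Id>();
    for (Contact c : Trigger.new) {
        if (c.Field_B__c != null) {
            changed.add(c.Id);
        }
    }
    if (!changed.isEmpty()) {
        ContactS2SRepublish.republish(changed);
    }
}
```

As noted below, this costs one extra future call and one extra DML statement per affected batch, which is why it feels heavier than it should be.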

This solution works for me for now, but I feel like I MUST be missing something. It causes way more Future calls and DML to be executed than should be necessary.

Does anyone have a more elegant solution for this?

1 Answer


Try the following process and see if it works for you:

1. Org 1: field A is updated, which publishes the Contact.
2. Org 2: in the Before Update trigger on Contact, if field A has changed, save the Id of the Contact in a new custom object.
3. In the After Update trigger of the new custom object, update field B for the given Contact Id.
4. The update on field B will then be published back to Org 1.
