Apex loading scripts for Salesforce triggers
The College of Business (COB) also wants faculty scholarly data to appear in its Salesforce portal. This phase of the project explores loading and synchronization issues when the target data source is in the cloud.
To synchronize the Salesforce data we need to add new records, update existing records, and delete records that no longer belong. The first two steps are neatly accomplished using the upsert operation available in Salesforce's Apex Data Loader.
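For illustration only, the sketch below shows the upsert semantics in Apex code rather than through the Data Loader itself: records keyed on an external ID field are inserted when no match exists and updated in place when one does. The object and field names (Faculty_Publication__c, External_Id__c, Title__c) are hypothetical placeholders, not the project's actual schema.

    // Hypothetical anonymous-Apex sketch of upsert semantics; the object and
    // field names are illustrative assumptions.
    List<Faculty_Publication__c> pubs = new List<Faculty_Publication__c>{
        new Faculty_Publication__c(External_Id__c = 'PUB-001', Title__c = 'New article'),     // no match: inserted
        new Faculty_Publication__c(External_Id__c = 'PUB-002', Title__c = 'Revised article')  // match: updated
    };

    // Matching on the external ID field inserts new rows and updates existing
    // ones in a single call; deletions still have to be handled separately.
    Database.upsert(pubs, Faculty_Publication__c.External_Id__c, false);

Deleting records that no longer belong is the piece the upsert does not cover, which motivates the trigger-based cleanup described below.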
Existing COB Salesforce workflows employ manual data uploads and manual deletion processes to keep lists mostly up to date. We set out to create a more efficient and accurate synchronization process:
- A script file can process a set of data loading steps invoked by a single command. This is preferable to the
manual process currently in use.
- The standard upload tool does not easily let us define and execute processing logic on uploaded data. We want to build such logic into the scripted upload workflow.
Salesforce allows creation of what it calls triggers, which can invoke code written in a programming language (Apex) able to interact with stored objects. We realized that we could create a ‘command log’ table to control the desired workflows. We can, for instance, insert an object into our CommandLog object; that is loosely equivalent to inserting a row into a table in a relational database. This insertion triggers execution of a set of commands able to complete the synchronization process. Our first ‘command’ captures the system date and time to facilitate further processing without risking clock differences between platforms. New data is then upserted into the tables. When all the upserts are complete, another insert into the CommandLog fires a trigger that deletes any objects that were not touched during the upsert phase, leaving the tables properly synchronized.
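A minimal sketch of such a trigger is given below, assuming a custom CommandLog__c object with Command__c and Run_Timestamp__c fields and a Faculty_Publication__c object being synchronized; these names, and the BEGIN_SYNC/END_SYNC command values, are assumptions made for illustration rather than the project's actual implementation.

    // Hypothetical trigger on a custom CommandLog__c object; object, field,
    // and command names are assumed for illustration.
    trigger CommandLogTrigger on CommandLog__c (before insert, after insert) {
        if (Trigger.isBefore) {
            // 'Begin' command: stamp the run's start time on the log record so
            // every later step compares against a single server-side clock.
            for (CommandLog__c cmd : Trigger.new) {
                if (cmd.Command__c == 'BEGIN_SYNC') {
                    cmd.Run_Timestamp__c = System.now();
                }
            }
        } else {
            // 'End' command: after the upserts finish, delete any records the
            // upsert phase never touched; they no longer exist in the source.
            for (CommandLog__c cmd : Trigger.new) {
                if (cmd.Command__c == 'END_SYNC') {
                    CommandLog__c start = [SELECT Run_Timestamp__c
                                           FROM CommandLog__c
                                           WHERE Command__c = 'BEGIN_SYNC'
                                           ORDER BY CreatedDate DESC
                                           LIMIT 1];
                    delete [SELECT Id FROM Faculty_Publication__c
                            WHERE LastModifiedDate < :start.Run_Timestamp__c];
                }
            }
        }
    }

Under these assumptions, the scripted load would insert a BEGIN_SYNC row, run its upserts, and then insert an END_SYNC row to fire the cleanup.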
Another solution: a third-party web service. Our favored solution (still under construction as of Spring 2012) employs scripted loads and Salesforce triggers. However, we explored other solutions as well.
One neat tool ran as a client that talked to a third-party service to accomplish the synchronization process. It had several advantages: the scripting language was easy to use, and the tool could be accessed through a browser interface, features that might be crucial in some organizations. But analysis of the volume-driven pricing structure made it much less attractive for our application and setting.
Contributing students: Christian Ellison