Data Lake to Salesforce
Are you grappling with the challenge of consolidating customer data from various sources and seamlessly incorporating it into Salesforce? If so, you’re not alone. The process can be complex, especially for those without a technical background. Fortunately, Dataworkz is here to simplify the entire procedure for business users.
We’ll guide you through the steps to move data from a data lake like Google Cloud Storage (GCS) to Salesforce. With Dataworkz, you can combine customer data dumps from multiple sources and push them into Salesforce with ease. Better yet, by scheduling a recurring job, you’ll always have an up-to-date view of all your customer data within the Salesforce platform.
Let’s break down the process into three simple steps:
Begin by configuring a new Salesforce connection for the GCS bucket from which you want to enable writebacks.
Select your preferred authentication method.
Add the workspace and collection for the Salesforce app.
Upon completing the process, you will receive a prompt to sign in to your Salesforce developer account, and Dataworkz will securely save your configuration.
Once you return to the Dataworkz Salesforce configuration page, you should see your Salesforce connection.
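Behind the scenes, this connection is simply an authenticated Salesforce session. If you’d like to sanity-check the same developer-org credentials outside of Dataworkz, here is a minimal sketch using the open-source simple-salesforce Python library (not part of Dataworkz; the credentials are placeholders):

```python
from simple_salesforce import Salesforce

# Placeholder credentials for a Salesforce developer org; Dataworkz
# collects the equivalent through its OAuth sign-in prompt.
sf = Salesforce(
    username="you@example.com",
    password="your-password",
    security_token="your-security-token",
    # domain="test",  # uncomment to target a sandbox org instead
)

# If the login succeeded, this prints the org's instance hostname.
print(sf.sf_instance)
```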
Once connected, head to the Configuration tab. Click the name of the configuration that was created, and go to the writeback permissions tab. Here you can select the SFDC objects for which you want to support writeback. Follow these steps:
From the drop-down, select the SFDC objects for which you want to support writeback.
Once you have configured the writeback permissions, click Save.
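To see which objects are even eligible for writeback, Salesforce’s describe metadata reports per-object create and update permissions. A short sketch, reusing the `sf` client from the previous example:

```python
# describe() returns org-wide metadata, including one entry per object
# with its "createable" and "updateable" flags.
meta = sf.describe()
writable = [
    obj["name"]
    for obj in meta["sobjects"]
    if obj["createable"] and obj["updateable"]
]
print(writable[:10])  # e.g. Account, Contact, Lead, ...
```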
To write the datasets within the workspace and collection back to Salesforce, follow these simple steps:
Go to the GCS dataset within Dataworkz that you want to write back to Salesforce.
Find the “Transform” option in the top right of the dataset screen, where you can perform any necessary transformation functions on the dataset before pushing it to SFDC.
After defining all of the transformations, click “Execute” in the top right corner and define the target.
Select the SaaS destination workspace; the collection will be the name you gave the Salesforce configuration, in this case ‘dw_writebacks_sfdc’.
Map the unique identifier key, and then map all of the columns from the dataset that you would like to move to Salesforce (a hand-rolled sketch of this transform-and-upsert flow appears after these steps).
Choose the frequency at which you want the job to run: either as a one-time job or as a recurring job. For this example, we set the recurring frequency to once every 24 hours.
After completing the process, click “submit,” and Dataworkz will write your data back into Salesforce. Because the job is configured as recurring, any new data that arrives will automatically be written back to the SFDC entity once per day.
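To make the transform, mapping, and writeback steps concrete, here is a hedged sketch of an equivalent flow done by hand with pandas and simple-salesforce. The bucket path, column names, and the Customer_Key__c external-ID field are all illustrative, not Dataworkz’s actual internals:

```python
import pandas as pd
from simple_salesforce import Salesforce

sf = Salesforce(
    username="you@example.com",
    password="your-password",
    security_token="your-security-token",
)

# Load the customer dump from GCS (reading gs:// paths with pandas
# requires the gcsfs package); the path and columns are hypothetical.
df = pd.read_csv("gs://your-bucket/customer_dump.csv")

# A simple "Transform" step: normalize emails before pushing to SFDC.
df["email"] = df["email"].str.strip().str.lower()

# Map dataset columns to Salesforce Contact fields.
column_map = {"email": "Email", "first_name": "FirstName", "last_name": "LastName"}

for _, row in df.iterrows():
    payload = {sf_field: row[col] for col, sf_field in column_map.items()}
    # Upserting on the unique identifier key makes the recurring run
    # idempotent: existing records are updated, not duplicated.
    sf.Contact.upsert(f"Customer_Key__c/{row['customer_key']}", payload)
```

Upserting on an external ID rather than inserting is what keeps the recurring job safe to rerun: each 24-hour pass updates records that already exist and creates only the genuinely new ones.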
Use the monitoring screen to retrieve details about all the records written to Salesforce. In the event of errors, Dataworkz captures the detailed error message along with the data passed to Salesforce for follow-up action. In this example, 5 of the 9 records updated in Salesforce failed, and you can view the input message along with the detailed error message for one of the failed records.
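For comparison, the Salesforce Bulk API returns one result per input record, which is the same pairing of input payload and error detail that the monitoring screen surfaces. A small sketch, continuing with the `sf` client and the hypothetical Customer_Key__c field from above:

```python
# Two illustrative records, keyed on the external-ID field.
records = [
    {"Customer_Key__c": "CUST-001", "Email": "a@example.com", "LastName": "Archer"},
    {"Customer_Key__c": "CUST-002", "Email": "b@example.com", "LastName": "Baker"},
]

results = sf.bulk.Contact.upsert(records, "Customer_Key__c")
for record, result in zip(records, results):
    if not result["success"]:
        # Log the failed input together with Salesforce's error detail,
        # mirroring what the monitoring screen captures for follow-up.
        print("input:", record, "errors:", result["errors"])
```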
By following these steps, you can seamlessly integrate your data between GCS and Salesforce, ensuring a smooth and up-to-date flow of information. Dataworkz empowers business users to manage this process efficiently, making Salesforce data integration a breeze no matter the data source.