Data Loader is an amazingly intuitive tool provided by Salesforce that lets you mass insert/update/upsert/export/delete data in your Salesforce org. The traditional way of using Data Loader is through its user interface, where the columns of a CSV file are mapped to your sObject fields. But you can also use the Data Loader Command Line Interface to automate your data load jobs.


Learn the whole process and setup of the command line interface with all DML operations by enrolling in my course Salesforce Integration: Data Loader Command Line Interface

In addition to using Data Loader interactively to import and export data, you can run it from the command line and use commands to automate the import and export of data. For example, if an external tool produces a file whose data should be updated in Salesforce every night at 1 AM, without any human interaction with the file or with Data Loader, you can call your Data Loader operations through the command line interface (CLI).

Use Case

We are going to see how to insert new accounts into Salesforce using the Data Loader command line interface, and we will create and schedule the batch to run automatically at a given time.

Prerequisites

  • Data Loader installed on the computer that runs the command-line process.
  • The Java Runtime Environment (JRE) installed on the computer that runs the command-line process.
  • You should be familiar with importing and exporting data by using the Data Loader interactively through the user interface.

Step 1

Salesforce Data Loader is usually installed under C:\Program Files (x86)\salesforce.com\Data Loader.
Make a note of the path where Data Loader is installed. The bin folder is the important part here, as it contains two batch files: encrypt.bat and process.bat. We will call these batch files from the Windows command line.

Create a folder named Data Job anywhere on your system; this is where the accounts data to be processed will be placed. Inside it, create another folder named Account Data Logs, which will hold the success and error files that Data Loader generates after the operation completes.

  1. Open a command prompt window by selecting Start | Run, enter cmd, and click OK. Make sure you run the Command Prompt as an administrator on your machine.
  2. Navigate to the Data Loader \bin directory by entering this command, replacing the file path with the path on your system: cd C:\Users\{userName}\dataloader\version\bin
    In our case the command is cd C:\Program Files (x86)\salesforce.com\Data Loader\bin. Press Enter.
  3. Create an encryption key file by entering the following command, replacing [path to key file] with the key file path: encrypt.bat -k [path to key file]

    Your encryption key file is now created. The path of your key file is displayed in the command prompt; copy and paste the path into a notepad separately, as we will need it later on.
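    For example, assuming you want the key file under your user profile (matching the path used in the sample process-conf.xml later in this post), the command might look like: encrypt.bat -k "C:\Users\{userName}\.dataloader\dataLoader.key"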

Step 2

Now it's time to encrypt your Salesforce password.

In the same command prompt window, enter the following command. Replace <password> with the password that you use to log in to Salesforce from Data Loader, and replace <key file path> with the key file path you created in the previous step.

encrypt.bat -e <password> <key file path>

Once you press Enter, you will see a long string appear in the command prompt. This is your Salesforce password in encrypted form. We will use this string in our process-conf.xml file later on, so that even if someone sees the process-conf.xml file, your real password is not exposed.
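For example, assuming a password of MyPassword123 and the key file path from Step 1 (both are placeholders here), the command might look like:

encrypt.bat -e MyPassword123 "C:\Users\{userName}\.dataloader\dataLoader.key"

Note that if your org requires a security token for API logins from outside a trusted IP range, append the token to your password before encrypting it (for example MyPassword123XXXXXXXXXX).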

Step 3

Create a dummy CSV file with accounts data for testing; this file will be used to insert the data into your org. Save the file in the Data Job folder we created earlier.
Make sure your CSV file includes all the mandatory fields for accounts.
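For example, a minimal insertAccounts.csv could look like the following. The column names and values here are only placeholders; your org may require additional fields.

Name,Phone,BillingCity
Test Account One,555-0100,Mumbai
Test Account Two,555-0101,Delhi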

Next, create a mapping file with an .sdl file extension. Each line of the mapping file pairs a data source column with its destination field.
An easy way to do this is to use the Data Loader user interface to create the mapping and save the mapping file with the name accountInsertMap.sdl.
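For reference, the mapping file is a plain-text properties file in which the left side of each line is the CSV column header and the right side is the Salesforce field API name. Assuming the sample CSV columns above, accountInsertMap.sdl might look like this:

#Mapping values
Name=Name
Phone=Phone
BillingCity=BillingCity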

Step 4


The process-conf.xml file contains the information that Data Loader needs to process the data. Each <bean> in the process-conf.xml file refers to a single process such as an insert, upsert, or export. Therefore, this file can contain multiple processes. In this step, you edit the file to insert accounts into Salesforce.

  1. Make a copy of the process-conf.xml file from the salesforce.com\Data loader\samples\conf directory. Be sure to maintain a copy of the original because it contains examples of other types of Data Loader processing such as upserts and exports.
  2. Open the file in a text editor, and replace its contents with the following XML:
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="accountInsert"
        class="com.salesforce.dataloader.process.ProcessRunner"
        singleton="false">
        <description>accountInsert job gets the account record from the CSV file 
            and inserts it into Salesforce.</description>
        <property name="name" value="accountInsert"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" 
                    value="C:\DLTest\Log\accountInsertSoapTrace.log"/>
                <entry key="sfdc.endpoint" value="https://servername.salesforce.com"/>
                <entry key="sfdc.username" value="admin@Org.org"/>
                <!--Password below has been encrypted using key file, 
                    therefore, it will not work without the key setting: 
                    process.encryptionKeyFile.
                    The password is not a valid encrypted value, 
                    please generate the real value using the encrypt.bat utility -->
                <entry key="sfdc.password" value="e8a68b73992a7a54"/>
                <entry key="process.encryptionKeyFile" 
                    value="c:\Users\{user}\.dataloader\dataLoader.key"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="process.operation" value="insert"/>
                <entry key="process.mappingFile" 
                    value="C:\DLTest\Command Line\Config\accountInsertMap.sdl"/>
                <entry key="dataAccess.name" 
                    value="C:\DLTest\In\insertAccounts.csv"/>
                <entry key="process.outputSuccess" 
                    value="c:\DLTest\Log\accountInsert_success.csv"/>
                <entry key="process.outputError" 
                    value="c:\DLTest\Log\accountInsert_error.csv"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="process.initialLastRunDate" 
                    value="2005-12-01T00:00:00.000-0800"/>
            </map>
        </property>
    </bean>
</beans>

Modify the following parameters in the process-conf.xml file. For more information about the process configuration parameters, see Data Loader Process Configuration Parameters.

  • sfdc.endpoint—Enter the URL of the Salesforce instance for your organization; for example, https://yourInstance.salesforce.com/.
  • sfdc.username—Enter the username Data Loader uses to log in.
  • sfdc.password—Enter the encrypted password value that you created in step 2.
  • process.mappingFile—Enter the path and file name of the mapping file.
  • dataAccess.name—Enter the path and file name of the data file that contains the accounts that you want to import.
  • sfdc.debugMessages—Currently set to true for troubleshooting. Set to false after your import is up and running.
  • sfdc.debugMessagesFile—Enter the path and file name of the command line log file.
  • process.outputSuccess—Enter the path and file name of the success log file.
  • process.outputError—Enter the path and file name of the error log file.

Save the file in the Data Job folder once edited.

Step 5

In the command prompt window, enter the following command:

process.bat "<file path to process-conf.xml>" <process name>

  • Replace <file path to process-conf.xml> with the path to the directory containing process-conf.xml.
  • Replace <process name> with the process specified in process-conf.xml.

In our case the commands become:

cd C:\Program Files (x86)\salesforce.com\Data Loader\bin
Press Enter.
process.bat "C:\Program Files (x86)\salesforce.com\Data Job" accountInsert
Press Enter.

After the process runs, the command prompt window displays success and error messages. You can also check the log files configured in process-conf.xml: accountInsert_success.csv and accountInsert_error.csv. After the process runs successfully, the success file contains the records that you imported, along with the ID and status of each record.

You can now check your org to see these new accounts.
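To schedule this job to run automatically (for example, every night at 1 AM as described in the use case), one common approach is to wrap the two commands in a small batch file and register it with the Windows Task Scheduler. The file names and folder paths below are only examples and should be adjusted to your own setup:

rem runAccountInsert.bat - example wrapper batch file
cd /d "C:\Program Files (x86)\salesforce.com\Data Loader\bin"
call process.bat "C:\Program Files (x86)\salesforce.com\Data Job" accountInsert

Then create a scheduled task that runs this batch file daily at 1 AM (assuming the wrapper is saved as C:\Scripts\runAccountInsert.bat):

schtasks /create /tn "DataLoaderAccountInsert" /tr "C:\Scripts\runAccountInsert.bat" /sc daily /st 01:00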

Learn the whole process and setup of the command line interface with all DML operations by enrolling in my course Salesforce Integration: Data Loader Command Line Interface

