I have to load millions of records in a data migration process, and I'm dealing with a parent object and a couple of custom child objects. After the initial load, data will keep flowing, as Salesforce has to ingest data from an external loyalty system. I understand that batches within a job may run in parallel or serially (Bulk API 2.0 is parallel only), but I don't know whether the different jobs (one per object) run in parallel or not. I need the job for the parent object to complete before the other jobs start, because otherwise I would not be able to set the lookup fields.
1 Answer
If you are writing your own script, you can certainly chain them however you wish to.
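Chaining in practice means waiting for the parent job to reach a terminal state before creating the child job; merely closing the job only queues it for processing. A minimal polling sketch (the `getState` callback is an assumption on your part — for example, a function that GETs the Bulk API 2.0 job-info endpoint `/services/data/vXX.X/jobs/ingest/{jobId}` and returns its `state` field):

```javascript
// Repeatedly calls getState() until the job reaches a terminal state.
// getState: async () => string  -- returns the job's current state,
//   e.g. 'InProgress', 'JobComplete', 'Failed', 'Aborted'.
async function pollUntilComplete(getState, intervalMs = 5000, maxAttempts = 120) {
    for (let attempt = 0; attempt < maxAttempts; attempt++) {
        const state = await getState();
        if (state === 'JobComplete') {
            return true;
        }
        if (state === 'Failed' || state === 'Aborted') {
            throw new Error(`Bulk job ended in state ${state}`);
        }
        // still queued or in progress: wait before the next check
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
    throw new Error('Timed out waiting for bulk job to complete');
}
```

You would call this between closing the parent job and creating the child job, so the parent records (and their IDs) exist before the child upload begins.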
Here is a simple Node.js script showing how to achieve this:
```javascript
const jsforce = require('jsforce');
const sfbulk = require('node-sf-bulk2');
const util = require('util');
const fs = require('fs');

(async () => {
    if (process.env.username && process.env.password) {
        const conn = new jsforce.Connection({});
        await conn.login(process.env.username, process.env.password);
        const bulkconnect = {
            accessToken: conn.accessToken,
            apiVersion: '51.0',
            instanceUrl: conn.instanceUrl
        };
        try {
            // create a new BulkAPI2 client
            const bulkrequest = new sfbulk.BulkAPI2(bulkconnect);
            // create a bulk insert job for the parent object
            const accountJob = await bulkrequest.createDataUploadJob({
                object: 'Account',
                operation: 'insert'
            });
            if (accountJob.id) {
                // read CSV data from the local file system
                const data = await util.promisify(fs.readFile)(process.cwd() + '/account.csv', 'UTF-8');
                const status = await bulkrequest.uploadJobData(accountJob.contentUrl, data);
                if (status === 201) {
                    // close the job so Salesforce starts processing it
                    await bulkrequest.closeOrAbortJob(accountJob.id, 'UploadComplete');
                    // NOTE: closing the job only queues it; poll the job status
                    // until it reports JobComplete before creating the child job,
                    // so the parent IDs exist when you set the lookups.

                    // create a bulk insert job for the child object
                    const contactJob = await bulkrequest.createDataUploadJob({
                        object: 'Contact',
                        operation: 'insert'
                    });
                    // upload contact.csv and close this job the same way as above
                }
            }
        } catch (ex) {
            console.log(ex);
        }
    } else {
        throw new Error('set environment variables with your org username and password');
    }
})();
```

You might need some sort of temporary table, or use an external ID, to make sure you relate parent and child records.
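On the external ID point: the Bulk API lets you populate a lookup on the child record by putting a `RelationshipName.ExternalIdField__c` column in the CSV header, so the child job never needs the parent record IDs, only a completed parent job. A sketch of building such a CSV (`Account.LoyaltyId__c` is a hypothetical external ID field on Account; adjust to your schema):

```javascript
// Builds a Contact CSV whose Account lookup is resolved by the parent's
// external ID instead of a Salesforce record ID. The header column
// 'Account.LoyaltyId__c' (hypothetical field) tells the Bulk API to
// match each Contact to the Account with that external ID value.
function buildContactCsv(rows) {
    const header = 'LastName,Email,Account.LoyaltyId__c';
    // quote values containing commas, quotes, or newlines per CSV rules
    const escape = (v) => (/[",\n]/.test(v) ? `"${v.replace(/"/g, '""')}"` : v);
    const lines = rows.map((r) =>
        [r.lastName, r.email, r.accountLoyaltyId].map(escape).join(','));
    return [header, ...lines].join('\n');
}

// usage sketch: pass the result to uploadJobData for the child job
const csv = buildContactCsv([
    { lastName: 'Doe', email: 'jdoe@example.com', accountLoyaltyId: 'L-001' }
]);
```

This removes the need for a temporary ID-mapping table, since Salesforce resolves the lookup itself during the child load.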