Salesforce Data Loader Quick Reference:
- Import Wizard: up to 50,000 records, browser-based, limited objects
- Data Loader: 50,000 to 5,000,000 records, desktop client, all objects, SOAP or Bulk API
- Bulk API 2.0: 5,000,000+ records, async processing, job-based monitoring
- Error log location: your output directory, as [operation]_error.csv and [operation]_success.csv
- Most common errors: malformed CSV (unescaped quotes), lookup failures (Name vs External ID), UNABLE_TO_LOCK_ROW (concurrent imports)
Quick Answer
Salesforce Data Loader produces different error types than the Import Wizard: it processes at batch level, uses the SOAP or Bulk API, and writes detailed error CSV files that the Import Wizard doesn't produce.
Why it matters: Data Loader errors require understanding which API you're using (SOAP vs Bulk API 1.0 vs Bulk API 2.0) because error format, batch behavior, and recovery options differ between them.
The fix: Read the _error.csv output file first. Identify the error type (FIELD_INTEGRITY_EXCEPTION, CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, UNABLE_TO_LOCK_ROW, MALFORMED_ID). Apply the fix to the source CSV. Re-run with a small test batch before the full file.
Root cause: Data Loader validates every field value against Salesforce's schema on write. Errors that the Import Wizard would catch before the mapping screen become row-level failures in Data Loader's error log.
Fast Fix (Before Your Next Data Loader Run)
Expected time by error type:
- FIELD_INTEGRITY_EXCEPTION (picklist): 10-15 minutes (export values, find-replace)
- Malformed CSV (quotes/commas): 15-20 minutes (text editor repair)
- Lookup field mismatch: 20-30 minutes (export related records, VLOOKUP the External ID)
- UNABLE_TO_LOCK_ROW: 5 minutes (re-run error.csv after a brief wait)
- Permission error: admin dependent (not a file fix)
If your Data Loader import just failed:
- Open [operation]_error.csv in your Data Loader output directory: every failed row is there with its error code and message
- Group errors by type: sort by the ERROR column to find the most common failure
- Fix the source CSV for structural and field-value errors; fix Salesforce configuration for permission and relationship errors
- Run a 100-row test batch before re-running the full file
- Validate locally first β run your file through Data Validator to catch format errors before Data Loader even sees the file
If Data Loader froze or timed out mid-import: Check Setup → Environments → Jobs → Bulk Data Load Jobs to see which batches completed. You can continue from the last successful batch rather than re-running from the start.
TL;DR: Salesforce Data Loader errors live in _error.csv in your output directory. The most common failures are malformed CSV (unescaped quotes), lookup field mismatches (Name vs External ID vs Internal ID), and row locking conflicts on concurrent imports. Use Data Validator to catch the structural issues before Data Loader runs; your Salesforce data never leaves your browser.
Your Data Loader job ran for 12 minutes. 47,000 records succeeded. 3,241 failed. The error file is 8MB. You open it and see this:
ID,ERROR
,FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value for restricted picklist field Lead Status: Hot Lead
,CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY;Contact;TRANSFER_REQUIRES_READ on 0031x000003ABC
,MALFORMED_ID;Account ID;Account ID: id value of incorrect type: 001XX000003GYn2YAG123
,UNABLE_TO_LOCK_ROW;Opportunity;unable to obtain exclusive access to this record or 1 records: 006XX000003ABC
Four different error types. All requiring different fixes. This guide maps each one.
Error strings verified against Salesforce Help documentation and Salesforce Developer documentation, March 2026.
What Salesforce Import Tool to Use: The Decision Branch
SALESFORCE IMPORT PATH: CHOOSE BEFORE YOU START:
Records to import → Tool → Notes
Up to 50,000 records
→ Data Import Wizard (browser-based)
→ Supports: Contacts, Leads, Accounts, Opportunities (standard objects)
→ No install required; dedup options built in
→ Does NOT support custom objects
→ Error handling: pre-import validation only; no row-level error log
50,001 to 5,000,000 records
→ Data Loader (free desktop client, requires Java)
→ Supports: ALL standard + custom objects
→ Uses SOAP API by default (200 records/batch)
→ Switch to Bulk API in Settings for better performance at scale
→ Error handling: full row-level error.csv after each operation
→ Download: Salesforce Setup → Data → Data Loader
5,000,000+ records
→ Bulk API 2.0 (via Data Loader "Use Bulk API" setting, or direct API calls)
→ Async processing in background jobs
→ Monitor via Setup → Environments → Jobs → Bulk Data Load Jobs
→ Best for migrations; not suitable for real-time or urgent imports
→ PK Chunking available for very large datasets
Rule: if you're using Data Loader, you've already determined the Import Wizard
won't work. Data Loader is the right call for custom objects and files over 50K.
Table of Contents
- What Salesforce Import Tool to Use: The Decision Branch
- Reading the Data Loader Error Log
- Fix 1: FIELD_INTEGRITY_EXCEPTION (Picklist and Field Validation)
- Fix 2: Malformed CSV (Escaped Quotes and Special Characters)
- Fix 3: Lookup Field Failures (Name vs External ID vs Internal ID)
- Fix 4: UNABLE_TO_LOCK_ROW (Concurrent Import Conflicts)
- Fix 5: Permission and Access Errors
- Data Loader Configuration for Large Files
- Additional Resources
- FAQ
Reading the Data Loader Error Log
Data Loader writes two files to your output directory after every operation:
BROKEN: actual Data Loader _error.csv content (what you open after a failed run):
"ID","ERROR","First Name","Last Name","Email","Lead Status","Account Name","Annual Revenue"
"","FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value for restricted picklist field Lead Status: Hot Lead","Alice","Chen","[email protected]","Hot Lead","Acme Corp","50000"
"","CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY;Contact;TRANSFER_REQUIRES_READ on 0031x000003ABC","Bob","Smith","[email protected]","Open","Acme Corp","75000"
"","INVALID_CROSS_REFERENCE_KEY;AccountId;No such value 'ACME Corp (Old)' for relationship 'Account'","Carol","Jones","[email protected]","Open","ACME Corp (Old)","90000"
"","UNABLE_TO_LOCK_ROW;Opportunity;unable to obtain exclusive access to this record or 1 records: 006XX000003ABC","Dave","Lee","[email protected]","Qualified","Acme Corp","120000"
Key observations:
- ID column: blank = record was never created (re-importable)
- ERROR column: error code; object; description (three semicolon-separated fields)
- All original columns follow, so this IS your re-import file after fixing the values
- Sort by ERROR column to group identical failures (all "Hot Lead" rows = same fix)
FIXED: the corrected rows to re-import:
"First Name","Last Name","Email","Lead Status","Account Name","Annual Revenue"
"Alice","Chen","[email protected]","Hot","Acme Corp","50000"   <- picklist value corrected
"Carol","Jones","[email protected]","Open","Acme Corp","90000"   <- account name corrected
[Bob and Dave rows: require Salesforce admin action β not file fixes]
Workflow for large error files:
- Open error.csv in Excel or a text editor
- Sort by the ERROR column to group identical error types
- Fix each error type as a batch (all "FIELD_INTEGRITY_EXCEPTION" for Lead Status = same fix)
- Merge fixes back into original source file
- Re-run only the failed rows (the error.csv is a valid import file for re-running)
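The grouping step can be scripted. A minimal sketch, assuming the ERROR column format shown in the sample above; `summarize_errors` is a hypothetical helper, not part of Data Loader:

```python
from collections import Counter

def summarize_errors(rows):
    """Group Data Loader error rows by error code.

    The ERROR column holds three semicolon-separated parts:
    code; object/field; description. Grouping by the code tells
    you which fix applies to the most rows.
    """
    counts = Counter()
    for row in rows:
        code = row["ERROR"].split(";")[0].strip()
        counts[code] += 1
    return counts

# Rows shaped like the _error.csv sample above; in practice you would
# load them with csv.DictReader over the real error file.
sample = [
    {"ID": "", "ERROR": "FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value: Hot Lead"},
    {"ID": "", "ERROR": "FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value: Hot Lead"},
    {"ID": "", "ERROR": "UNABLE_TO_LOCK_ROW;Opportunity;unable to obtain exclusive access"},
]
print(summarize_errors(sample).most_common())
# → [('FIELD_INTEGRITY_EXCEPTION', 2), ('UNABLE_TO_LOCK_ROW', 1)]
```

The most common code in the output is the fix to apply first, since one correction usually clears the whole group.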
Fix 1: FIELD_INTEGRITY_EXCEPTION (Picklist and Field Validation)
FIELD_INTEGRITY_EXCEPTION is the most common Data Loader error and covers picklist value mismatches, required field violations, and field-level validation rule failures.
FIELD_INTEGRITY_EXCEPTION EXAMPLES:
,FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value for restricted picklist field Lead Status: Hot Lead
→ "Hot Lead" is not a valid picklist value. Get exact values from Setup → Object Manager → Lead → Lead Status field.
,FIELD_INTEGRITY_EXCEPTION;Rating;bad value for restricted picklist field Rating: warm
→ Case mismatch. The valid value is "Warm", not "warm".
,FIELD_INTEGRITY_EXCEPTION;Phone;value not in correct format
→ Phone number contains characters Salesforce's field-level validation rejects.
,FIELD_INTEGRITY_EXCEPTION;Annual Revenue;value not in correct format
→ Currency symbol or comma in a numeric field. Strip to a plain number.
FIELD_INTEGRITY_EXCEPTION FIX WORKFLOW:
1. Note the field name (e.g., "Lead Status")
2. Go to Setup → Object Manager → [Object] → Fields & Relationships → [Field]
3. For picklist fields: view the "Values" list and copy the exact labels
4. For number/currency fields: remove all non-numeric characters (symbols, commas)
5. Run find-replace on your CSV for all occurrences of the invalid value
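Steps 4-5 can be batched in a short script. A minimal sketch, assuming the CSV is already parsed into dicts; the picklist mapping here is illustrative, so verify the valid values against your org first:

```python
def fix_field_values(rows, picklist_fixes, numeric_fields):
    """Batch-fix a parsed CSV: map invalid picklist values to valid ones
    and strip currency symbols/commas from numeric fields.
    Assumes positive numeric values (no minus signs to preserve)."""
    fixed = []
    for row in rows:
        row = dict(row)  # don't mutate the caller's data
        for field, mapping in picklist_fixes.items():
            if row.get(field) in mapping:
                row[field] = mapping[row[field]]
        for field in numeric_fields:
            if field in row:
                # keep digits and the decimal point only
                row[field] = "".join(c for c in row[field] if c.isdigit() or c == ".")
        fixed.append(row)
    return fixed

rows = [{"Lead Status": "Hot Lead", "Annual Revenue": "$50,000"}]
fixed = fix_field_values(rows, {"Lead Status": {"Hot Lead": "Hot"}}, ["Annual Revenue"])
print(fixed)  # → [{'Lead Status': 'Hot', 'Annual Revenue': '50000'}]
```

A dict-based mapping beats manual find-replace once the same bad value appears in hundreds of rows.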
For existing picklist error guides, see Salesforce bad value for restricted picklist.
Fix 2: Malformed CSV (Escaped Quotes and Special Characters)
Data Loader's CSV parser is strict; malformed rows cause the entire batch to fail or produce unpredictable results.
MALFORMED CSV EXAMPLES:
Symptom: rows appear shifted by one column in the error log
Cause: unescaped comma inside a field value
Source CSV row (broken):
Name,Description,Amount
"Acme Corp","Sells widgets, gadgets, and components",10000
Result in Data Loader: "Sells widgets" → Name field, " gadgets" → Description, ...
All columns shift because the parser sees commas inside the Description as delimiters.
Fix: ensure all field values containing commas are wrapped in double quotes.
AND: if the field value itself contains double quotes, escape them as ""
CORRECT format:
"Acme Corp","Sells widgets, gadgets, and components",10000
"Smith ""The Builder"" LLC","Construction",50000
(internal quotes escaped as "" above)
Another common malformed case:
Source contains line breaks inside a field value (address with newlines)
"123 Main St
Suite 400
New York NY","Company",10000
Fix: replace in-field newlines with a space or a different separator character.
Data Loader's parser sees the newline as a row break.
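Both repairs can be automated with Python's csv module, which already parses quoted newlines and doubled-quote escapes correctly; `clean_csv` here is a hypothetical helper that collapses in-field newlines and re-quotes every field:

```python
import csv
import io

def clean_csv(text):
    """Re-emit CSV with every field quoted (RFC 4180 style) and in-field
    newlines collapsed to spaces. csv.reader handles the quoted newlines
    for us; csv.writer re-applies correct quoting and "" escaping."""
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for row in csv.reader(io.StringIO(text)):
        writer.writerow(
            [f.replace("\r\n", " ").replace("\n", " ").replace("\r", " ") for f in row]
        )
    return out.getvalue()

# The multi-line address example from above: one logical row, three text lines
raw = '"123 Main St\nSuite 400\nNew York NY","Company",10000\r\n'
print(clean_csv(raw))  # address is now a single-line, fully quoted field
```

Because the cleaner round-trips through a real CSV parser, it fixes quoting without corrupting fields that legitimately contain commas.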
Pre-import check: Open your CSV in Notepad++ (Windows) or BBEdit (Mac). Enable "Show all characters" to make spaces, tabs, and non-printing characters visible. Any invisible character inside a field value will cause parsing issues.
Fix 3: Lookup Field Failures (Name vs External ID vs Internal ID)
Lookup fields in Salesforce reference another record. Data Loader supports three formats for lookup values β Name, External ID, and Internal Salesforce ID (18-character). Using the wrong format is one of the most common Data Loader failures.
LOOKUP FIELD FAILURES:
Scenario: importing Contacts with a related Account ID column
Method 1 - Account Name (display name):
AccountId column value: "Acme Corporation"
✓ Works ONLY if exactly one Account exists with that name
✗ Fails if: the name has a slight variation, multiple accounts have similar names,
or the Account doesn't exist
Method 2 - Salesforce Internal ID (18-char):
AccountId column value: "001XX000003GYn2YAG"
✓ Most reliable when you already have the IDs (e.g., from a prior export)
✗ Fails in Data Loader if the ID format has extra/missing characters
Method 3 - External ID (custom field):
Account.External_ID__c column value: "ACME-001"
✓ Most robust for large imports
✗ Requires creating an External ID field on the Account object
→ Column header format: ObjectName.ExternalFieldName
RECOMMENDED FOR BULK IMPORTS:
Use External IDs for relationship fields when importing 50K+ records.
Export related objects with their External IDs, VLOOKUP into your import file,
then map using the ObjectName.ExternalFieldName column header.
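The export-and-VLOOKUP step can also be scripted. A sketch assuming an Account export with Name and External_ID__c columns (External_ID__c is an example custom field name):

```python
def add_external_ids(import_rows, account_export,
                     name_col="Account Name", ext_col="Account.External_ID__c"):
    """Replace a name-based lookup column with an External ID column.

    account_export: rows exported from the Account object, containing
    Name and External_ID__c (an assumed custom External ID field).
    Returns the mapped rows plus any rows whose name had no match,
    which must be fixed by hand before the Data Loader run."""
    lookup = {a["Name"]: a["External_ID__c"] for a in account_export}
    unresolved = []
    for row in import_rows:
        ext = lookup.get(row.get(name_col, ""))
        if ext is None:
            unresolved.append(row)
        else:
            row[ext_col] = ext
    return import_rows, unresolved

accounts = [{"Name": "Acme Corp", "External_ID__c": "ACME-001"}]
contacts = [{"Account Name": "Acme Corp"}, {"Account Name": "ACME Corp (Old)"}]
mapped, unresolved = add_external_ids(contacts, accounts)
print(len(unresolved))  # → 1: "ACME Corp (Old)" has no matching Account
```

Surfacing the unresolved names before the run turns would-be INVALID_CROSS_REFERENCE_KEY failures into a pre-import fix list.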
BEST PRACTICE COLUMN HEADER FORMAT:
Instead of: AccountId
Use: Account.External_ID__c (if using external ID lookup)
Error message for a lookup failure:
"INVALID_CROSS_REFERENCE_KEY; AccountId; invalid cross reference id"
(a malformed ID instead produces MALFORMED_ID: "id value of incorrect type: [value]")
Fix 4: UNABLE_TO_LOCK_ROW (Concurrent Import Conflicts)
UNABLE_TO_LOCK_ROW errors occur when Data Loader tries to update a record that is simultaneously being modified by another process: another import job, a workflow rule, a trigger, or a user editing the record.
UNABLE_TO_LOCK_ROW:
Error message:
"UNABLE_TO_LOCK_ROW; Opportunity; unable to obtain exclusive access to this
record or 1 records: 006XX000003ABC"
Root causes:
1. Another Data Loader job running simultaneously on the same object
2. Workflow rules or Process Builder updating related records during import
3. Salesforce triggers firing on insert/update and causing record conflicts
4. User editing a record while import is writing to it
FIXES (in order of likelihood):
1. Wait and re-run: lock errors are often transient. Re-importing the error.csv
file often resolves a large portion of UNABLE_TO_LOCK_ROW failures (commonly observed in practice; results depend on concurrent activity)
2. Reduce batch size: Settings → Batch Size → reduce from 200 to 50-100
Smaller batches reduce the window for concurrent conflicts
3. Import during off-hours: schedule large Data Loader jobs when user activity
is low to minimize concurrent edits
4. Disable automation temporarily: if workflow rules or Process Builder are
triggering on import, coordinate with admin to disable during import window
5. Use Bulk API: Bulk API processes records asynchronously and handles lock
conflicts differently than SOAP API
IDENTIFYING WHICH RECORDS TO RE-RUN:
The error.csv from the original run contains all failed rows.
Re-run Data Loader using error.csv as the source file (same columns).
This re-processes only the failed rows, not the full dataset.
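Splitting the error file into rows that are safe to re-run as-is versus rows that need a fix first can be sketched as below; it assumes the ID/ERROR column layout shown earlier in this guide, and `split_retryable` is a hypothetical helper:

```python
def split_retryable(error_rows):
    """Separate transient lock failures (re-runnable unchanged) from rows
    that need a data or configuration fix first. Drops the ID and ERROR
    columns so the retry rows match the original import file's format."""
    retry, needs_fix = [], []
    for row in error_rows:
        code = row["ERROR"].split(";")[0].strip()
        if code == "UNABLE_TO_LOCK_ROW":
            retry.append({k: v for k, v in row.items() if k not in ("ID", "ERROR")})
        else:
            needs_fix.append(row)
    return retry, needs_fix

errors = [
    {"ID": "", "ERROR": "UNABLE_TO_LOCK_ROW;Opportunity;unable to obtain exclusive access",
     "Name": "Dave Lee"},
    {"ID": "", "ERROR": "FIELD_INTEGRITY_EXCEPTION;Lead Status;bad value: Hot Lead",
     "Name": "Alice Chen"},
]
retry, needs_fix = split_retryable(errors)
print(len(retry), len(needs_fix))  # → 1 1
```

The retry rows can go straight back through Data Loader after a short wait; the rest wait for the fixes described above.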
Fix 5: Permission and Access Errors
PERMISSION ERRORS IN DATA LOADER:
"CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY; Contact; TRANSFER_REQUIRES_READ on 0031x000003ABC"
→ The importing user doesn't have Read access to the records being assigned to a new owner.
→ Fix: Grant the "Transfer Record" permission in the user's Profile/Permission Set,
OR remove the Owner column and let Salesforce default to the importing user.
"INSUFFICIENT_ACCESS_OR_READONLY; [field]; insufficient access rights on object id"
→ The importing user's profile has read-only access to a field being imported.
→ Fix: Ask an admin to update Field-Level Security for that field to allow editing,
OR remove the read-only column from the import file.
"FIELD_CUSTOM_VALIDATION_EXCEPTION; [field]; [validation rule message]"
→ A validation rule on the object is rejecting the field value.
→ Fix: Correct the field value to satisfy the validation rule,
OR ask an admin to temporarily deactivate the validation rule during the import.
"REQUIRED_FIELD_MISSING; [field]"
→ A required field has no value in the import row.
→ Fix: Add the required field column to your CSV with appropriate values.
Data Loader Configuration for Large Files
For imports over 100,000 records, these Data Loader settings significantly improve performance and reliability:
DATA LOADER SETTINGS: RECOMMENDED FOR LARGE IMPORTS:
Settings → Settings dialog:
Batch Size: 200 (default) is fine for SOAP API
For Bulk API: 2,000-10,000 records per batch
Use Bulk API: Enable for imports over 50,000 records
Settings → Settings → Use Bulk API → checked
Bulk API Parallel Mode: Enable for independent records (no dependencies)
Disable if records depend on each other within the batch
Assignment Rule: Check if you want Lead assignment rules to fire on import
Uncheck if you want records to go directly to the owner column value
Time Zone: Set to match your Salesforce org's default timezone to prevent
date field off-by-one errors
Encoding: UTF-8 for most imports; Latin-1 for legacy systems with accented characters
API Version: Use the latest available β older API versions may reject newer field types
How to Confirm Your Data Loader Import Worked
Data Loader's "success" output doesn't guarantee all fields are populated correctly:
- Check the _success.csv row count: it should match your intended import minus the rows in _error.csv. If success rows plus error rows total less than your source rows, investigate before re-running.
- Check _error.csv: if this file is empty, no hard failures occurred. If it has rows, those records need re-processing.
- Open 5 records from _success.csv in Salesforce: search by the Salesforce ID in the success file. Verify the specific fields that were failing previously (picklist, owner, related record) now show correct values.
- Run a Salesforce report: for large imports, run a quick report filtered to records created/modified in the last hour. Spot-check field values across 10-20 records.
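A quick reconciliation sketch: every source row should land in exactly one of the two output files. The counts reuse the scenario from the top of this guide (47,000 succeeded + 3,241 failed = 50,241 source rows); `count_data_rows` is a hypothetical helper:

```python
import csv

def count_data_rows(path):
    """Rows in a CSV file, excluding the header. Counting via csv.reader
    (not line counting) handles quoted newlines inside field values."""
    with open(path, newline="", encoding="utf-8") as f:
        return sum(1 for _ in csv.reader(f)) - 1

def reconcile(source_rows, success_rows, error_rows):
    """Rows not accounted for by either output file; 0 means all clear."""
    return source_rows - (success_rows + error_rows)

print(reconcile(50241, 47000, 3241))  # → 0: every source row is accounted for
```

A nonzero result usually means the source file itself was misparsed (row-shift from bad quoting), not that Salesforce dropped rows.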
Safe Batch Re-Run Workflow
After a partial failure, re-process only the rows that failed:
CORRECT re-run workflow:
Step 1: Open your Data Loader output directory
Step 2: Locate [operation]_error.csv (e.g., insert_error.csv)
Step 3: Fix the field values in the ERROR rows (the original columns are included)
Step 4: Use the corrected error.csv as your new Data Loader source file
Step 5: Run the same operation (insert/update/upsert)
WHY THIS IS SAFE:
- _error.csv contains only rows that FAILED, so there is no duplicate risk
- _success.csv rows already have Salesforce IDs; don't re-process them
- The operation type matches: if you ran an Insert, re-run an Insert on the error rows
WRONG approach (creates duplicates):
- Re-uploading the full original file after partial success
- Running Insert on rows that already succeeded → duplicate records
Edge Case: Excel auto-converts long numeric IDs (account numbers, external IDs) to scientific notation: 1.23E+11. When Data Loader reads this from a CSV saved in Excel, the ID becomes 123000000000, truncated and wrong. This causes INVALID_CROSS_REFERENCE_KEY errors on every row containing the converted value. Fix: format the column as Text before saving from Excel, or save as CSV directly from a text editor.
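A quick pre-flight check for this damage, assuming your ID columns are plain strings; the regex flags any value Excel has rewritten in scientific notation:

```python
import re

SCI_NOTATION = re.compile(r"^\d+(\.\d+)?[eE]\+?\d+$")

def find_excel_damage(rows, id_columns):
    """Flag rows where Excel has rewritten a long ID in scientific
    notation (e.g. 1.23E+11). These values are lossy: restore them
    from the original source rather than converting them back."""
    damaged = []
    for i, row in enumerate(rows):
        for col in id_columns:
            if SCI_NOTATION.match(row.get(col, "")):
                damaged.append((i, col, row[col]))
    return damaged

rows = [{"External_ID__c": "1.23E+11"}, {"External_ID__c": "ACME-001"}]
print(find_excel_damage(rows, ["External_ID__c"]))
# → [(0, 'External_ID__c', '1.23E+11')]
```

Run this before every import of a file that passed through Excel; one damaged cell means the whole column is suspect.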
Why Validate Before Data Loader for Salesforce Data
Salesforce import files β whether for Contacts, Leads, or custom objects β routinely contain personally identifiable information, financial data, and proprietary business records. Many teams run these files through cloud-based validation or cleaning tools before import, creating a secondary data exposure point that may require its own DPA under GDPR Article 28.
SplitForge validates files in Web Worker threads in your browser; for raw file contents, nothing is transmitted to any server during processing. You can verify this in Chrome DevTools: Network tab → Fetch/XHR filter → upload a file → confirm no POST request containing file data appears.
For the complete cross-platform error reference, see CRM import error codes explained. For Salesforce picklist errors specifically, see Salesforce bad value for restricted picklist. For the complete CRM import failure guide, see CRM import failures complete guide.
Top 10 Data Loader Errors: Cheat Table
Bookmark this. Match your error code, get the fix path.
| Error Code | Object | Root Cause | Fix |
|---|---|---|---|
| FIELD_INTEGRITY_EXCEPTION | Any | Picklist value doesn't match exactly | Export exact values from Setup → Object Manager → Fields |
| INVALID_TYPE | Any | Wrong data type (text in number, comma in currency) | Strip non-numeric from number fields; fix date format |
| UNABLE_TO_LOCK_ROW | Any | Concurrent record edit conflict | Re-run error.csv after ~15 min; reduce batch size; import off-hours |
| CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY | Contact/Lead | Owner transfer requires Read access | Grant Transfer Record permission; or remove owner column |
| INSUFFICIENT_ACCESS_OR_READONLY | Any | Importing user lacks write access to field | Ask admin to update field-level security |
| INVALID_CROSS_REFERENCE_KEY | Any | Relationship lookup value doesn't match | Use 18-character IDs or External IDs for lookups; verify the related record exists |
| DUPLICATE_VALUE | Any | Duplicate rule triggered | Configure duplicate handling; deduplicate file first |
| REQUIRED_FIELD_MISSING | Any | Required field has no value in row | Add required field column with valid values |
| MALFORMED_ID | Any | Salesforce ID has wrong format/length | Use 18-character IDs; check for truncation |
| FIELD_CUSTOM_VALIDATION_EXCEPTION | Any | Validation rule blocking value | Fix value to satisfy rule; ask admin to review rule |
Broken CSV Row β What Malformed Looks Like
This is the most common cause of "Can't read from CSV" or batch-level failures in Data Loader:
BROKEN: unescaped quotes and embedded commas causing row shift:
CSV file content (viewed in text editor):
Name,Description,Annual Revenue,Close Date
"Acme Corp","Provides widgets, gadgets and solutions",50000,2026-03-31
"Smith "The Builder" LLC","Construction services",120000,2026-06-15
"Jones & Co",Services include: design, build, maintain,75000,2026-09-30
Problems:
Row 2: unescaped double quote in "Smith "The Builder" LLC"
→ Data Loader sees: field ends at 'Smith ', new field starts '"The Builder"'
→ All subsequent columns shift by 2 positions
→ Annual Revenue field gets "LLC" → INVALID_TYPE error
Row 3: commas inside field value not wrapped in quotes
→ "Services include: design" → parsed as Description column
→ "build" → parsed as Annual Revenue → INVALID_TYPE error
→ "maintain" → parsed as Close Date → INVALID_DATE_FORMAT error
FIXED: proper CSV quoting:
Name,Description,Annual Revenue,Close Date
"Acme Corp","Provides widgets, gadgets and solutions",50000,2026-03-31
"Smith ""The Builder"" LLC","Construction services",120000,2026-06-15
"Jones & Co","Services include: design, build, maintain",75000,2026-09-30
Rules:
1. Wrap ANY field value containing commas in double quotes
2. Escape internal double quotes as "" (two consecutive double quotes)
3. Field values with line breaks must also be wrapped in double quotes
4. Use a text editor with "Show All Characters" to spot invisible issues
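Rule-breaking rows can be caught before import with a field-count check; csv.reader applies the same RFC 4180 parsing rules, so any row whose width differs from the header has an unescaped comma or quote. A sketch, with a hypothetical `check_row_widths` helper:

```python
import csv
import io

def check_row_widths(text):
    """Return (line_number, field_count) for every row whose field count
    differs from the header's: the signature of an unescaped comma or quote."""
    rows = list(csv.reader(io.StringIO(text)))
    expected = len(rows[0])
    return [(n, len(r)) for n, r in enumerate(rows[1:], start=2) if len(r) != expected]

raw = (
    "Name,Description,Annual Revenue,Close Date\n"
    '"Acme Corp","Provides widgets, gadgets and solutions",50000,2026-03-31\n'
    "Jones & Co,Services include: design, build, maintain,75000,2026-09-30\n"
)
print(check_row_widths(raw))
# → [(3, 6)]: the unquoted commas split row 3 into 6 fields instead of 4
```

A properly quoted row parses to the header's width even when its values contain commas, so this check produces no false positives on valid files.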
Prevent This Next Time
Data Loader failures are almost always the same three errors repeated until the workflow changes:
- Always open _error.csv, not just _success.csv: the success file shows what worked; the error file shows what needs fixing. Build a habit of opening both after every run.
- Use External IDs for relationship fields: if you're importing anything with relationships (parent account, related contact), create External ID fields on those objects and use them for lookup. Name-based lookups fail on any variation; External-ID-based lookups fail only when the record doesn't exist.
- Test with 200 rows before the full file: Data Loader processes in batches of 200 by default. Run one batch manually, check both output files, and confirm the results before processing the full dataset.
Additional Resources
Tested: Data Loader error codes and behavior verified against Salesforce Help documentation and Salesforce Developer documentation. Bulk API 2.0 behavior verified against Salesforce Bulk API 2.0 Developer Guide. March 2026.
Official Salesforce Documentation:
- Salesforce Help: Data Loader (official Data Loader guide)
- Salesforce Help: Choose the right tool for importing data (Import Wizard vs Data Loader decision guide)
- Salesforce Bulk API 2.0 Developer Guide (complete Bulk API 2.0 reference)
CSV Standards:
- RFC 4180, Common Format for CSV Files (why proper quoting and escaping matters for Data Loader)
Privacy:
- GDPR Article 28, Processor obligations (applies when validation tools receive your Salesforce import files containing personal data)
FAQ
When should I use Data Loader instead of the Import Wizard?
Use Data Loader when: you need to import more than 50,000 records; you're importing custom objects (Import Wizard doesn't support them); you need to import data into objects not listed in the Import Wizard; or you need an automated import as part of a scripted process. Data Loader supports all standard and custom objects via the SOAP API and Bulk API.
What is the difference between SOAP API and Bulk API in Data Loader?
SOAP API processes records synchronously in small batches (default 200 records per batch). It provides immediate feedback per row and is suitable for imports under 100,000 records. Bulk API processes records asynchronously in larger batches: the job runs in the background while you continue working, and you check the results after completion. Bulk API is required for the best performance at 500,000+ records. Enable Bulk API in Data Loader's Settings dialog.
How do I re-run only the failed rows after a Data Loader import?
The _error.csv output file contains every failed row in the same column format as your original source file. You can use this file directly as the input for a new Data Loader run β it will attempt to import only the previously failed rows. This avoids re-processing the 47,000 rows that already succeeded.
Why does Data Loader show "Batch Failed" instead of row-level errors?
"Batch Failed" typically occurs when the entire batch (group of records) fails rather than individual rows. This happens with: malformed CSV that prevents parsing, API connection timeouts, or Salesforce system errors during processing. Reduce your batch size in Settings to isolate failures to smaller groups. If the batch size is already small and the error persists, check your CSV for malformed rows (unescaped quotes, line breaks inside field values).
Can I run Data Loader from the command line for automation?
Yes. Data Loader includes a command-line interface (CLI) mode for automation via scripts. The CLI uses the same configuration files as the GUI. This is the standard approach for scheduled nightly imports, ETL pipelines, and CI/CD data operations. See the Salesforce Data Loader CLI documentation for setup and required configuration files.
Catch Bulk Import Errors Before They Reach Salesforce
- Validates picklist values, date formats, numeric fields, and required fields
- Detects malformed rows before Data Loader batch processing begins
- Processes 100K+ row Salesforce import files with no upload and no size limit
- Files process locally in your browser; Salesforce data is never transmitted to any server