
CRM Import File Size Limits: What Every Platform Allows (2026)

March 21, 2026
12 min read
By SplitForge Team

Quick Answer

Every major CRM imposes import limits — either on file size, row count per batch, or both. The most important limit to know: Salesforce's Data Import Wizard caps at 50,000 records per import. HubSpot has no hard row limit but enforces a 10MB file size ceiling. Zoho CRM Standard allows 10,000 records per import. Pipedrive caps at 50,000 rows.

Why it matters: Exceeding these limits causes your import to fail silently or with a cryptic error. Knowing the limit before you start lets you split your file into compliant batches rather than troubleshoot after failure.

The fix: Split your CSV into batches that stay below each platform's row limit. Use a local CSV splitter to avoid uploading oversized files to cloud splitting services.

What differs by platform: Some CRMs limit by row count, some by file size, some by both. A few enforce different limits based on your subscription tier.


How to Know You've Hit the Limit

Different CRMs signal limit breaches differently. Match your symptom to the enforcement type before troubleshooting.

Symptom → What it means → What to do

"Your import file contains too many records"
  → Hard row limit (Salesforce Wizard, Pipedrive)
  → Split the file and reimport. No records were created.

"File size exceeds the limit"
  → File size cap (HubSpot 10MB)
  → Compress the file or split into smaller batches.

Import shows "Completed" but record count is lower than expected
  → Timeout-based truncation
  → Check CRM import logs. Re-import missing rows from that batch.

Import spinner runs, then fails with no error
  → API timeout or server overload
  → Reduce batch size by 50% and retry.

Import succeeds but some records are missing entirely
  → Silent truncation (rare — check your CRM's behavior)
  → Verify: export from CRM immediately after import and compare row counts.

Rule of thumb: If you can't explain exactly what happened to every row in your file, something went wrong. Always verify row counts in the CRM immediately after each batch import.


Fast Fix (60 Seconds)

If your import was just rejected for size, try this first:

  1. Check your row count — Open the CSV and count the data rows (excluding the header). Compare against the platform limits table below.
  2. Split at 80% of the limit — If the limit is 50,000 rows, split into batches of 40,000. This leaves headroom for rows that fail validation and keeps each batch safely under the ceiling.
  3. Split locally — Use CSV Splitter to split by row count in your browser. No upload required.
  4. Import batches in sequence — Most CRMs don't deduplicate across batches automatically; import with deduplication enabled if available.
  5. Keep a log — Note which batch numbers imported successfully so you can identify and reimport failed batches without duplicating successful ones.
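For teams that prefer to script the split, the batching steps above can be sketched in a few lines of Python — a minimal sketch, assuming a UTF-8 CSV with a single header row (the in-browser CSV Splitter does the same job without scripting):

```python
import csv
from pathlib import Path

def split_csv(src, batch_rows):
    """Split src into numbered batch files of at most batch_rows data
    rows each, repeating the header row in every output file."""
    src = Path(src)
    outputs = []
    with src.open(newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        handle, writer, written = None, None, 0
        for row in reader:
            # Rotate to a new numbered batch file when the current one is full.
            if handle is None or written == batch_rows:
                if handle:
                    handle.close()
                path = src.with_name(f"{src.stem}_batch_{len(outputs) + 1:03d}.csv")
                handle = path.open("w", newline="", encoding="utf-8")
                writer = csv.writer(handle)
                writer.writerow(header)  # header included in every batch
                outputs.append(path)
                written = 0
            writer.writerow(row)
            written += 1
        if handle:
            handle.close()
    return outputs
```

Splitting `contacts.csv` at 40,000 rows yields `contacts_batch_001.csv`, `contacts_batch_002.csv`, and so on — matching the naming convention used later in this guide.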

TL;DR: Salesforce Data Import Wizard caps at 50,000 records; use Data Loader for larger volumes. HubSpot's limit is primarily file size (10MB ceiling). Zoho Standard allows 10,000 per import. Pipedrive is 50,000. Split your file locally with CSV Splitter before any large import — it runs in your browser and handles files with millions of rows.


You're loading 200,000 contact records into your CRM after a system migration. You upload the file and get an error: "Import failed" or "File too large" with no indication of what the actual limit is. You're not sure whether the problem is file size in megabytes, row count, or both.

Every CRM platform has limits — and almost none of them display those limits prominently inside the import interface. You typically discover them by hitting them.

CRM import files often contain customer contact data, deal history, and account information — all personal data under the GDPR, subject to the data minimisation principle in Article 5(1)(c). When you use a cloud-based CSV splitter to break a large file into batches, you're uploading your full dataset to a remote server before it even reaches the CRM. SplitForge's CSV Splitter processes entirely in your browser using Web Worker threads. Nothing is transmitted. Verify this with the Chrome DevTools Network tab — zero outbound requests during splitting.

Platform limits in this guide were verified against official documentation, March 2026. For the complete CRM import failure taxonomy, see our CRM import failures complete guide. If you're hitting limits on large migrations specifically, see Why Your CRM Rejects CSV Imports for structural causes beyond size. Teams migrating entire CRM databases will find the full workflow in CRM Data Migration Guide.


CRM Import Limits: Platform Reference Table

This is the table you'll bookmark. Verified against official documentation — see the source block below the table.

| Platform | Tool | Max Rows Per Import | File Size Limit | Notes |
|---|---|---|---|---|
| Salesforce | Data Import Wizard | 50,000 | Not stated (practical: <100MB) | Contacts, Leads, Accounts only; no custom objects |
| Salesforce | Data Loader | No UI limit | No UI limit | 5M+ records via Bulk API 2.0; requires install |
| HubSpot | Import Tool | No published hard cap | 10MB per file | Large files time out in practice; batch at 50K rows |
| Pipedrive | Import Tool | 50,000 | Not stated | Applies to Contacts, Deals, Organizations |
| Zoho CRM | Import Tool | 10,000 (Standard) | Not stated | 20,000 on Professional; 30,000 on Enterprise; 50,000 on Ultimate; check your edition |
| Dynamics 365 | Import Wizard | 20,000 per file | 8MB | Power Platform limits apply for Dataverse imports |
| ActiveCampaign | Import Tool | 50,000 | Not stated | Contact imports; larger lists should use API |
| Copper CRM | Import Tool | No published hard limit | Not stated | Practical limit reported around 5,000–10,000 rows |

How to use this table: Find your platform, check the Max Rows limit, then split your file so each batch stays 10–20% below that ceiling. The buffer accounts for rows that fail validation — you want the valid rows to stay under the limit even after failures are counted.

Universal Import Tool Decision Shortcut

How many records do you need to import?

< 50,000  → Use your CRM's native browser-based import tool
             (Salesforce Wizard, HubSpot Import, Pipedrive, Zoho, etc.)

50,000–5,000,000 → Use a desktop client or bulk import tool
             (Salesforce Data Loader, API-based tools)
             Batch into [platform limit × 0.8] row files

5,000,000+ → Use the platform's API directly (Bulk API 2.0 for Salesforce,
             HubSpot API, Zoho API, etc.)
             CSV import tools are not designed for this scale.

If you're on Zoho Standard with > 10,000 records:
  → Batch into ≤ 8,000 rows (80% of 10,000) per import
  → Or upgrade to Professional (20,000 per import) for fewer batches

PLATFORM SPECIFICATION SOURCE
Platform: Multiple CRMs (Salesforce, HubSpot, Pipedrive, Zoho, Dynamics 365)
Sources:
  Salesforce: help.salesforce.com — "Import My Accounts and Contacts" + Data Loader guide
  HubSpot: knowledge.hubspot.com — "Import files into HubSpot"
  Pipedrive: support.pipedrive.com — "Import data to Pipedrive"
  Zoho: help.zoho.com/crm — "Import Records"
  Dynamics 365: learn.microsoft.com — "Import data using the Import Wizard"
Verified: March 2026
Next re-verify: June 2026

Platform limits change with product updates and subscription tier changes.
Always verify against official documentation before a large import operation.
The table reflects standard/professional tier limits where tiered limits apply.

This guide is for: CRM admins and operations teams importing large datasets that exceed standard import limits.


Why Import Limits Exist and How They're Enforced

CRM import limits exist to protect platform stability. A 500,000-row import running synchronously would time out web requests, exhaust server memory, and block other users on shared infrastructure. Most CRMs process imports asynchronously in the background — but even async jobs have queue depth limits and processing time ceilings.

Enforcement varies by platform:

  • Hard limit with error: Salesforce's Data Import Wizard rejects files over 50,000 rows immediately with an explicit error before processing begins.
  • File size rejection: HubSpot rejects files over 10MB at upload with a size error.
  • Timeout-based failure: Some platforms start processing and then time out mid-import, leaving a partial result with no clear error.
  • Silent truncation (rare): A few platforms import up to the limit and silently drop the rest.

❌ BROKEN — what happens when you exceed each enforcement type:

Hard limit (Salesforce Data Import Wizard):
File: 75,000 rows
Result: "Error: Your import file contains too many records. The maximum is 50,000."
Recovery: Split file, reimport. No partial records created.

File size rejection (HubSpot):
File: 15MB CSV
Result: "File size exceeds the 10MB limit. Please reduce your file size."
Recovery: Compress or split file.

Timeout failure (platform-dependent):
File: 200,000 rows, async import
Result: Import shows "Completed" but only 80,000 records appear.
Recovery: Hard to detect without a pre-import row count. Check import logs.

FIXED — batch approach that avoids all three:
Split into batches of [limit × 0.8] rows.
Import one batch, verify count, then import next.

Platform-by-Platform Deep Dive

Salesforce

Salesforce has three distinct import tools with different limits:

SALESFORCE IMPORT TOOL SELECTION:

Records to import → Choose tool:

≤50,000 records → Data Import Wizard
  - Browser-based, no install required
  - Supports: Contacts, Leads, Accounts (standard objects)
  - Built-in deduplication
  - ❌ Does not support custom objects

50,001–5,000,000 → Data Loader (desktop client)
  - Requires Java + installation
  - Supports: all standard + custom objects
  - Error logs: success.csv + error.csv per operation
  - SOAP API (default) or Bulk API mode

5,000,001+ → Bulk API 2.0 (via Data Loader or direct API)
  - Async processing in large batches
  - Monitoring via Setup → Bulk Data Load Jobs
  - Best for migrations

For most operations teams, Data Import Wizard handles up to 50,000 records. Anything larger needs Data Loader. The most common mistake is discovering this limit at 60,000 rows instead of planning for it.

HubSpot

HubSpot's import tool doesn't publish a hard row count limit, but enforces a 10MB file size ceiling. In practice, large imports (above 50,000–100,000 contacts depending on field count) frequently time out even when under 10MB.

Recommended maximum per batch: 50,000 rows. For files over 100,000 contacts, batch into 25,000–50,000 row segments and import sequentially. HubSpot deduplicates on email by default — batch imports are safe from duplicate creation if you wait for each batch to complete before starting the next.

Zoho CRM

Zoho enforces row limits by subscription edition:

| Zoho Edition | Max Records Per Import |
|---|---|
| Standard | 10,000 |
| Professional | 20,000 |
| Enterprise | 30,000 |
| Ultimate | 50,000 |

For large migrations on Standard plans, a 100,000-record import requires 13 batches of 8,000 rows. For Professional, that's 7 batches of 16,000 (the last batch partial). Build this into your timeline — each batch needs to complete before the next starts to avoid duplicate detection interference.

Pipedrive

Pipedrive caps imports at 50,000 rows per operation for People, Organizations, and Deals. The limit applies per object type — a 50,000-row People import plus a 50,000-row Deals import are two separate operations, both within limits.

Dynamics 365

Dynamics 365's Import Wizard accepts files up to 8MB and 20,000 rows per import file. The 8MB limit is often the binding constraint — a 20,000-row file with many populated fields can exceed 8MB.

For larger migrations, Dynamics 365 supports import via the Dataverse connector and Power Platform flows, which bypass the UI-level constraints.
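When the binding constraint is bytes rather than rows — HubSpot's 10MB ceiling, Dynamics 365's 8MB — check each batch file's size on disk before uploading. A minimal sketch; pass the ceiling for your platform from the table above:

```python
import os

def oversized_batches(paths, limit_bytes):
    """Return the batch files whose on-disk size exceeds limit_bytes.
    Any file returned should be re-split at a smaller row count."""
    return [p for p in paths if os.path.getsize(p) > limit_bytes]

# Example ceilings from the platform table:
#   HubSpot:      10 * 1024 * 1024  (10MB)
#   Dynamics 365:  8 * 1024 * 1024  (8MB)
```

Run this over every batch before starting the import sequence — catching an oversized batch up front is much cheaper than discovering it at batch 4 of 7.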


Batch Splitting Strategy

The 80% Rule

Split at 80% of the platform's row limit, not 100%. The buffer serves two purposes: it accounts for rows that fail validation (which still count toward the import total in some platforms), and it creates headroom for header rows, blank rows, and encoding overhead.

Platform limit → Safe batch size (80% rule):

Salesforce Wizard  50,000 → split at 40,000 rows per batch
HubSpot            ~50,000 → split at 40,000 rows per batch (watch 10MB ceiling)
Pipedrive          50,000 → split at 40,000 rows per batch
Zoho Standard      10,000 → split at 8,000 rows per batch
Zoho Professional  20,000 → split at 16,000 rows per batch
Zoho Enterprise    30,000 → split at 24,000 rows per batch
Dynamics 365       20,000 → split at 16,000 rows per batch (watch file size too)
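The 80% rule above is simple arithmetic, but a small helper makes the plan explicit. A sketch — `batch_plan` is a hypothetical helper, not any platform's API:

```python
import math

def batch_plan(total_rows, platform_limit, safety=0.8):
    """Apply the 80% rule: return (safe batch size, number of batches)
    needed to import total_rows under platform_limit."""
    batch_size = int(platform_limit * safety)
    return batch_size, math.ceil(total_rows / batch_size)
```

For example, 100,000 rows against Zoho Standard's 10,000-row limit yields 8,000-row batches and 13 imports; 200,000 rows against a 50,000-row limit yields 40,000-row batches and 5 imports.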

Splitting locally

CSV Splitter splits by row count in your browser. Set batch size to 40,000 and it creates numbered batch files ready to import in sequence. No cloud upload. The file never leaves your browser.

Batch Naming and Import Log

Track every batch. This becomes essential when an import fails mid-sequence.

Batch file naming convention:
  contacts_batch_001.csv    (rows 1–40,000)
  contacts_batch_002.csv    (rows 40,001–80,000)
  contacts_batch_003.csv    (rows 80,001–120,000)

Import log (copy this, fill it in as you go):

| Batch File              | Rows  | Imported? | CRM Count After | Notes              |
|-------------------------|-------|-----------|-----------------|---------------------|
| contacts_batch_001.csv  | 40,000 | ✅ Yes   | 40,000          |                    |
| contacts_batch_002.csv  | 40,000 | ✅ Yes   | 80,000          |                    |
| contacts_batch_003.csv  | 38,451 | ❌ Failed | 80,000          | Timeout — retry    |
| contacts_batch_003.csv  | 38,451 | ✅ Yes   | 118,451         | Retry succeeded     |

Rule: verify CRM count after EACH batch before importing the next.
If count doesn't match, stop and investigate before creating duplicates.
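The verification rule in the log above can be scripted: count the data rows in the batch you just imported and compare against the change in the CRM's record count. A sketch, assuming you read the before/after counts from your CRM's UI or API yourself:

```python
import csv

def data_row_count(path):
    """Count data rows in a CSV batch file, excluding the header."""
    with open(path, newline="", encoding="utf-8") as f:
        return sum(1 for _ in csv.reader(f)) - 1

def batch_verified(path, crm_count_before, crm_count_after):
    """True when the CRM gained exactly one record per data row.
    A shortfall means dropped, failed, or deduplicated rows."""
    return crm_count_after - crm_count_before == data_row_count(path)
```

If `batch_verified` returns False, stop the sequence and check the CRM's import logs before touching the next batch — note that legitimate deduplication will also show up as a shortfall.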

Maintaining import order

When importing in batches, sequence matters for relational data. Import parent objects before child objects: Companies before Contacts, Contacts before Deals. Within a single object type, the order between batches doesn't matter.


Common Scenarios

Large CRM migration (100,000+ records)

For migrations this size, use a staged approach: test with a 1,000-row sample first, verify field mapping and dedup behavior, then run full batches. Build in a verification step between batches — check that imported record counts match expected counts before proceeding to the next batch.

Import failing mid-batch with no clear error

This usually indicates a timeout. Reduce batch size by 50% and retry. If a 20,000-row batch is timing out, try 10,000 rows. The timeout threshold varies by CRM, time of day, and server load.

Duplicate records after multi-batch import

This happens when deduplication is disabled or when batches overlap. Check your CRM's duplicate rule configuration before starting — most CRMs deduplicate on email or a combination of name + company. If duplicates appear, run a deduplication pass after all batches complete rather than trying to prevent them during import.
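If you'd rather remove duplicates from the source file before splitting (instead of a post-import cleanup), a pass keyed on email is straightforward. A minimal sketch, assuming an `email` column — adjust `email_field` to match your header:

```python
import csv

def dedupe_by_email(src, dst, email_field="email"):
    """Keep the first occurrence of each email (case-insensitive),
    write surviving rows to dst, and return the number dropped."""
    seen, dropped = set(), 0
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            key = (row.get(email_field) or "").strip().lower()
            if key in seen:
                dropped += 1
                continue
            seen.add(key)
            writer.writerow(row)
    return dropped
```

Note this also collapses rows with blank emails into one; filter those out separately if blanks are expected in your data.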


Additional Resources

Tested: Platform limits verified against official documentation for each platform, March 2026. Batch splitting approach validated using SplitForge CSV Splitter against files up to 500,000 rows.


FAQ

What happens if my file exceeds the import limit?

It depends on the platform's enforcement type. Salesforce's Data Import Wizard rejects the file before processing starts — no records are created. HubSpot rejects at the file size threshold. Some platforms start processing and then stop mid-import, leaving a partial result. Always verify your row count before uploading to identify which enforcement type you'll hit.

Does the 50,000-record limit apply to all Salesforce imports?

No. The 50,000-row limit applies only to the browser-based Data Import Wizard. Salesforce Data Loader (desktop client) handles millions of records using the SOAP or Bulk API. For imports above 50,000 records, Data Loader is the standard tool. For 5M+ records, Bulk API 2.0 (available through Data Loader in bulk mode) is recommended.

How do I import more than 10,000 records on Zoho Standard?

You can run multiple sequential imports, each up to 10,000 records. For a 100,000-record import on Zoho Standard, that's 10 batches. Consider upgrading to Professional (20,000 per import) or Enterprise (30,000) if your import volume regularly exceeds 10,000 records — the time savings typically justify the cost.

How do I split a large CSV into import batches?

Use CSV Splitter — set the split size to 40,000 rows and it creates numbered batch files (batch_001.csv through batch_005.csv for 200,000 rows) with the header row included in each file. Processing runs in your browser; the file is never uploaded anywhere.

Should I enable deduplication when importing in batches?

Yes, if your CRM's import tool offers it. Batch imports can create duplicates if the same record appears in multiple batches (common when splitting by a criterion other than a unique ID). Enable deduplication on email or external ID during each batch import. If duplicates appear afterward, run a deduplication pass after all batches complete.

Does Mailchimp have an import file size limit?

Mailchimp enforces a 10MB file size limit for CSV imports. For large contact lists, this can be a binding constraint before row count. A 100,000-contact file with 15 populated fields can easily exceed 10MB. Split by row count (40,000–50,000 rows per batch) and verify each batch is under 10MB before uploading.


Split Large Import Files Before They Hit the Limit

Split any CSV into batches by row count — 40K, 20K, 5K — whatever your platform requires
Header row automatically included in every batch file
Files split entirely in your browser — your CRM data never transmitted to any server
Handles files with millions of rows without crashing or uploading

