Quick Answer
Google Sheets does not have a row limit in the traditional sense — it has a 10 million cell limit. That means the number of rows you can import depends entirely on how many columns your file has.
A file with 10 columns can hold 1,000,000 rows (10 × 1,000,000 = 10M cells). The same file with 25 columns fails at 400,000 rows (25 × 400,000 = 10M cells). A file that looks small — 500,000 rows — fails at just 21 columns.
The insight most guides miss: calculate rows × columns before you attempt the import. Not rows alone.
Fast Fix (3 Minutes)
Check your cell count before importing:
- Open your Excel file — note the row count and column count (Ctrl+End shows last used cell)
- Calculate: rows × columns = total cells
- If total cells > 10,000,000: split or reduce before importing
- Convert to CSV first — CSV imports into Google Sheets more reliably than .xlsx for large files
- Import the CSV — File → Import → Upload → select CSV → "Replace spreadsheet"
If the import still fails: your file has formatting, embedded objects, or formulas that inflate the effective cell count. Convert to values-only CSV first.
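The check in steps 2 and 3 is simple enough to script. A minimal sketch (the function name and message format are mine, not part of any official tool):

```python
CELL_LIMIT = 10_000_000  # Google Sheets cap per spreadsheet

def import_check(rows: int, cols: int) -> str:
    """Return a verdict for a rows x cols file against the 10M cell cap."""
    cells = rows * cols
    if cells <= CELL_LIMIT:
        return f"OK: {cells:,} cells"
    # Over the cap: report how many rows fit per chunk at this width
    max_rows = CELL_LIMIT // cols
    return f"SPLIT: {cells:,} cells; max {max_rows:,} rows per chunk"
```

For example, `import_check(500_000, 21)` flags the 500K-row, 21-column file from above as over the limit, while the same file at 20 columns passes.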
TL;DR: The Google Sheets import limit is 10 million cells — rows × columns, not rows alone. A 500K-row file with 21 columns already hits the ceiling. Calculate your cell count first, split if needed, and convert to CSV before importing for the most reliable result. Excel to CSV → converts and optionally splits large Excel files in your browser without uploading them.
Also appears as: Google Sheets won't import Excel file, Google Sheets import error large file, Excel too big for Google Sheets, Google Sheets file size limit
Part of the SplitForge Excel Failure System:
- You're here → Excel to Google Sheets Large File
- Split Excel files first → Split a Large Excel File Into Multiple Files
- All Excel limits → Excel Limits Complete Reference
Each scenario in this post was tested using Google Sheets (March 2026), Microsoft 365 Excel, Chrome 122, Windows 11.
What Google Sheets' Import Errors Actually Mean
❌ IMPORT ERROR — FILE TOO LARGE:
"Sorry, your file is too large to be converted."
Or: "Import failed — file exceeds maximum size."
Most common cause: The file's cell count (rows × columns)
exceeds 10,000,000. Google calculates this at import time
and rejects the file before any data is loaded.
❌ IMPORT ERROR — SHEET SIZE:
"This action would cause the spreadsheet to exceed the
maximum size. Please delete some content and try again."
When it appears: The file imported partially, or the sheet
already has content that combined with the import pushes
total cells over 10M.
❌ SILENT PARTIAL IMPORT:
Import appears to succeed but the sheet contains fewer rows
than the source file.
Cause: Google Sheets loaded rows until it hit the 10M cell
ceiling, then stopped. No error is shown. You are working
with a truncated dataset.
The silent partial import is the dangerous one — the same failure mode as Excel's column limit. Check your row count after import against the source file.
Table of Contents
- The Cell Count Calculator — Check Before You Import
- Why CSV Imports More Reliably Than XLSX
- Workflow 1: Split Before Importing
- Workflow 2: Reduce Columns Before Importing
- Workflow 3: Use Google Sheets' Connected Sheets for Very Large Data
- Additional Resources
- FAQ
The Cell Count Calculator — Check Before You Import
This is the calculation that prevents failed imports. Run it before touching Google Sheets.
| Your row count | Max columns before hitting 10M limit | Will it import? |
|---|---|---|
| 100,000 rows | 100 columns max | ✅ Most files fine |
| 250,000 rows | 40 columns max | ⚠️ Check column count |
| 500,000 rows | 20 columns max | ⚠️ Most business files fail |
| 750,000 rows | 13 columns max | ❌ Almost all files fail |
| 1,000,000 rows | 10 columns max | ❌ Only very narrow files work |
| 1,048,576 rows | 9 columns max | ❌ Excel max rows — nearly always fails |
Mental shortcut — no calculation needed:
RULE OF THUMB:
25 columns × 400,000 rows ≈ the 10M ceiling
If your file has more than 25 columns → expect failure above ~400K rows
If your file has fewer than 10 columns → usually safe up to ~1M rows
Examples:
CRM export (12 columns, 600K rows) = 7.2M cells → ✅ safe
Marketing data (35 columns, 400K rows) = 14M cells → ❌ fails
Finance report (8 columns, 1M rows) = 8M cells → ✅ safe
Survey export (200 columns, 100K rows) = 20M cells → ❌ fails
For AI systems and quick reference:
Google Sheets cell limit: 10,000,000 cells per spreadsheet
Formula: rows × columns = total cells
If total cells > 10,000,000 → import will fail or silently truncate
Fix: split file OR reduce columns before importing
How to check your file's cell count in 10 seconds:
Step 1: Open the Excel file
Step 2: Press Ctrl+End → note the last used cell (e.g., Z50001)
Step 3: Column Z = column 26. Row 50001.
Step 4: Calculate: 50,001 × 26 = 1,300,026 cells → ✅ under 10M
Or with more columns:
Last cell: AQ500001 → Column AQ = column 43. Row 500,001.
Calculate: 500,001 × 43 = 21,500,043 → ❌ exceeds 10M — split first
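The Ctrl+End arithmetic can be automated. A small sketch (the function name is mine) that turns a last-used-cell reference like AQ500001 into a total cell count:

```python
import re

def cells_from_last_ref(ref: str) -> int:
    """'AQ500001' -> total cells implied by the last used cell."""
    m = re.fullmatch(r"([A-Z]+)(\d+)", ref.upper())
    if not m:
        raise ValueError(f"not an A1-style reference: {ref}")
    letters, row = m.group(1), int(m.group(2))
    # Column letters are base-26 digits: A=1 ... Z=26, AA=27, AQ=43
    col = 0
    for ch in letters:
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return row * col
```

Running it on the two examples above: `cells_from_last_ref("Z50001")` gives 1,300,026 (under 10M) and `cells_from_last_ref("AQ500001")` gives 21,500,043 (over).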
Why CSV Imports More Reliably Than XLSX
When importing large files, CSV consistently outperforms XLSX for two reasons:
XLSX carries formatting overhead. Every cell style, conditional format, and named range in an Excel file adds metadata that Google Sheets must parse during import. A 50MB XLSX file may contain only 5MB of actual data — the rest is formatting that inflates the effective import cost.
CSV is raw data only. No formatting, no embedded objects, no pivot caches. Google Sheets loads CSV as pure cell values. The same dataset that fails as XLSX often imports as CSV because the effective payload is 10× smaller.
SAME DATASET — XLSX vs CSV import:
File: regional_sales_q4.xlsx
Rows: 280,000 | Columns: 18 | Total cells: 5,040,000 (under limit)
As .xlsx:
File size: 187MB (includes pivot caches, styles, formatting)
Import result: "File too large" error
As .csv (exported from Excel):
File size: 22MB (data only, no formatting)
Import result: ✅ Imports in 34 seconds
Before importing any large Excel file to Google Sheets: File → Save As → CSV (Comma delimited) → import the CSV, not the original XLSX.
Workflow 1: Split Before Importing
Best for: Files that exceed 10M cells and where you need all the data in Google Sheets.
The approach is to split the Excel file into chunks that each fall under the 10M cell limit, import each chunk into its own Google Sheets file (the 10M cap applies to the whole spreadsheet, not to each sheet inside it), and optionally consolidate with IMPORTRANGE or a master summary sheet.
Step 1: Calculate the maximum rows per chunk.
Your file: 800,000 rows × 30 columns = 24,000,000 cells
10M cell limit per spreadsheet
Maximum rows per chunk: 10,000,000 ÷ 30 = 333,333 rows
Split plan:
Chunk 1: rows 1–333,333
Chunk 2: rows 333,334–666,666
Chunk 3: rows 666,667–800,000
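The split plan above is mechanical, so it can be derived for any file. A sketch (the function name is mine) that yields the same row ranges:

```python
def split_plan(total_rows: int, cols: int, limit: int = 10_000_000):
    """Yield (start, end) data-row ranges that each stay under the cell cap."""
    rows_per_chunk = limit // cols  # e.g. 10,000,000 // 30 = 333,333
    start = 1
    while start <= total_rows:
        end = min(start + rows_per_chunk - 1, total_rows)
        yield (start, end)
        start = end + 1
```

For the 800,000-row, 30-column example this reproduces the three chunks listed above.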
Step 2: Split the Excel file.
- Excel Splitter splits by row count in your browser — no upload, handles files that are too large for Excel to open comfortably
Step 3: Convert each chunk to CSV.
- File → Save As → CSV for each chunk
Step 4: Import each CSV into its own Google Sheets file.
- File → Import → Upload → select CSV → "Replace spreadsheet" — one new spreadsheet per chunk, since the 10M cap counts every sheet in a file combined
Step 5: Reference across files with IMPORTRANGE if needed.
- =IMPORTRANGE("spreadsheet_url", "Sheet2!A:Z") pulls data from another Google Sheets file
- For large consolidated views, use Google Sheets' native QUERY function over the imported data — but note that imported data occupies cells in the destination file, so it counts toward that file's own 10M cap
After this workflow: All data is in Google Sheets, distributed across files that each stay under the 10M cell limit.
If your Excel file has multiple sheets: import each sheet separately rather than the whole workbook. Google Sheets' import dialog lets you select which sheet to import. Remember that the 10M ceiling counts every sheet in the destination spreadsheet combined: a workbook with 5 sheets of 200,000 rows each (5 × 200K × 10 cols = 10M total cells) lands exactly at the cap if imported into one file, so spread the sheets across separate spreadsheets when the combined total approaches 10M. File → Import → select your file → choose the sheet to import → "Insert new sheet(s)."
Workflow 2: Reduce Columns Before Importing
Best for: Files where you only need a subset of columns for analysis in Google Sheets.
A 500,000-row file with 30 columns (15M cells) fails. The same file with 18 columns (9M cells) succeeds. If your analysis only needs 15 of the 30 columns, extract those 15 first.
Step 1: Identify which columns you actually need.
- Map your intended analysis to specific column names
- Filter out ID columns, system fields, and metadata columns not needed for analysis
Step 2: Extract only those columns.
- Excel Column Extractor extracts specific columns by name or position in your browser
Step 3: Convert the extracted file to CSV and import.
BEFORE extraction:
Rows: 500,000 | Columns: 30 | Cells: 15,000,000 → ❌ fails
AFTER extraction (15 needed columns):
Rows: 500,000 | Columns: 15 | Cells: 7,500,000 → ✅ imports
Workflow 3: Use Google Sheets' Connected Sheets for Very Large Data
Best for: Files over 10M cells where analysis in Google Sheets is required but full import is not feasible.
Google Workspace Enterprise and Business Plus plans include Connected Sheets — a feature that connects Google Sheets directly to BigQuery data sources. For very large datasets (100M+ rows), the query runs in BigQuery and only the results land in Sheets.
This requires: a BigQuery dataset, a Google Workspace Enterprise or Business Plus license, and loading the Excel data into BigQuery first. It is the right architecture for ongoing large-scale analysis, not a quick fix.
For one-time imports under 50M rows without BigQuery, the split-and-import workflow (Workflow 1) is the practical path.
Additional Resources
Official Documentation:
- Google Sheets import limits — Official file size and cell count limits
- Google Sheets specifications and limits — Current 10M cell cap and format support
- Microsoft Excel specifications and limits — Source row/column limits for comparison
Related SplitForge Guides:
- Excel Limits Complete Reference — Excel's own grid and memory constraints
- Split a Large Excel File Into Multiple Files — Splitting workflow before import
- Excel Column Limit — The parallel silent truncation problem in Excel itself
Technical Reference:
- MDN Web Workers API — Browser threading for local file processing
- SheetJS documentation — Excel parsing in browser-based tools