Excel to CSV at 100,000 Rows/Second.
1 Million Rows in 10.1 Seconds.
Browser-based conversion with zero round-trips to a server. Tested on Chrome 120 (stable at time of testing), Apple M1 Pro, 16GB RAM — February 2026. Results vary ±15–20% by hardware and file complexity.
Verified Benchmark Results
Chrome 120 (stable at time of testing) · Apple M1 Pro, 16 GB unified memory · February 2026 · 5 runs per scenario, highest and lowest discarded, median reported
| Dataset Size | Processing Time | Speed (rows/sec) | File Size | Typical Use Case |
|---|---|---|---|---|
| 500 rows | <1 s | 31,250 | ~0.04 MB | Small department exports, report snapshots |
| 10K rows | <1 s | 93,458 | ~0.4 MB | Medium CRM exports, monthly reporting |
| 100K rows | <1 s | 109,051 | ~4 MB | Large data migrations, product catalogs |
| 1M rows | 10.1 s | 99,010 | ~40 MB | Enterprise exports, transaction histories |
Performance Visualized
[Chart: Rows/Second by Dataset Size]
[Chart: Conversion Time in seconds, SplitForge vs Excel Save As CSV. Excel was not benchmarked at 1M rows because its performance degrades severely above 500K.]
Excel time = Save As CSV, including auto-format processing. At 1M rows, Excel performance degrades severely; files above 1,048,576 rows cannot be opened in Excel.
Test Methodology
Full Test Configuration
All benchmarks were run using the standard SplitForge test protocol — 5 runs per scenario with the highest and lowest discarded, median reported. This removes outliers caused by garbage collection pauses and OS-level scheduling events.
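The trimming step can be sketched in a few lines. This is a hypothetical helper, not SplitForge's actual benchmark harness, but it implements the same rule: sort the five timings, drop the fastest and slowest, and take the median of what remains.

```javascript
// Trimmed median: drop min and max, then take the median of the rest.
function trimmedMedian(timings) {
  if (timings.length < 3) throw new Error("need at least 3 runs");
  const sorted = [...timings].sort((a, b) => a - b);
  const trimmed = sorted.slice(1, -1); // discard highest and lowest
  const mid = Math.floor(trimmed.length / 2);
  return trimmed.length % 2
    ? trimmed[mid]
    : (trimmed[mid - 1] + trimmed[mid]) / 2; // even count: average middle two
}

// Five runs in seconds; a GC pause inflated run 4, and the trim removes it.
console.log(trimmedMedian([10.3, 9.9, 10.1, 14.2, 10.0])); // → 10.1
```

With five runs this reduces to the median of the middle three, which is why a single GC pause or scheduling hiccup cannot skew the reported number.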
We used a synthetic mixed-type dataset generated with a fixed random seed (reproducible schema above). You can reproduce any row count by generating a CSV with the same schema and converting it through the tool — processing time is displayed in the result panel after each conversion.
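A reproducible generator can look like the sketch below. The column schema here is illustrative (the article does not show SplitForge's exact schema); the important property is the fixed seed, so every run emits byte-identical rows regardless of machine.

```javascript
// mulberry32: a tiny deterministic PRNG. Same seed → same sequence
// in any JavaScript engine, which makes the dataset reproducible.
function mulberry32(seed) {
  return function () {
    seed |= 0; seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Generate a mixed-type CSV (int id, decimal, date, quoted text).
// The schema is a placeholder, not SplitForge's actual test schema.
function generateCsv(rows, seed = 42) {
  const rand = mulberry32(seed);
  const lines = ["id,amount,created_at,notes"];
  for (let i = 1; i <= rows; i++) {
    const amount = (rand() * 1000).toFixed(2);
    const day = 1 + Math.floor(rand() * 28);
    const date = `2026-01-${String(day).padStart(2, "0")}`;
    lines.push(`${i},${amount},${date},"row ${i}"`);
  }
  return lines.join("\n");
}

console.log(generateCsv(3).split("\n").length); // header + 3 rows → 4
```

Because the seed is fixed, `generateCsv(1_000_000)` always yields the same file, so timing runs are comparable across sessions.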
What Timing Measures
Reproducing These Results
What Affects Performance in Your Environment
Benchmarks are medians from a specific machine. Here's what shifts them.
Available RAM
↑ Impact: The most significant factor. The JavaScript heap is limited by available RAM, and processing 1M rows allocates ~400–600 MB. Machines with less than 8 GB of RAM available to Chrome may hit GC pauses that double or triple conversion time. 16 GB+ RAM is strongly recommended for 1M+ row files.
File Data Complexity
↑ Impact: Date cells require per-cell ISO conversion. Large numbers (15+ digits) require precision handling. Excel error cells (#REF!, #VALUE!) trigger extra logic. A file with 100% date columns runs 30–40% slower than one with 100% text columns.
Browser Version
↑ Impact: Chrome's V8 and Firefox's SpiderMonkey have different Web Worker performance characteristics. Chrome 120 (the version used in our February 2026 benchmarks) ran these workloads fastest in our testing. Safari is 15–25% slower on large XLSX files due to WebKit's WASM and TypedArray handling.
Number of Active Browser Tabs
↑ Impact: Chrome throttles memory for background workers when many tabs are open. For best performance on 500K+ row files, close unneeded tabs before starting a conversion to give the Web Worker more memory headroom.
Force Text Mode
↓ Impact: Adds ~5–8% overhead per cell compared to raw numeric output. It's worth it: without it, data corruption occurs silently. Keeping Force Text Mode enabled is recommended for any data containing IDs, phone numbers, or leading zeros.
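The silent corruption is easy to demonstrate. The helpers below are hypothetical names, not SplitForge's API; they contrast naive numeric coercion with text-preserving CSV quoting:

```javascript
// Naive numeric coercion: what a converter emits if it treats the
// cell as a number — leading zeros and long-ID precision are destroyed.
function asNumberCell(raw) {
  return String(Number(raw));
}

// Force-text behavior: emit the cell verbatim as a quoted CSV field.
function asTextCell(raw) {
  return `"${String(raw).replace(/"/g, '""')}"`;
}

console.log(asNumberCell("00420"));            // → "420"  (leading zeros lost)
console.log(asNumberCell("9007199254740993")); // → "9007199254740992" (precision lost past 2^53)
console.log(asTextCell("00420"));              // → "\"00420\"" (preserved)
```

The precision loss is a hard limit of IEEE 754 doubles, which is why IDs longer than 15 digits cannot safely round-trip through a numeric cell.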
Multi-Sheet ZIP vs Single Sheet
↓ Impact: Multi-sheet ZIP exports add JSZip compression time at the end (~2–5 s per export). Each sheet is processed serially with memory cleanup between sheets, so total time scales linearly with sheet count.
Why Client-Side Processing is Faster (Not Just Safer)
Most people assume server-based tools are faster. For file-conversion workloads, they're not: when you upload to a cloud converter, the bottleneck is the upload, not the compute. A 40 MB Excel file on a 50 Mbps connection takes 6–7 seconds just to upload before conversion even starts. Then you wait for queue position, server processing, and the download.
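The 6–7 second figure is simple bandwidth arithmetic, sketched below (the raw transfer time is 6.4 s; real connections add 10–20% protocol overhead, which is where the upper end of the range comes from):

```javascript
// Minimum upload time for a file, ignoring protocol overhead.
function uploadSeconds(fileMB, linkMbps) {
  const megabits = fileMB * 8; // 1 byte = 8 bits
  return megabits / linkMbps;
}

console.log(uploadSeconds(40, 50)); // → 6.4 seconds before conversion starts
```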
[Comparison: Cloud Converter vs SplitForge, 40MB Excel file (1M rows)]
Time Saved Calculator
Baseline: ~15 minutes per file fixing Excel corruption errors (leading zeros re-added, scientific notation corrected, dates fixed). SplitForge: ~10 seconds per file — zero post-conversion fixes because Force Text Mode prevents corruption from occurring.
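As a function, the calculator's arithmetic looks like this (the 15-minute and 10-second figures are the baseline assumptions stated above, not measured constants):

```javascript
// Minutes saved per month: manual Excel-corruption cleanup (~15 min/file)
// vs a ~10 s conversion that needs no post-fixes.
function minutesSavedPerMonth(filesPerMonth, manualMin = 15, toolSec = 10) {
  return filesPerMonth * (manualMin - toolSec / 60);
}

// e.g. 20 files a month:
console.log(minutesSavedPerMonth(20)); // ≈ 296.7 minutes (~5 hours)
```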
Performance Limitations: When Benchmarks Won't Match
These benchmarks reflect ideal conditions. Real-world performance varies based on the following factors.
- JavaScript heap pressure on 1M+ row files causes GC pauses that significantly extend conversion time. Real-world performance may be 2–3× slower than these benchmarks.
- Date parsing and large-number precision handling add ~30–40% overhead per cell compared to plain text columns.
- Browser memory ceiling: the 2 GB limit is a hard constraint on current Web Worker memory allocation.
- Safari's WebKit engine runs these SheetJS + PapaParse workloads 15–25% slower than Chrome's V8. Benchmarks were measured on Chrome.
Technical Questions
Why are the speeds different across dataset sizes?
How does performance scale with file size vs row count?
Can it really handle more rows than Excel?
How does M1 Mac performance compare to Windows/Intel?
Does performance degrade with complex data (formulas, dates, special chars)?
What happens to performance with multi-sheet ZIP exports?
How does performance compare to CloudConvert, Zamzar, or Convertio?
How does SplitForge compare to Python pandas for performance?
What hardware do you recommend for 1M+ row conversions?
Related Tools & Guides
Ready to Convert at This Speed?
1 million rows in 10 seconds. Force Text Mode on by default. File contents never uploaded.
Also try: Excel Splitter · Excel Cleaner · Data Cleaner