Quick Answer
MySQL Error 1064 during CSV imports is almost always a file-format problem, not SQL syntax. The error message says "syntax error" but the actual cause is encoding mismatch (UTF-16 instead of UTF-8), delimiter conflicts (comma in command, tab in file), quote character issues, disabled LOCAL INFILE configuration, or line terminator differences (Windows \r\n vs Unix \n).
MySQL reports these data-format failures as syntax errors because it cannot parse the file with the parameters you specified. Your LOAD DATA INFILE statement is probably correct—the CSV file's actual format just doesn't match what the statement declares.
The fix: Enable LOCAL INFILE on server and client, convert CSV to UTF-8 without BOM (use Google Sheets export or iconv), identify actual delimiter by opening file in text editor, match FIELDS TERMINATED BY to actual structure, specify correct line terminator for your OS (Windows: \r\n, Unix: \n), and wrap reserved words in backticks.
FAST FIX (90 Seconds)
Before diving into detailed diagnostics, check these first:
✓ Is LOCAL INFILE enabled? Run SHOW VARIABLES LIKE 'local_infile'; → if OFF, enable with SET GLOBAL local_infile = 1;
✓ Is your file UTF-8? Check with file -i yourfile.csv → if UTF-16 or Latin1, convert to UTF-8
✓ Does delimiter match CSV? Open file in text editor, verify comma/tab/pipe/semicolon
✓ Are fields quoted correctly? Check if data uses " or ' for text enclosure
✓ Is LINES TERMINATED BY correct? Windows uses \r\n, Unix/Mac use \n
✓ Does file path use forward slashes? Even on Windows: C:/path/file.csv not C:\path\file.csv
If you answered "no" or "not sure" to any of these, jump to that fix section below.
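For the "not sure" cases, a short local script can run the file-level checks in one pass. This is a minimal sketch—`quick_check` and its messages are illustrative, not a standard tool:

```python
# Quick file-level checks behind the checklist above: BOM, UTF-16,
# and Windows line endings. quick_check is a hypothetical helper.
def quick_check(path):
    with open(path, "rb") as f:
        head = f.read(4096)
    issues = []
    if head.startswith(b"\xef\xbb\xbf"):
        issues.append("UTF-8 BOM present: strip it before importing")
    if head.startswith((b"\xff\xfe", b"\xfe\xff")):
        issues.append("UTF-16 encoding: convert to UTF-8 first")
    if b"\r\n" in head:
        issues.append("CRLF line endings: use LINES TERMINATED BY '\\r\\n'")
    return issues
```

An empty list means these particular checks passed; it does not rule out delimiter or quote mismatches.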
You've created your table. Your CSV data looks clean. You run LOAD DATA INFILE and MySQL immediately throws:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near...
No helpful context. No clear explanation. Just a cryptic message pointing to "Line 1."
TL;DR: MySQL 1064 errors during CSV imports indicate file format mismatches—not SQL syntax errors. 90% of failures stem from: (1) LOCAL INFILE disabled by default, (2) UTF-16/Latin1 encoding instead of UTF-8, (3) delimiter mismatch (comma vs tab vs semicolon), (4) line terminator differences (Windows \r\n vs Unix \n), (5) quote character conflicts, (6) reserved word column names. Fix by enabling LOCAL INFILE, converting to UTF-8 without BOM, identifying actual delimiter in text editor, specifying correct LINES TERMINATED BY, and wrapping reserved words in backticks.
MySQL reports file format problems as "syntax errors at Line 1" because it fails to parse according to your parameters. This makes debugging frustrating—you're checking SQL syntax when the real issue is CSV encoding or delimiters.
Table of Contents
- Why MySQL 1064 Errors Happen
- Fix #1: Enable LOCAL INFILE
- Fix #2: Convert to UTF-8 Encoding
- Fix #3: Match Delimiters Correctly
- Fix #4: Handle Quote Characters
- Fix #5: Fix Line Terminators
- Fix #6: Escape Reserved Words
- Complete Import Process
- Prevent Future Errors
- FAQ
Why MySQL 1064 Errors Happen During CSV Import
MySQL's LOAD DATA INFILE requires explicit specification of delimiters, quote characters, line terminators, and encoding. A mismatch in any parameter triggers Error 1064, but the error message misleadingly says "syntax error" instead of "file format mismatch."
Common causes that MySQL reports as "syntax errors":
- LOCAL INFILE disabled: Server blocks client file access for security, throws Error 1064 or 1148
- UTF-16/Latin1 encoding: MySQL expects UTF-8; wrong encoding causes parse failure
- BOM (Byte Order Mark): Windows apps add invisible 0xEF 0xBB 0xBF bytes MySQL treats as data
- Delimiter mismatch: Command says comma, file uses tab—entire rows interpreted as single fields
- Quote conflicts: Data has single quotes, command specifies double quotes—unmatched quote errors
- Line terminators: Windows \r\n vs Unix \n mismatch corrupts last field per row
- Reserved words: Column names like order, date, key conflict with SQL keywords
According to MySQL LOAD DATA documentation, these file format issues produce identical error messages to actual SQL syntax problems, making diagnosis difficult without systematic checking.
Fix #1: Enable LOCAL INFILE
What this fixes: Error 1064 or 1148 when MySQL blocks client file access due to disabled local_infile configuration. Many MySQL installations disable this by default for security, but the error message says "syntax error" instead of "LOCAL INFILE disabled."
Check current setting:
SHOW VARIABLES LIKE 'local_infile';
If it returns OFF, LOCAL INFILE is disabled.
Enable on server (temporary):
SET GLOBAL local_infile = 1;
This applies server-wide immediately but resets after a server restart.
Enable permanently (MySQL configuration):
Add to my.cnf or my.ini:
[mysqld]
local_infile = 1
Restart MySQL:
sudo systemctl restart mysql
Enable on client (command line):
mysql --local-infile=1 -u root -p
Enable in MySQL Workbench:
Edit connection → Advanced → add under "Others":
OPT_LOCAL_INFILE=1
Fix #2: Convert to UTF-8 Encoding
What this fixes: Error 1064 when MySQL can't parse UTF-16 or Latin1 encoded files expecting UTF-8. Excel's "CSV (Comma delimited)" saves as Windows-1252, not UTF-8. Only "CSV UTF-8 (Comma delimited)" produces proper UTF-8 files. BOM (Byte Order Mark) corruption also triggers syntax errors.
Detect current encoding:
file -i yourfile.csv
Output example:
yourfile.csv: text/plain; charset=utf-16le
Understanding why CSV characters break due to encoding mismatches helps identify whether your file uses UTF-8, UTF-16, Latin1, or Windows-1252—each encoding handles special characters differently and causes MySQL parse failures when expectations don't match.
Convert using Google Sheets (easiest, auto UTF-8):
- Open CSV in Google Sheets
- File → Download → Comma-separated values (.csv)
Google Sheets always exports as UTF-8 without BOM.
Convert using Excel:
- Open CSV in Excel
- File → Save As
- File Format: "CSV UTF-8 (Comma delimited)" (NOT regular "CSV (Comma delimited)")
- Save
Command line conversion (iconv):
iconv -f WINDOWS-1252 -t UTF-8 input.csv > output.csv
Remove BOM:
# Linux/Mac
sed '1s/^\xEF\xBB\xBF//' input.csv > output.csv
# PowerShell 7+ (Windows PowerShell 5.1 lacks UTF8NoBOM and writes a BOM)
$content = Get-Content input.csv
$content | Out-File -Encoding UTF8NoBOM output.csv
Or Notepad++: Encoding → Convert to UTF-8 without BOM → Save.
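The same BOM strip can be done in a few lines of Python when sed or Notepad++ isn't handy. A sketch—`strip_bom` is an illustrative helper that writes a new file rather than editing in place:

```python
# Remove a leading UTF-8 BOM (bytes 0xEF 0xBB 0xBF) from a file.
def strip_bom(src, dst):
    with open(src, "rb") as fin:
        data = fin.read()
    if data.startswith(b"\xef\xbb\xbf"):
        data = data[3:]          # drop the three BOM bytes
    with open(dst, "wb") as fout:
        fout.write(data)
```

Python's built-in `utf-8-sig` codec does the same thing transparently when reading: `open(path, encoding="utf-8-sig")`.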
For comprehensive BOM (Byte Order Mark) troubleshooting, understanding how invisible Unicode markers corrupt MySQL imports helps diagnose parse failures that appear as syntax errors despite correct SQL commands.
Fix #3: Match Delimiters Correctly
What this fixes: Error 1064 when FIELDS TERMINATED BY doesn't match actual CSV structure. Excel shows commas in UI but file uses tabs. European locales use semicolons. Opening file in text editor reveals true delimiter.
Identify the delimiter:
Open CSV in plain text editor (Notepad++, VS Code, nano):
John,Doe,[email protected],555-1234
Jane,Smith,[email protected],555-5678
Delimiter: Comma
John Doe [email protected] 555-1234
Jane Smith [email protected] 555-5678
Delimiter: Tab (shown as whitespace)
John|Doe|[email protected]|555-1234
Delimiter: Pipe
Correct LOAD DATA syntax:
For comma:
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
For tab:
FIELDS TERMINATED BY '\t'
For semicolon (European):
FIELDS TERMINATED BY ';'
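Instead of eyeballing the file, the delimiter can also be detected programmatically. Python's stdlib `csv.Sniffer` does this; the sketch below restricts the candidates to the four delimiters discussed above:

```python
import csv

# Sniff the delimiter from the first few KB of the file.
def detect_delimiter(path, encoding="utf-8"):
    with open(path, newline="", encoding=encoding) as f:
        sample = f.read(8192)
    # Restrict candidates to comma, tab, semicolon, pipe.
    return csv.Sniffer().sniff(sample, delimiters=",\t;|").delimiter
```

Whatever this returns is what FIELDS TERMINATED BY should declare (written as '\t' for a tab).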
Fix #4: Handle Quote Characters Properly
What this fixes: Error 1064 when data contains quotes (" or ') that conflict with ENCLOSED BY specification. Data with single quotes (e.g., "O'Brien") breaks when command uses wrong quote character for enclosure.
Identify quote character:
Open CSV in text editor:
"John","Doe","[email protected]","555-1234"
"Jane","O'Brien","[email protected]","555-5678"
Enclosure: Double quotes (")
Standard CSV with double quotes:
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
CSV without quotes:
Omit ENCLOSED BY entirely:
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Escape character for nested quotes:
Standard CSV escaping doubles the quote: "He said ""hello"""
For this format:
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
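Python's stdlib `csv` module writes exactly this doubled-quote format, which is a quick way to see what ENCLOSED BY '"' ESCAPED BY '"' expects:

```python
import csv, io

# Write one row containing both a single quote and embedded
# double quotes; the csv module doubles the inner quotes.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["O'Brien", 'He said "hello"'])
print(buf.getvalue())
# "O'Brien","He said ""hello"""
```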
Fix #5: Fix Line Terminator Issues
What this fixes: Error 1064 when LINES TERMINATED BY doesn't match file's actual line endings. Windows files use \r\n (CRLF), Unix/Mac use \n (LF). Mismatch treats \r as part of last field, corrupting data.
Check line endings:
file yourfile.csv
Output shows:
- "with CRLF line terminators" → Windows (
\r\n) - "with LF line terminators" → Unix/Mac (
\n)
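If the `file` command isn't available (common on Windows), a few lines of Python make the same determination. A sketch—`detect_line_ending` is an illustrative helper:

```python
# Report which line terminator the file actually uses, so
# LINES TERMINATED BY can be set to match.
def detect_line_ending(path):
    with open(path, "rb") as f:
        chunk = f.read(8192)
    if b"\r\n" in chunk:
        return "\\r\\n"   # Windows CRLF
    if b"\n" in chunk:
        return "\\n"      # Unix/Mac LF
    return "unknown"
```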
For Windows files (CRLF):
LOAD DATA LOCAL INFILE 'C:/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
For Unix/Mac files (LF):
LINES TERMINATED BY '\n'
File path on Windows:
Always use forward slashes:
-- Correct
LOAD DATA LOCAL INFILE 'C:/Users/John/Documents/data.csv'
-- Wrong (syntax error)
LOAD DATA LOCAL INFILE 'C:\Users\John\Documents\data.csv'
Or double backslashes:
LOAD DATA LOCAL INFILE 'C:\\Users\\John\\Documents\\data.csv'
Fix #6: Escape Reserved Words and Specify Character Set
What this fixes: Error 1064 when CSV columns match MySQL reserved words (order, date, key, select, table). Also fixes character encoding declaration issues. Backticks force MySQL to treat reserved words as identifiers, not SQL keywords.
Wrap reserved words in backticks:
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE orders
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`order`, `date`, customer_name, total);
Notice order and date wrapped in backticks. Full reserved word list: MySQL Keywords.
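A small helper can backtick-quote any header names that collide with keywords before you paste them into the column list. A sketch, assuming a short illustrative subset of MySQL's reserved words (the real list is much longer):

```python
# Illustrative subset only -- consult MySQL's full keyword list,
# or simply backtick every column name to be safe.
RESERVED = {"order", "date", "key", "select", "table", "group", "by"}

def quote_columns(columns):
    return ", ".join(
        f"`{c}`" if c.lower() in RESERVED else c for c in columns
    )

print(quote_columns(["order", "date", "customer_name", "total"]))
# `order`, `date`, customer_name, total
```

Backticking every column unconditionally is also valid SQL and avoids maintaining a keyword list at all.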
Explicitly specify character set:
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Use utf8mb4 (not utf8) for full Unicode support including emojis. MySQL 5.7+ is strict about character encoding—specifying the correct set prevents Error 1064 and Error 1366.
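A quick local check that the file really decodes as UTF-8 catches encoding problems before MySQL turns them into Error 1064 or 1366. A sketch—`is_valid_utf8` is an illustrative helper:

```python
# True if every byte sequence in the file is valid UTF-8.
def is_valid_utf8(path):
    with open(path, "rb") as f:
        data = f.read()
    try:
        data.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False
```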
Complete MySQL CSV Import Process
Step 1: Prepare environment
SET GLOBAL local_infile = 1;
Connect with LOCAL INFILE:
mysql --local-infile=1 -u root -p your_database
Step 2: Inspect CSV
file -i data.csv # Check encoding
head -n 5 data.csv # Check delimiter and structure
Step 3: Pre-clean CSV
- Convert to UTF-8 without BOM (Google Sheets or iconv)
- Verify delimiter matches actual file
- Remove hidden characters with text editor
- Check for reserved words in column names
Step 4: Create table
CREATE TABLE users (
id INT AUTO_INCREMENT PRIMARY KEY,
first_name VARCHAR(100),
last_name VARCHAR(100),
email VARCHAR(255),
phone VARCHAR(20)
) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
Step 5: Import with correct parameters
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE users
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(first_name, last_name, email, phone);
Step 6: Verify import
SELECT COUNT(*) FROM users;
SELECT * FROM users LIMIT 10;
SHOW WARNINGS;
Step 7: Handle duplicates
For duplicate key errors, use REPLACE or IGNORE:
LOAD DATA LOCAL INFILE '/path/to/data.csv'
REPLACE INTO TABLE users
-- rest of command
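Steps 2–5 can be folded into one pre-flight script that inspects the file and emits a LOAD DATA statement matching what the file actually contains. A sketch under stated assumptions—`build_load_data` is illustrative; it always declares double-quote enclosure and backticks every column:

```python
import csv

# Inspect the file's real line endings and delimiter, then emit a
# matching LOAD DATA statement with placeholder table/column names.
def build_load_data(path, table, columns):
    with open(path, "rb") as f:
        raw = f.read(8192)
    terminator = "\\r\\n" if b"\r\n" in raw else "\\n"
    # utf-8-sig transparently drops a BOM if one is present.
    sample = raw.decode("utf-8-sig", errors="replace")
    delim = csv.Sniffer().sniff(sample, delimiters=",\t;|").delimiter
    delim = "\\t" if delim == "\t" else delim
    cols = ", ".join(f"`{c}`" for c in columns)
    return (
        f"LOAD DATA LOCAL INFILE '{path}'\n"
        f"INTO TABLE {table}\n"
        f"CHARACTER SET utf8mb4\n"
        f"FIELDS TERMINATED BY '{delim}'\n"
        f"ENCLOSED BY '\"'\n"
        f"LINES TERMINATED BY '{terminator}'\n"
        f"IGNORE 1 LINES\n"
        f"({cols});"
    )
```

Review the generated statement before running it; the script only automates the file-inspection steps, not the import itself.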
Prevent Future Import Errors
Create an import template
Save your successful LOAD DATA command:
-- save to import_template.sql
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE your_table
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Update file path and table name for each use.
Standardize export process
Excel: Always use "CSV UTF-8 (Comma delimited)"
Google Sheets: File → Download → CSV (automatically UTF-8)
Test with small samples
Before importing millions of rows:
head -n 101 large_file.csv > test_sample.csv
Import sample, verify results, then import full file.
Use mysqlimport for batch imports
mysqlimport internally uses LOAD DATA INFILE with simpler syntax:
mysqlimport --local \
--fields-terminated-by=',' \
--fields-enclosed-by='"' \
--lines-terminated-by='\n' \
--ignore-lines=1 \
your_database /path/to/users.csv
Requirements: File name must match table name (users.csv → users table).
When errors persist despite following MySQL syntax exactly, applying general CSV import error fixes in 5 minutes helps identify whether the issue stems from file corruption, hidden characters, or platform-specific formatting beyond encoding and delimiters.
Privacy-First CSV Processing
Your database CSVs contain sensitive information: customer data, transaction records, employee information, financial details. Every upload to online "CSV fixer" tools creates compliance risks—GDPR violations, HIPAA issues, data breach liability.
Better approach: Clean CSVs locally without uploads.
Command-line tools:
- iconv for encoding conversion
- dos2unix/unix2dos for line ending fixes
- sed for BOM removal
- Text editors (Notepad++, VS Code) for UTF-8 conversion
Desktop GUI:
- Excel ("CSV UTF-8" save option)
- Google Sheets (automatic UTF-8 export)
- LibreOffice Calc (encoding selection)
Browser-based local processing:
For teams needing automation, browser-based tools process CSVs entirely client-side. Data never leaves your computer—JavaScript runs locally with no server uploads. Tools like Data Cleaner handle encoding conversion, delimiter fixes, BOM removal, and structure validation—all locally, processing files up to 10M+ rows without external servers.
Essential for regulated industries (finance, healthcare, government) handling sensitive database content.
FAQ
The Bottom Line
MySQL 1064 errors during CSV imports indicate file format mismatches, not SQL syntax problems. In 90% of cases, the issue is:
- LOCAL INFILE disabled (most common)
- Encoding mismatch (UTF-16 or Latin1, not UTF-8)
- Delimiter doesn't match actual CSV structure
- Line terminator wrong (Windows \r\n vs Unix \n)
- Quote character mismatch
- Reserved word column names
Your next import workflow:
- Enable LOCAL INFILE on server and client
- Convert CSV to UTF-8 without BOM (Google Sheets or iconv)
- Identify actual delimiter in text editor (comma/tab/semicolon)
- Match FIELDS TERMINATED BY to file structure
- Use CHARACTER SET utf8mb4 in LOAD DATA command
- Wrap reserved words in backticks
- Test with 100-row sample before full import
Prepare your CSV properly once instead of debugging for three hours on every import. Use local tools (command-line, text editors, or browser processors) to clean encoding, standardize delimiters, and validate structure without uploading sensitive database content to external servers.
For teams handling transaction data, customer records, or financial information, local CSV processing isn't optional—it's required.
Tools Referenced:
Official Documentation:
- MySQL LOAD DATA INFILE - Official syntax reference
- MySQL Reserved Words - Keyword list
- UTF-8 Everywhere - Character encoding manifesto
- file(1) manual page - File command docs
Local Processing Tools:
- Data Cleaner - Encoding conversion, delimiter fixes, validation
All browser-based tools process data entirely in your browser—no uploads, no servers, no data leaving your computer. Essential for protecting database content containing customer PII, transaction details, or confidential business information.
Managing database imports? Connect on LinkedIn or share your workflow at @splitforge.