
MySQL CSV Import: Fix '1064 Syntax Error at Line 1' (2026 Guide)

January 16, 2026
By SplitForge Team

Quick Answer

MySQL Error 1064 during CSV imports is almost always a file-format problem, not SQL syntax. The error message says "syntax error" but the actual cause is encoding mismatch (UTF-16 instead of UTF-8), delimiter conflicts (comma in command, tab in file), quote character issues, disabled LOCAL INFILE configuration, or line terminator differences (Windows \r\n vs Unix \n).

MySQL mislabels data format issues as syntax errors because it fails to parse the file according to your specified parameters. Your LOAD DATA INFILE statement is probably correct—the CSV file format doesn't match your expectations.

The fix: Enable LOCAL INFILE on server and client, convert CSV to UTF-8 without BOM (use Google Sheets export or iconv), identify actual delimiter by opening file in text editor, match FIELDS TERMINATED BY to actual structure, specify correct line terminator for your OS (Windows: \r\n, Unix: \n), and wrap reserved words in backticks.


FAST FIX (90 Seconds)

Before diving into detailed diagnostics, check these first:

✓ Is LOCAL INFILE enabled? Run SHOW VARIABLES LIKE 'local_infile'; → if OFF, enable with SET GLOBAL local_infile = 1;
✓ Is your file UTF-8? Check with file -i yourfile.csv → if UTF-16 or Latin1, convert to UTF-8
✓ Does the delimiter match the CSV? Open the file in a text editor, verify comma/tab/pipe/semicolon
✓ Are fields quoted correctly? Check if data uses " or ' for text enclosure
✓ Is LINES TERMINATED BY correct? Windows uses \r\n, Unix/Mac use \n
✓ Does the file path use forward slashes? Even on Windows: C:/path/file.csv not C:\path\file.csv

If you answered "no" or "not sure" to any of these, jump to that fix section below.


You've created your table. Your CSV data looks clean. You run LOAD DATA INFILE and MySQL immediately throws:

ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near...

No helpful context. No clear explanation. Just a cryptic message pointing to "Line 1."

TL;DR: MySQL 1064 errors during CSV imports indicate file format mismatches—not SQL syntax errors. 90% of failures stem from: (1) LOCAL INFILE disabled by default, (2) UTF-16/Latin1 encoding instead of UTF-8, (3) delimiter mismatch (comma vs tab vs semicolon), (4) line terminator differences (Windows \r\n vs Unix \n), (5) quote character conflicts, (6) reserved word column names. Fix by enabling LOCAL INFILE, converting to UTF-8 without BOM, identifying actual delimiter in text editor, specifying correct LINES TERMINATED BY, and wrapping reserved words in backticks.

MySQL reports file format problems as "syntax errors at Line 1" because it fails to parse according to your parameters. This makes debugging frustrating—you're checking SQL syntax when the real issue is CSV encoding or delimiters.


Table of Contents

  1. Why MySQL 1064 Errors Happen
  2. Fix #1: Enable LOCAL INFILE
  3. Fix #2: Convert to UTF-8 Encoding
  4. Fix #3: Match Delimiters Correctly
  5. Fix #4: Handle Quote Characters
  6. Fix #5: Fix Line Terminators
  7. Fix #6: Escape Reserved Words
  8. Complete Import Process
  9. Prevent Future Errors
  10. FAQ

Why MySQL 1064 Errors Happen During CSV Import

MySQL's LOAD DATA INFILE requires explicit specification of delimiters, quote characters, line terminators, and encoding. A mismatch in any parameter triggers Error 1064, but the error message misleadingly says "syntax error" instead of "file format mismatch."

Common causes that MySQL reports as "syntax errors":

  • LOCAL INFILE disabled: Server blocks client file access for security, throws Error 1064 or 1148
  • UTF-16/Latin1 encoding: MySQL expects UTF-8; wrong encoding causes parse failure
  • BOM (Byte Order Mark): Windows apps add invisible 0xEF 0xBB 0xBF bytes MySQL treats as data
  • Delimiter mismatch: Command says comma, file uses tab—entire rows interpreted as single fields
  • Quote conflicts: Data has single quotes, command specifies double quotes—unmatched quote errors
  • Line terminators: Windows \r\n vs Unix \n mismatch corrupts last field per row
  • Reserved words: Column names like order, date, key conflict with SQL keywords

According to MySQL LOAD DATA documentation, these file format issues produce identical error messages to actual SQL syntax problems, making diagnosis difficult without systematic checking.


Fix #1: Enable LOCAL INFILE

What this fixes: Error 1064 or 1148 when MySQL blocks client file access due to disabled local_infile configuration. Many MySQL installations disable this by default for security, but the error message says "syntax error" instead of "LOCAL INFILE disabled."

Check current setting:

SHOW VARIABLES LIKE 'local_infile';

If it returns OFF, LOCAL INFILE is disabled.

Enable on server (temporary):

SET GLOBAL local_infile = 1;

This takes effect server-wide immediately, but the setting is lost after a server restart.

Enable permanently (MySQL configuration):

Add to my.cnf or my.ini:

[mysqld]
local_infile = 1

Restart MySQL:

sudo systemctl restart mysql

Enable on client (command line):

mysql --local-infile=1 -u root -p

Enable in MySQL Workbench:

Edit connection → Advanced → add under "Others":

OPT_LOCAL_INFILE=1

Fix #2: Convert to UTF-8 Encoding

What this fixes: Error 1064 when MySQL can't parse UTF-16 or Latin1 encoded files expecting UTF-8. Excel's "CSV (Comma delimited)" saves as Windows-1252, not UTF-8. Only "CSV UTF-8 (Comma delimited)" produces proper UTF-8 files. BOM (Byte Order Mark) corruption also triggers syntax errors.

Detect current encoding:

file -i yourfile.csv

Output example:

yourfile.csv: text/plain; charset=utf-16le

Encoding mismatches are the root cause of most garbled-character problems: UTF-8, UTF-16, Latin1, and Windows-1252 each represent special characters differently, and MySQL fails to parse the file when its actual encoding doesn't match what the server expects.

Convert using Google Sheets (easiest, auto UTF-8):

  1. Open CSV in Google Sheets
  2. File → Download → Comma-separated values (.csv)

Google Sheets always exports as UTF-8 without BOM.

Convert using Excel:

  1. Open CSV in Excel
  2. File → Save As
  3. File Format: "CSV UTF-8 (Comma delimited)" (NOT regular "CSV (Comma delimited)")
  4. Save

Command line conversion (iconv):

iconv -f WINDOWS-1252 -t UTF-8 input.csv > output.csv

Remove BOM:

# Linux/Mac
sed '1s/^\xEF\xBB\xBF//' input.csv > output.csv

# Windows PowerShell (UTF8NoBOM requires PowerShell 6+;
# Windows PowerShell 5.1 does not support it)
$content = Get-Content input.csv
$content | Out-File -Encoding UTF8NoBOM output.csv

Or Notepad++: Encoding → Convert to UTF-8 without BOM → Save.

BOM bytes are invisible in most editors, which is why these imports fail even though both the SQL command and the visible file contents look correct.
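
BOM stripping can also be done with a few lines of Python when sed isn't available. This is a minimal sketch; the byte string stands in for real file contents.

```python
# Detect and strip a UTF-8 BOM (0xEF 0xBB 0xBF) from raw file bytes.
BOM = b"\xef\xbb\xbf"

def strip_bom(data: bytes) -> tuple:
    """Return (data without BOM, whether a BOM was found)."""
    if data.startswith(BOM):
        return data[len(BOM):], True
    return data, False

raw = b"\xef\xbb\xbfname,email\nJohn,[email protected]\n"
clean, had_bom = strip_bom(raw)
print(had_bom)     # True
print(clean[:10])  # b'name,email'
```

Read the file with open(path, "rb"), pass the bytes through strip_bom, and write the result back to get a BOM-free file.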


Fix #3: Match Delimiters Correctly

What this fixes: Error 1064 when FIELDS TERMINATED BY doesn't match actual CSV structure. Excel shows commas in UI but file uses tabs. European locales use semicolons. Opening file in text editor reveals true delimiter.

Identify the delimiter:

Open CSV in plain text editor (Notepad++, VS Code, nano):

John,Doe,[email protected],555-1234
Jane,Smith,[email protected],555-5678

Delimiter: Comma

John	Doe	[email protected]	555-1234
Jane	Smith	[email protected]	555-5678

Delimiter: Tab (shown as whitespace)

John|Doe|[email protected]|555-1234

Delimiter: Pipe

Correct LOAD DATA syntax:

For comma:

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

For tab:

FIELDS TERMINATED BY '\t'

For semicolon (European):

FIELDS TERMINATED BY ';'
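
If you'd rather not eyeball the file, Python's standard csv.Sniffer can guess the delimiter from a sample. A sketch, with invented sample rows; restricting the candidate set to the delimiters MySQL imports commonly see makes the guess more reliable:

```python
import csv

# Two rows from a European-style export (semicolon-delimited).
sample = (
    "John;Doe;[email protected];555-1234\n"
    "Jane;Smith;[email protected];555-5678\n"
)

# Only consider delimiters that plausibly appear in CSV exports.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print(repr(dialect.delimiter))  # ';'
```

In practice, read the first few KB of the file (decoded as text) and pass that as the sample; Sniffer raises csv.Error if it cannot decide.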

Fix #4: Handle Quote Characters Properly

What this fixes: Error 1064 when data contains quotes (" or ') that conflict with ENCLOSED BY specification. Data with single quotes (e.g., "O'Brien") breaks when command uses wrong quote character for enclosure.

Identify quote character:

Open CSV in text editor:

"John","Doe","[email protected]","555-1234"
"Jane","O'Brien","[email protected]","555-5678"

Enclosure: Double quotes (")

Standard CSV with double quotes:

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

CSV without quotes:

Omit ENCLOSED BY entirely:

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Escape character for nested quotes:

Standard CSV escaping doubles the quote: "He said ""hello"""

For this format:

FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
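
The doubled-quote convention is also what Python's csv module reads and writes by default (doublequote=True), which makes it a convenient way to preview how a tricky field will be serialized before MySQL sees it. A sketch with an invented row:

```python
import csv
import io

# A field containing both the delimiter and embedded double quotes.
row = ["Jane", 'She said "hello", then left']

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerow(row)
print(buf.getvalue())
# Jane,"She said ""hello"", then left"

# Reading it back recovers the original field intact.
parsed = next(csv.reader(io.StringIO(buf.getvalue())))
print(parsed[1])  # She said "hello", then left
```

A file written this way matches the ENCLOSED BY '"' ESCAPED BY '"' clause shown above.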

Fix #5: Fix Line Terminator Issues

What this fixes: Error 1064 when LINES TERMINATED BY doesn't match file's actual line endings. Windows files use \r\n (CRLF), Unix/Mac use \n (LF). Mismatch treats \r as part of last field, corrupting data.

Check line endings:

file yourfile.csv

Output shows:

  • "with CRLF line terminators" → Windows (\r\n)
  • "with LF line terminators" → Unix/Mac (\n)
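
The same check is easy to script when the file command isn't available (for example, on stock Windows). A minimal sketch; the byte strings stand in for real file contents:

```python
def detect_line_ending(data: bytes) -> str:
    """Return the LINES TERMINATED BY value the file needs."""
    if b"\r\n" in data:
        return r"\r\n"  # Windows (CRLF)
    if b"\n" in data:
        return r"\n"    # Unix/Mac (LF)
    return r"\n"        # single-line or empty file: default to LF

print(detect_line_ending(b"a,b\r\nc,d\r\n"))  # \r\n
print(detect_line_ending(b"a,b\nc,d\n"))      # \n
```

Checking for \r\n before \n matters: every CRLF file also contains bare \n bytes, so the reverse order would always report LF.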

For Windows files (CRLF):

LOAD DATA LOCAL INFILE 'C:/path/to/file.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;

For Unix/Mac files (LF):

LINES TERMINATED BY '\n'

File path on Windows:

Always use forward slashes:

-- Correct
LOAD DATA LOCAL INFILE 'C:/Users/John/Documents/data.csv'

-- Wrong (syntax error)
LOAD DATA LOCAL INFILE 'C:\Users\John\Documents\data.csv'

Or double backslashes:

LOAD DATA LOCAL INFILE 'C:\\Users\\John\\Documents\\data.csv'

Fix #6: Escape Reserved Words and Specify Character Set

What this fixes: Error 1064 when CSV columns match MySQL reserved words (order, date, key, select, table). Also fixes character encoding declaration issues. Backticks force MySQL to treat reserved words as identifiers, not SQL keywords.

Wrap reserved words in backticks:

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE orders
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`order`, `date`, customer_name, total);

Notice that order and date are wrapped in backticks. Full reserved word list: MySQL Keywords.
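
Since MySQL accepts backticks around any identifier, a low-effort habit when generating import statements is to backtick every column name rather than memorizing the reserved word list. A hypothetical helper sketch:

```python
def column_clause(columns: list) -> str:
    """Backtick-quote every identifier, so reserved words like
    order or date can never collide with SQL keywords."""
    quoted = ", ".join("`{}`".format(c) for c in columns)
    return "({})".format(quoted)

print(column_clause(["order", "date", "customer_name", "total"]))
# (`order`, `date`, `customer_name`, `total`)
```

Quoting non-reserved names like customer_name is harmless, so the helper never needs updating when MySQL adds keywords.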

Explicitly specify character set:

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE users
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Use utf8mb4 (not utf8) for full Unicode support including emojis. MySQL 5.7+ runs in strict SQL mode by default, so encoding mismatches that older versions silently truncated now raise errors—specifying the correct character set prevents both Error 1064 and Error 1366.


Complete MySQL CSV Import Process

Step 1: Prepare environment

SET GLOBAL local_infile = 1;

Connect with LOCAL INFILE:

mysql --local-infile=1 -u root -p your_database

Step 2: Inspect CSV

file -i data.csv  # Check encoding
head -n 5 data.csv  # Check delimiter and structure

Step 3: Pre-clean CSV

  • Convert to UTF-8 without BOM (Google Sheets or iconv)
  • Verify delimiter matches actual file
  • Remove hidden characters with text editor
  • Check for reserved words in column names
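
The inspection and pre-clean steps above can be bundled into one pre-flight report that tells you exactly which LOAD DATA parameters to use. A sketch using only the standard library; the sample bytes stand in for a real file:

```python
import csv

def preflight(data: bytes) -> dict:
    """Report the format facts a LOAD DATA command must match."""
    report = {
        "bom": data.startswith(b"\xef\xbb\xbf"),
        "line_terminator": r"\r\n" if b"\r\n" in data else r"\n",
    }
    # Sniff the delimiter on a BOM-free text sample.
    sample = data.removeprefix(b"\xef\xbb\xbf").decode("utf-8", errors="replace")
    report["delimiter"] = csv.Sniffer().sniff(sample, delimiters=",;\t|").delimiter
    return report

data = b"first,last\r\nJohn,Doe\r\nJane,Smith\r\n"
print(preflight(data))
```

For a real file, call preflight(open("data.csv", "rb").read(8192)); the first few KB are enough to determine BOM, delimiter, and line terminator.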

Step 4: Create table

CREATE TABLE users (
  id INT AUTO_INCREMENT PRIMARY KEY,
  first_name VARCHAR(100),
  last_name VARCHAR(100),
  email VARCHAR(255),
  phone VARCHAR(20)
) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

Step 5: Import with correct parameters

LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE users
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(first_name, last_name, email, phone);

Step 6: Verify import

SELECT COUNT(*) FROM users;
SELECT * FROM users LIMIT 10;
SHOW WARNINGS;

Step 7: Handle duplicates

For duplicate key errors, use REPLACE or IGNORE:

LOAD DATA LOCAL INFILE '/path/to/data.csv'
REPLACE INTO TABLE users
-- rest of command

Prevent Future Import Errors

Create an import template

Save your successful LOAD DATA command:

-- save to import_template.sql
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE your_table
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Update file path and table name for each use.

Standardize export process

Excel: Always use "CSV UTF-8 (Comma delimited)"
Google Sheets: File → Download → CSV (automatically UTF-8)

Test with small samples

Before importing millions of rows:

head -n 101 large_file.csv > test_sample.csv

Import sample, verify results, then import full file.

Use mysqlimport for batch imports

mysqlimport internally uses LOAD DATA INFILE with simpler syntax:

mysqlimport --local \
  --fields-terminated-by=',' \
  --fields-enclosed-by='"' \
  --lines-terminated-by='\n' \
  --ignore-lines=1 \
  your_database /path/to/users.csv

Requirements: File name must match table name (users.csv → users table).

When errors persist even though your LOAD DATA syntax is exactly right, broaden the check beyond encoding and delimiters: file corruption, hidden characters, and platform-specific formatting can all break imports in the same way.


Privacy-First CSV Processing

Your database CSVs contain sensitive information: customer data, transaction records, employee information, financial details. Every upload to online "CSV fixer" tools creates compliance risks—GDPR violations, HIPAA issues, data breach liability.

Better approach: Clean CSVs locally without uploads.

Command-line tools:

  • iconv for encoding conversion
  • dos2unix / unix2dos for line ending fixes
  • sed for BOM removal
  • Text editors (Notepad++, VS Code) for UTF-8 conversion

Desktop GUI:

  • Excel ("CSV UTF-8" save option)
  • Google Sheets (automatic UTF-8 export)
  • LibreOffice Calc (encoding selection)

Browser-based local processing:

For teams needing automation, browser-based tools process CSVs entirely client-side. Data never leaves your computer—JavaScript runs locally with no server uploads. Tools like Data Cleaner handle encoding conversion, delimiter fixes, BOM removal, and structure validation—all locally, processing files up to 10M+ rows without external servers.

Essential for regulated industries (finance, healthcare, government) handling sensitive database content.


FAQ

Is MySQL Error 1064 during CSV import really a SQL syntax error?

Almost never. During LOAD DATA INFILE, Error 1064 is MySQL's misleading way of reporting a file format mismatch. Your SQL statement is probably correct, but the CSV file encoding (UTF-16 instead of UTF-8), delimiter (tab instead of comma), or line terminators (Windows CRLF vs Unix LF) don't match your command parameters. Check file encoding with file -i yourfile.csv before debugging SQL syntax.

What's the difference between Error 1064 and Error 1148?

Error 1064 indicates "syntax error," which during CSV imports usually means a file format mismatch (encoding, delimiter, quotes). Error 1148 specifically means "The used command is not allowed with this MySQL version," which indicates LOCAL INFILE is disabled on the server. Enable with SET GLOBAL local_infile = 1; then reconnect with mysql --local-infile=1.

How do I check whether LOCAL INFILE is enabled?

Run SHOW VARIABLES LIKE 'local_infile'; in the MySQL command line. If it returns OFF, you need to enable it: SET GLOBAL local_infile = 1; on the server, then connect with mysql --local-infile=1 -u username -p. Without LOCAL INFILE enabled, LOAD DATA throws Error 1148 or 1064 even with perfect SQL syntax.

Does LOAD DATA LOCAL INFILE work on cloud-hosted MySQL (AWS RDS, Azure, Cloud SQL)?

AWS RDS, Azure Database for MySQL, and Google Cloud SQL often restrict or disable LOCAL INFILE for security. Check provider documentation—many offer alternative bulk import methods (RDS load from S3, Azure import/export service). For providers that support it, you must enable it in the database parameters and use LOAD DATA LOCAL INFILE (with the LOCAL keyword).

Why does the import work in phpMyAdmin but fail on the command line?

phpMyAdmin uses a different import mechanism—it reads files via PHP and executes INSERT statements, bypassing LOAD DATA INFILE entirely. Command-line LOAD DATA requires LOCAL INFILE enabled and exact file format specification. phpMyAdmin is more forgiving of encoding and delimiter issues because it processes files differently. For production imports exceeding 100K rows, command-line LOAD DATA is significantly faster.

Can I import CSV files with different encodings?

You can't mix encodings in a single LOAD DATA INFILE command—all files must use the same encoding. Standardize files first: convert all to UTF-8 using iconv -f SOURCE_ENCODING -t UTF-8 input.csv > output.csv. For batch imports with mixed encodings, write a script that detects the encoding per file (file -i), converts to UTF-8, then imports.
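
That detect-then-convert loop can be sketched with the Python standard library alone. The exports directory and the candidate-encoding order are assumptions to adapt; cp1252 goes last because it decodes almost any byte sequence and would otherwise mask UTF-16 files:

```python
from pathlib import Path

# Try these encodings in order; the first successful decode wins.
# utf-8-sig also strips a UTF-8 BOM if one is present.
CANDIDATES = ["utf-8-sig", "utf-16", "cp1252"]

def to_utf8(path: Path) -> None:
    """Rewrite a CSV file in place as UTF-8 without BOM."""
    raw = path.read_bytes()
    for enc in CANDIDATES:
        try:
            text = raw.decode(enc)
            break
        except UnicodeDecodeError:
            continue
    else:
        raise ValueError("{}: could not determine encoding".format(path))
    path.write_text(text, encoding="utf-8")

exports = Path("exports")  # placeholder: point at your CSV directory
if exports.is_dir():
    for csv_file in exports.glob("*.csv"):
        to_utf8(csv_file)
```

After normalizing, every file can be imported with the same CHARACTER SET utf8mb4 clause.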

How do I test an import without touching production data?

Create a test table with identical structure: CREATE TABLE users_test LIKE users; Import to the test table first: LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE users_test ... Then verify: SELECT COUNT(*) FROM users_test; SELECT * FROM users_test LIMIT 100; If the results are correct, drop the test table and import to production. Or use transactions: START TRANSACTION; run LOAD DATA, verify, then COMMIT; or ROLLBACK;.

Should I use LOAD DATA INFILE or mysqlimport?

LOAD DATA INFILE (SQL command) is more flexible: it supports inline transformations and works in any MySQL client. mysqlimport (command-line utility) has simpler syntax for basic imports, but requires the file name to match the table name (users.csv → users table). For simple bulk imports, mysqlimport is easier; for complex transformations, use LOAD DATA INFILE directly. Both use identical underlying mechanisms—mysqlimport calls LOAD DATA internally.

Dealing with other CSV import errors? See our complete guide: CSV Import Errors: Every Cause, Every Fix (2026)


Privacy-first CSV processing. Your database content stays secure.


The Bottom Line

MySQL 1064 errors during CSV imports indicate file format mismatches, not SQL syntax problems. In 90% of cases, the issue is:

  1. LOCAL INFILE disabled (most common)
  2. Encoding mismatch (UTF-16 or Latin1, not UTF-8)
  3. Delimiter doesn't match actual CSV structure
  4. Line terminator wrong (Windows \r\n vs Unix \n)
  5. Quote character mismatch
  6. Reserved word column names

Your next import workflow:

  1. Enable LOCAL INFILE on server and client
  2. Convert CSV to UTF-8 without BOM (Google Sheets or iconv)
  3. Identify actual delimiter in text editor (comma/tab/semicolon)
  4. Match FIELDS TERMINATED BY to file structure
  5. Use CHARACTER SET utf8mb4 in LOAD DATA command
  6. Wrap reserved words in backticks
  7. Test with 100-row sample before full import

Prepare your CSV properly once instead of debugging for three hours on every import. Use local tools (command-line utilities, text editors, or browser-based processors) to clean encoding, standardize delimiters, and validate structure without uploading sensitive database content to external servers.

For teams handling transaction data, customer records, or financial information, local CSV processing isn't optional—it's required.


Tools Referenced:

Local Processing Tools:

  • Data Cleaner - Encoding conversion, delimiter fixes, validation

All browser-based tools process data entirely in your browser—no uploads, no servers, no data leaving your computer. Essential for protecting database content containing customer PII, transaction details, or confidential business information.


Managing database imports? Connect on LinkedIn or share your workflow at @splitforge.

Fix MySQL Import Errors in Minutes

Convert to UTF-8 without BOM automatically
Detect actual delimiters (comma vs tab vs semicolon)
Clean hidden characters causing parse failures
Process locally—no uploads of sensitive database CSVs

Continue Reading

More guides to help you work smarter with your data


How to Audit a CSV File Before Processing

You inherited a CSV from a vendor. Before you load it into anything, you need to know what's actually in it — without trusting the filename.


Combine First and Last Name Columns in CSV for CRM Import

Your CRM requires a single Full Name column but your export has First and Last split. Here's how to combine them across 100K rows in 30 seconds.


Data Profiling vs Validation: What Each Reveals in Your CSV

Everyone says 'validate your CSV before import.' But validation can only check what you already know to look for. Profiling finds what you didn't know to check.
