Why Convert SQL to CSV?
SQL dump files from database exports contain INSERT statements with your data embedded in SQL syntax. While useful for restoring databases, this format is difficult to open in spreadsheet applications, feed into data analysis pipelines, or import into different database systems. Converting SQL to CSV extracts the raw data into a flat, universal format.
CSV files can be opened in Excel, Google Sheets, LibreOffice, and any data analysis tool. They are also the standard import format for most databases, CRMs, and business applications. Converting SQL dumps to CSV makes your data portable and accessible without needing a running database server.
How the Parser Works
The converter parses SQL INSERT statements and extracts the values into structured rows and columns.
- Statement detection -- identifies INSERT INTO statements and extracts the target table name and optional column list
- Value extraction -- parses the VALUES clause, handling quoted strings, numbers, NULL values, and escaped characters correctly
- CSV generation -- maps extracted values to columns and outputs them as properly escaped CSV with headers from the column list or auto-generated column names
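The three steps above can be sketched in a few dozen lines. This is a simplified illustration, not the converter's actual implementation: it assumes one row per INSERT statement and no unescaped parentheses inside string values, and the function names (`sql_to_csv`, `split_values`) are invented for this example.

```python
import csv
import io
import re

# Step 1: statement detection -- table name (any quote style), optional
# column list, and a single VALUES tuple per statement.
INSERT_RE = re.compile(
    r"INSERT\s+INTO\s+[`\"\[]?(\w+)[`\"\]]?\s*"  # table name
    r"(?:\(([^)]*)\)\s*)?"                       # optional column list
    r"VALUES\s*\((.*?)\)\s*;",                   # one VALUES tuple
    re.IGNORECASE | re.DOTALL,
)

def split_values(body):
    """Step 2: split a VALUES tuple body into cells, handling quoted
    strings with '' escapes, bare numbers, and the NULL keyword."""
    cells, i, n = [], 0, len(body)
    while i < n:
        while i < n and body[i] in " \t\r\n":
            i += 1
        if i < n and body[i] == "'":              # quoted string
            j, buf = i + 1, []
            while j < n:
                if body[j] == "'" and j + 1 < n and body[j + 1] == "'":
                    buf.append("'")               # '' escapes a quote
                    j += 2
                elif body[j] == "'":
                    break
                else:
                    buf.append(body[j])
                    j += 1
            cells.append("".join(buf))
            i = body.find(",", j + 1)
        else:                                     # number, NULL, etc.
            j = body.find(",", i)
            token = (body[i:] if j == -1 else body[i:j]).strip()
            cells.append("" if token.upper() == "NULL" else token)
            i = j
        if i == -1:
            break
        i += 1
    return cells

def sql_to_csv(sql):
    """Step 3: map cells to columns and emit escaped CSV with headers."""
    out = io.StringIO()
    writer = csv.writer(out)
    header_done = False
    for _table, cols, body in INSERT_RE.findall(sql):
        cells = split_values(body)
        if not header_done:
            header = ([c.strip(' `"[]') for c in cols.split(",")] if cols
                      else [f"column_{k + 1}" for k in range(len(cells))])
            writer.writerow(header)
            header_done = True
        writer.writerow(cells)
    return out.getvalue()
```

Note how the header falls back to auto-generated `column_1`, `column_2`, … names when the INSERT statement carries no explicit column list.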
Try it free — no signup required
Convert SQL to CSV →
Working with SQL Exports
SQL dump files come from various database tools, and the converter handles the common export formats from popular database systems.
- phpMyAdmin exports -- the most common source of SQL dumps for MySQL databases, typically using extended INSERT syntax with multiple value rows per statement
- pg_dump output -- PostgreSQL exports that may include schema definitions, sequences, and INSERT statements with explicit column lists
- MySQL Workbench -- exports that can include CREATE TABLE statements, triggers, and INSERT statements in standard MySQL syntax
- Manual SQL files -- hand-written or script-generated INSERT statements used for data seeding or migrations
Frequently Asked Questions
Which SQL dialects are supported?
The parser handles standard INSERT INTO syntax used by MySQL, PostgreSQL, SQLite, and MariaDB. It supports both single-row and multi-row INSERT statements, backtick-quoted identifiers (MySQL style), double-quote identifiers (PostgreSQL/standard SQL), and square-bracket identifiers (SQL Server). CREATE TABLE and other DDL statements are ignored.
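Normalizing the three identifier-quoting styles comes down to stripping one leading and one trailing quote character. A minimal sketch (the helper name `strip_identifier_quotes` is invented for this example):

```python
import re

def strip_identifier_quotes(name):
    """Remove MySQL backticks, standard/PostgreSQL double quotes, or
    SQL Server square brackets from a single identifier."""
    return re.sub(r'^[`"\[]|[`"\]]$', "", name.strip())
```

The same identifier is recovered whether the dump writes `` `users` ``, `"users"`, or `[users]`.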
How are NULL values handled in the CSV output?
SQL NULL values are converted to empty cells in the CSV output. This is the standard convention for representing missing data in CSV files. If you need a literal string NULL in the output, the original SQL value must be quoted as a string ('NULL') rather than the SQL keyword NULL.
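The distinction between the keyword `NULL` and the string `'NULL'` can be shown with a small token-mapping sketch (the function name is hypothetical):

```python
def sql_token_to_cell(token):
    """Map one SQL literal token to its CSV cell value."""
    t = token.strip()
    if t.upper() == "NULL":                    # SQL keyword NULL -> empty cell
        return ""
    if t.startswith("'") and t.endswith("'"):  # quoted string, unescape ''
        return t[1:-1].replace("''", "'")
    return t                                   # number or other bare literal
```

So `NULL` becomes an empty cell, while `'NULL'` survives as the literal four-character string.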
Can it handle large SQL dump files?
The tool processes files in the browser so performance depends on available memory. Files up to 50 MB typically process without issues. For very large dumps with millions of rows, consider splitting the SQL file first or using a command-line tool like awk to extract specific tables before converting.
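Pre-filtering a dump down to one table can also be done with a short script. This sketch assumes each INSERT statement starts on its own line, which is typical of phpMyAdmin and pg_dump output but not guaranteed; the function name is invented for this example.

```python
import re

def extract_table_inserts(dump_path, table, out_path):
    """Stream a large dump line by line and keep only the INSERT
    statements targeting one table, so memory use stays flat."""
    needle = re.compile(
        rf"^INSERT\s+INTO\s+[`\"\[]?{re.escape(table)}[`\"\]]?\b",
        re.IGNORECASE,
    )
    with open(dump_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            if needle.match(line):
                dst.write(line)
```

The resulting per-table file is then small enough to convert in the browser.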
What about character encoding?
The converter respects the encoding of the input file. SQL dumps are typically UTF-8 encoded. If your dump uses a different encoding such as latin1 or Windows-1252, convert it to UTF-8 first using a text editor or the iconv command-line tool to ensure special characters are preserved correctly.
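If you prefer a script over iconv, the same conversion is a few lines of Python (the function name is invented; `latin-1` here stands in for whatever encoding your dump actually uses):

```python
def reencode_to_utf8(src_path, dst_path, src_encoding="latin-1"):
    """Re-encode a dump to UTF-8, roughly equivalent to:
    iconv -f latin1 -t utf-8 dump.sql > dump-utf8.sql"""
    with open(src_path, encoding=src_encoding) as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:          # stream line by line; no full-file load
            dst.write(line)
```

Accented characters such as `é` then round-trip correctly into the CSV output.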