TL;DR

- mysqldump is the most reliable way to export a MySQL database, especially for large databases. Basic command: `mysqldump -u user -p database_name > backup.sql`
- phpMyAdmin provides a visual Export tab that works well for databases under ~512 MB — choose Quick for defaults or Custom to select specific tables, formats, and options.
- Browser-based tools like DBEverywhere give you the same phpMyAdmin/Adminer export functionality without installing anything on your machine.
- Always include `--routines` and `--triggers` in your mysqldump command, or you will lose stored procedures and triggers in the export.
- For databases over 1 GB, use the command line — GUI tools will almost certainly time out.
Table of Contents
- Introduction
- Method 1: Command Line with mysqldump
- Basic Export
- Export with All Objects
- Compressed Export
- Export Specific Tables
- Structure Only or Data Only
- Method 2: phpMyAdmin GUI Export
- Quick Export
- Custom Export
- Exporting as CSV, JSON, or XML
- Method 3: Browser-Based Export with DBEverywhere
- Which Method Should You Use?
- Common Export Issues and How to Fix Them
- FAQ
- Conclusion
How to Export a MySQL Database (3 Methods: GUI, Command Line, and Browser)
Whether you need a backup before a migration, a snapshot for local development, or just a way to move data between servers, knowing how to export a MySQL database is one of those baseline skills every developer needs. The good news: it takes one command or about six clicks, depending on which method you choose.
This guide covers three methods to export your database — the mysqldump command line tool, the phpMyAdmin GUI, and browser-based tools — with exact commands, step-by-step instructions, and a decision guide to help you pick the right approach. Every method produces a standard .sql dump file (or CSV/JSON if you need it) that you can import on any MySQL-compatible server.
Method 1: Command Line with mysqldump
The mysqldump utility ships with every MySQL installation. It is the most widely used tool for exporting MySQL databases, and for good reason: it handles databases of any size, supports every object type, and is scriptable for automated backups. If you have SSH access to your server or MySQL installed locally, this is the method to learn first.
Basic Export
The simplest export command writes every table in a database to a single .sql file:
mysqldump -u your_username -p your_database > backup.sql
Breaking this down:
- `-u your_username` — the MySQL user with read access to the database
- `-p` — prompts for the password (do not put the password directly in the command if others can see your shell history)
- `your_database` — the name of the database to export
- `> backup.sql` — redirects output to a file
After running this, you will have a file called backup.sql in your current directory containing CREATE TABLE statements and INSERT statements for every row.
A note on file sizes: a database with 1 million rows across a dozen tables typically produces a dump file between 200 MB and 800 MB, depending on how much text and blob data it contains. A 10 GB database is not unusual for production systems, and mysqldump handles it fine — the file just takes longer to write.
Export with All Objects (Recommended)
The basic command misses stored procedures, functions, triggers, and events. For a complete export, add these flags:
mysqldump -u your_username -p \
--single-transaction \
--routines \
--triggers \
--events \
--add-drop-table \
your_database > backup.sql
Here is what each flag does:
| Flag | Purpose |
|---|---|
| `--single-transaction` | Uses a consistent snapshot for InnoDB tables so you get a point-in-time export without locking tables. Essential for production databases. |
| `--routines` | Includes stored procedures and functions. Without this, they are silently skipped. |
| `--triggers` | Includes triggers (enabled by default in MySQL 8.0+, but explicit is better). |
| `--events` | Includes scheduled events created with CREATE EVENT. |
| `--add-drop-table` | Adds DROP TABLE IF EXISTS before each CREATE TABLE, so the import cleanly replaces existing tables. |
This is the command you should use for production backups. The --single-transaction flag is critical — without it, mysqldump locks each table during the dump, which can block writes on a busy database for minutes or even hours.
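For repeatable backups, the command above can be wrapped in a small script that produces timestamped, compressed output. A minimal sketch — `DB_USER` and `DB_NAME` are placeholders for your environment, and the script only prints the assembled command so you can review it before running:

```shell
#!/bin/sh
# Sketch of a timestamped backup wrapper. DB_USER and DB_NAME are placeholders;
# the script prints the assembled command rather than executing it, so it is
# safe to dry-run on a machine without MySQL installed.
DB_USER="your_username"
DB_NAME="your_database"
STAMP=$(date +%Y%m%d_%H%M%S)
OUTFILE="${DB_NAME}_${STAMP}.sql.gz"

CMD="mysqldump -u ${DB_USER} -p --single-transaction --routines --triggers --events --add-drop-table ${DB_NAME}"

# In a real run you would execute the pipeline instead of echoing it:
#   eval "$CMD" | gzip > "$OUTFILE"
echo "${CMD} | gzip > ${OUTFILE}"
```

Note that `-p` prompts interactively, which does not work unattended; for scheduled runs you would store credentials in a `~/.my.cnf` file (or a login path created with `mysql_config_editor`) and drop the `-p` flag.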
Compressed Export
SQL dump files compress extremely well because they are repetitive text. A 500 MB dump file typically compresses to 50-80 MB with gzip — a 6-10x reduction. Pipe the output directly to gzip so you never write the uncompressed file to disk:
mysqldump -u your_username -p \
--single-transaction \
--routines \
--triggers \
your_database | gzip > backup.sql.gz
To import a compressed dump later:
gunzip < backup.sql.gz | mysql -u your_username -p your_database
If you are transferring the file between servers, the smaller size saves real time. Transferring 500 MB over a 100 Mbps connection takes about 40 seconds; transferring 60 MB takes under 5 seconds.
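Before relying on a compressed dump (or transferring it), it is worth verifying the archive is intact; gzip has a built-in test mode for exactly this. A sketch, with a one-line sample standing in for a real dump:

```shell
# Create a tiny stand-in for a real dump, then verify archive integrity.
printf 'CREATE TABLE t (id INT);\n' | gzip > /tmp/backup.sql.gz

# gzip -t exits non-zero if the file is truncated or corrupted, which is the
# most common failure mode after an interrupted export or transfer.
if gzip -t /tmp/backup.sql.gz; then
  echo "archive OK"
fi
```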
Export Specific Tables
You do not always need the entire database. To export only specific tables, list them after the database name:
mysqldump -u your_username -p your_database users orders order_items > partial_backup.sql
This exports only the users, orders, and order_items tables. Useful when you need test data for local development without pulling down a multi-gigabyte dump.
You can also use --where to filter rows from a specific table:
mysqldump -u your_username -p your_database orders \
--where="created_at >= '2026-01-01'" > recent_orders.sql
Structure Only or Data Only
Sometimes you need the schema without data (for setting up a new environment) or data without the schema (for refreshing a staging server that already has the correct structure).
Structure only — exports CREATE TABLE statements but no INSERT statements:
mysqldump -u your_username -p --no-data your_database > schema_only.sql
Data only — exports INSERT statements but no CREATE TABLE statements:
mysqldump -u your_username -p --no-create-info your_database > data_only.sql
Structure-only files are tiny. A database with 50 tables typically produces a schema dump under 20 KB, regardless of how much data is in the tables.
Exporting to CSV from the Command Line
If you need CSV output instead of SQL, mysqldump is not the right tool. Use the mysql client with the --batch and --raw flags, or the SELECT ... INTO OUTFILE syntax:
SELECT * FROM your_table
INTO OUTFILE '/tmp/your_table.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Note: INTO OUTFILE writes to the server's filesystem, not your local machine. You need FILE privilege and access to the server's directory. For most developers, exporting CSV through phpMyAdmin or Adminer is simpler.
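If you want CSV on your local machine without the FILE privilege, one workaround is to post-process the tab-separated output of `mysql --batch` with awk. A sketch, using inline sample data in place of real query output (note this naive conversion does not unescape the backslash sequences `mysql --batch` emits for tabs or newlines inside values):

```shell
# Sample TSV standing in for `mysql --batch` output; in practice you would
# pipe the mysql client's output straight into the awk command below.
printf 'id\tname\n1\tAlice\n2\tBob\n' > /tmp/sample.tsv

# Quote every field, escape embedded double quotes, and join with commas.
awk 'BEGIN { FS = "\t"; OFS = "," }
     { for (i = 1; i <= NF; i++) { gsub(/"/, "\"\"", $i); $i = "\"" $i "\"" }; print }' \
  /tmp/sample.tsv > /tmp/sample.csv

# Print the header row to confirm the conversion (outputs: "id","name")
head -n1 /tmp/sample.csv
```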
Method 2: phpMyAdmin GUI Export
phpMyAdmin provides a visual interface for exporting databases without touching the command line. It is a solid choice for databases under ~512 MB and situations where you want to visually select which tables, formats, or options to include.
Quick Export
The fastest path through phpMyAdmin's export:
- Log in to phpMyAdmin and select your database from the left sidebar.
- Click the Export tab at the top of the page.
- The default Export method is "Quick" — this exports the entire database as SQL with default settings.
- Click Go.
Your browser will download a file named your_database.sql. That is it — four clicks total.
Quick Export uses these defaults:
- Format: SQL
- All tables included
- Structure and data included
- CREATE TABLE statements with IF NOT EXISTS
- No compression
For many use cases, Quick Export is all you need.
Custom Export
For more control, select Custom as the export method. This reveals a full options panel:
- Tables — check or uncheck individual tables. Useful for exporting a subset without guessing command-line syntax.
- Output — choose to save as a file (default) or display in the browser. You can also select compression: none, gzip, or zip.
- Format — SQL is the default. Other options include CSV, JSON, XML, YAML, and LaTeX.
- Format-specific options (for SQL):
- Add DROP TABLE / DROP VIEW — check this if you want a clean re-import.
- Add CREATE PROCEDURE / FUNCTION / EVENT — check this to include stored routines (not included by default in phpMyAdmin's Quick export).
- Add IF NOT EXISTS — prevents errors if a table already exists on the target.
- Data dump options — choose between `INSERT`, `INSERT IGNORE`, or `REPLACE` for how rows are written.
- Click Go to download the export.
Key difference from mysqldump: phpMyAdmin's Quick export does not include stored procedures, functions, or events by default. If your database uses these, always switch to Custom export and explicitly enable them under the "Object creation options" section.
Exporting as CSV, JSON, or XML
phpMyAdmin supports exporting in formats beyond SQL. Under the Custom export method:
- CSV — produces a comma-separated file. Each table is exported as a separate CSV file if you select multiple tables. Useful for importing into spreadsheets or data tools.
- JSON — exports each table as a JSON array of objects. Good for feeding data into APIs or JavaScript-based tools.
- XML — standard XML export with a phpMyAdmin-specific schema. Less commonly used, but supported.
To export a single table as CSV:
- Click the table name in the left sidebar.
- Click the Export tab.
- Change the format to CSV.
- Adjust options: separator character (`,` by default), enclosure (`"`), escape character (`\`), and column names as the first row.
- Click Go.
This is often the easiest way to export a MySQL database as a CSV file when you need data for analysis, reporting, or importing into tools like Excel or Google Sheets.
phpMyAdmin Export Limitations
phpMyAdmin runs as a PHP web application, which means it is subject to PHP and web server limits:
- `max_execution_time` — PHP's default is 30 seconds. Large databases will hit this limit and the export will fail. Self-hosted phpMyAdmin can be configured to increase this (set to 0 for unlimited), but many shared hosting environments cap it at 120-300 seconds.
- `memory_limit` — default is 128 MB in most PHP configurations. Exporting a database with large text or blob columns can exhaust memory.
- `upload_max_filesize` and `post_max_size` — these affect imports, not exports, but they are the most common source of confusion. If your exported file is 200 MB but your PHP `upload_max_filesize` is 50 MB, you will not be able to re-import it through phpMyAdmin without changing the PHP config.
- Browser timeout — even if PHP does not time out, your browser's HTTP connection might. Exports over 5-10 minutes are unreliable through a browser.
The practical ceiling for phpMyAdmin exports is roughly 512 MB of database size, depending on your PHP configuration. Beyond that, use mysqldump.
Method 3: Browser-Based Export with DBEverywhere
What if you need the phpMyAdmin or Adminer export interface but do not have phpMyAdmin installed, do not have SSH access for mysqldump, and do not want to spin up Docker just to export a file?
DBEverywhere hosts phpMyAdmin and Adminer in the cloud, giving you full export functionality from any browser without installing anything.
The workflow:
- Open dbeverywhere.com and enter your database host, port, username, and password.
- Click Connect — a phpMyAdmin or Adminer session spins up in under 3 seconds.
- Navigate to the Export tab and use the same Quick or Custom export options described in Method 2 above.
- Download your export file directly to your machine.
Because DBEverywhere runs phpMyAdmin on managed infrastructure, the PHP execution limits are configured for larger exports out of the box — you are not constrained by shared hosting defaults.
This approach solves a specific set of problems:
- No local tools installed. You are on a new laptop, a borrowed machine, or a Chromebook.
- No SSH access. Your database is on a managed service (AWS RDS, DigitalOcean Managed Databases, PlanetScale) where you cannot run mysqldump directly on the server.
- Firewall restrictions. DBEverywhere connects from a static IP address that you can whitelist in your database firewall once and use from anywhere.
The free tier gives you 5 sessions per month, which is enough for occasional exports. The paid tier ($5/mo) gives you unlimited sessions, saved connections, and SSH tunnel support.
Which Method Should You Use?
The right method depends on your database size, what tools you have access to, and whether you need automation.
| | mysqldump (CLI) | phpMyAdmin (GUI) | DBEverywhere (Browser) |
|---|---|---|---|
| Best for | Large databases, automation, production backups | Small/medium databases, visual table selection | Quick exports without any installed tools |
| Database size limit | No practical limit | ~512 MB (PHP/browser constraints) | Same as phpMyAdmin, with higher PHP limits |
| Requires SSH access | Yes | No (if phpMyAdmin is already installed) | No |
| Requires local install | MySQL client must be installed | phpMyAdmin must be installed on a server | Nothing — browser only |
| Export formats | SQL only (use mysql client for CSV) | SQL, CSV, JSON, XML, YAML | SQL, CSV, JSON, XML (via phpMyAdmin/Adminer) |
| Includes routines/triggers | Yes, with `--routines --triggers` | Yes, with Custom export options | Yes, with Custom export options |
| Scriptable for cron jobs | Yes | No | No |
| Speed for 1 GB database | 2-5 minutes | May time out | Depends on PHP config; less likely to time out |
| Cost | Free (ships with MySQL) | Free (open source) | Free tier: 5 sessions/mo; Paid: $5/mo |
Use mysqldump when:

- Your database is over 500 MB
- You need automated, scheduled backups
- You have SSH or local terminal access
- You want compressed output piped directly to a file

Use phpMyAdmin when:

- Your database is under 500 MB
- You want to visually select specific tables or formats
- You already have phpMyAdmin installed and accessible
- You need CSV/JSON/XML output without writing SQL queries

Use DBEverywhere when:

- You do not have mysqldump or phpMyAdmin installed
- You are working from a device without development tools
- Your database is behind a firewall and you need a static IP to connect
- You want phpMyAdmin's export interface without the installation overhead
Common Export Issues and How to Fix Them
1. Export times out on large databases
Symptoms: phpMyAdmin shows a blank page, a 504 Gateway Timeout, or an incomplete download.
Fix: Switch to mysqldump. If you must use phpMyAdmin, increase max_execution_time in php.ini:
max_execution_time = 0 ; 0 = no limit
memory_limit = 512M
Then restart your web server. Alternatively, export one table at a time through phpMyAdmin's Custom export.
2. Missing stored procedures or triggers in the export
Symptoms: After importing, your application throws errors about missing functions or triggers.
Fix for mysqldump: Add --routines and --triggers:
mysqldump -u user -p --routines --triggers database_name > backup.sql
Fix for phpMyAdmin: Use Custom export and check "Add CREATE PROCEDURE / FUNCTION / EVENT" under Object creation options.
3. Character encoding issues after import
Symptoms: Non-ASCII characters (accents, CJK characters, emoji) appear as ??? or garbled text after importing.
Fix: Explicitly set the character set in your mysqldump command:
mysqldump -u user -p --default-character-set=utf8mb4 database_name > backup.sql
And when importing, make sure the target database also uses utf8mb4:
mysql -u user -p --default-character-set=utf8mb4 database_name < backup.sql
If you see utf8 instead of utf8mb4 in your dump file, your export used MySQL's legacy 3-byte UTF-8 encoding, which cannot represent characters outside the Basic Multilingual Plane (including emoji). Always use utf8mb4.
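A quick way to check which encoding a dump declares is to grep its header for the SET NAMES directive that mysqldump writes near the top of the file. A sketch — a sample header line stands in for a real backup.sql:

```shell
# mysqldump writes a SET NAMES directive near the top of the dump; a sample
# header line stands in for a real file here.
cat > /tmp/backup_head.sql <<'EOF'
/*!40101 SET NAMES utf8mb4 */;
EOF

# On a real dump you would run: grep -m1 'SET NAMES' backup.sql
if grep -q 'SET NAMES utf8mb4' /tmp/backup_head.sql; then
  echo "dump declares utf8mb4"
fi
```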
4. "Access denied" when running mysqldump
Symptoms: mysqldump: Error: Access denied; you need (at least one of) the PROCESS privilege(s) — this error appeared starting in MySQL 8.0.21.
Fix: Grant the PROCESS privilege to your user, or add --no-tablespaces to skip tablespace information:
mysqldump -u user -p --no-tablespaces database_name > backup.sql
5. The exported SQL file is too large to re-import through phpMyAdmin
Symptoms: phpMyAdmin shows "You probably tried to upload a file that is too large" when importing.
Fix: This is controlled by two PHP settings: upload_max_filesize and post_max_size. Increase them in php.ini:
upload_max_filesize = 256M
post_max_size = 256M
Or import via the command line instead, which has no file size limit:
mysql -u user -p database_name < backup.sql
FAQ
How long does it take to export a MySQL database?
It depends on the size. A 100 MB database exports in under 10 seconds via mysqldump. A 1 GB database takes 2-5 minutes. A 10 GB database can take 15-30 minutes. These times assume the MySQL server and the machine running mysqldump are on the same local network. Over a remote connection with higher latency, expect roughly 2-3x longer.
Can I export a MySQL database without SSH access?
Yes. If phpMyAdmin or Adminer is installed on the server, use the web-based Export tab. If no tools are installed on the server, use DBEverywhere to connect from your browser — it acts as a hosted gateway, so you only need the database hostname, port, and credentials. You do not need SSH access to the server itself.
What is the difference between exporting and backing up a MySQL database?
Functionally, they produce the same output: a .sql dump file that can recreate the database. The difference is intent and practice. A backup is typically automated (via cron), stored redundantly (in S3, a separate server, etc.), and verified periodically. An export is usually a one-time operation — you need the data right now for migration, development, or analysis. The mysqldump command is the same in both cases.
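To turn a one-off export into a recurring backup, the same mysqldump command can be scheduled with cron. A hypothetical crontab entry (paths and schedule are placeholders; credentials would come from a `~/.my.cnf` file rather than `-p`, and note that `%` must be escaped as `\%` inside crontab lines):

```
# Run a compressed backup every night at 02:30; add via `crontab -e`.
30 2 * * * mysqldump --single-transaction --routines --triggers your_database | gzip > /var/backups/your_database_$(date +\%F).sql.gz
```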
Does mysqldump lock the database during export?
For InnoDB tables (the default engine since MySQL 5.5), using --single-transaction avoids locking. mysqldump starts a transaction with a consistent snapshot and reads all data within that transaction, so writes can continue. For MyISAM tables, mysqldump must lock tables to get a consistent export, which blocks writes. If your database still uses MyISAM, consider converting to InnoDB — there are very few reasons to use MyISAM in 2026.
How do I export a MySQL database as a CSV file?
You have three options. (1) Use phpMyAdmin or Adminer: select your table, click Export, choose CSV format, and download. (2) Use the SELECT ... INTO OUTFILE SQL statement, which writes CSV directly on the server. (3) Use the mysql command-line client with the --batch flag: mysql -u user -p -e "SELECT * FROM table_name" --batch --raw database_name > output.tsv — this produces tab-separated output that you can convert to CSV. For most developers, the phpMyAdmin/Adminer route is the simplest way to export a MySQL database as a CSV file.
Conclusion
Exporting a MySQL database comes down to three methods: mysqldump on the command line, the phpMyAdmin Export tab, or a browser-based tool like DBEverywhere. For large databases or automated backups, mysqldump is the right choice — it has no size limits and supports every MySQL object type. For smaller databases where you want visual control over tables and formats, phpMyAdmin works well. And if you do not have any tools installed, you can export a MySQL database from any browser with DBEverywhere in under a minute.
Whichever method you use, remember three things: include --routines and --triggers (or their phpMyAdmin equivalents), use utf8mb4 for character encoding, and test your export by actually importing it somewhere. A backup you have never tested is not a backup.
Try DBEverywhere free — access your database from any browser. No installation, no Docker, no SSH tunnels.