How to Backup and Import a Very Large WordPress MySQL Database

I recently had the experience of working with a very large (over 40MB) database for a WPMU + BuddyPress + bbPress site that was just plain too big to export and import with phpMyAdmin. All I wanted to do was create a simple backup and then a test environment on another domain. You may have experienced this before – all the normal tools stall out, making the usual import/export process a frustrating experience. This is especially common with large WPMU sites that have been around for a little while. You’ll want to bookmark this reference if you anticipate having to create and import backups of large WPMU and/or BuddyPress sites.

The solution that works for you will depend entirely on how your server is set up, the size of your database, and what tools you’re comfortable using. Here are a few tricks that you can try to get your backups and imports working. My instructions are based on using PuTTY for Windows, but the process will be similar for other operating systems.

If you have root access to your server, create a backup using SSH

cd to the directory where you want to store your dump file:

cd /var/www/vhosts/yourdomain.com/httpdocs/sqldump

Then tell mysql to dump it out:

mysqldump --add-drop-table -u dbuser -p your_db > mybackup.sql

It will prompt you for your password. Everything should dump out without a problem.

*Note: The directory where you’re sending the dump file needs to be writeable by the server. Also, you don’t really want to leave it in that folder permanently. Just drop it there and pull it off your server so that you can manipulate it if you have to.
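
If disk space or download time is a concern, you can compress the dump as you create it. A minimal sketch, assuming the same database as above and that gzip is available on the server:

# Pipe mysqldump straight into gzip so the uncompressed file never touches the disk.
mysqldump --add-drop-table -u dbuser -p your_db | gzip > mybackup.sql.gz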

How to Import a Large SQL File

Importing is usually a bit messier. If you’re duplicating this database for testing purposes or moving your site to a new server, a large dump file, even when gzipped, can pose some challenges in the import process. This is the part where things can really stall out, and you need a few tricks up your sleeve. Here are three methods that are recommended for large SQL files.

Method 1: First try SSH

Ordinarily I would just use phpMyAdmin to import the file. Because it’s so large, the first thing you should do is try importing via SSH:

mysql -u db_user -p your_db < mybackup.sql
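
If you compressed the dump on the way out (as in the backup sketch above), you can feed it to mysql without unpacking it first, assuming zcat is available:

# Decompress on the fly and pipe the result straight into mysql.
zcat mybackup.sql.gz | mysql -u db_user -p your_db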

Common Error when using SSH:

“got a packet bigger than ‘max_allowed_packet’ bytes”

Now what? Depending on the size of your file, running these commands from the MySQL prompt may fix it for you:

mysql> set global max_allowed_packet=1000000000;
mysql> set global net_buffer_length=1000000;

Then try importing your database after this. If it still doesn’t work, you might try splitting it up.
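
For reference, here is the same fix as a one-shot sketch from the shell. SET GLOBAL normally requires the SUPER privilege, and the new values only apply to connections opened after the change, so re-run the import in a fresh session:

# Raise the limits without opening an interactive mysql session.
mysql -u db_user -p -e "SET GLOBAL max_allowed_packet=1000000000; SET GLOBAL net_buffer_length=1000000;"

# Then retry the import; this new connection picks up the higher limits.
mysql -u db_user -p your_db < mybackup.sql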

Method 2: If SSH fails on Import, try splitting up your SQL file

Download a program that will split up your SQL file, such as SQLDumpSplitter2. This is a free SQL dump file splitter that will let you decide how big each chunk will be and will automatically cut and save the split SQL files. Yes, this is a pain, but it is sometimes the only way to get this accomplished.
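
A command-line alternative, if you’d rather not install a separate program, is the standard split utility. This is only a sketch: it is safe only because mysqldump writes each statement on its own line, so cutting on line boundaries doesn’t break statements, and unlike SQLDumpSplitter2 it doesn’t separate structure from data, so you simply import the pieces in order:

# Cut the dump into roughly 5,000-line pieces named mybackup_part_aa, _ab, and so on.
split -l 5000 mybackup.sql mybackup_part_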

Create the splits and then upload them to a directory on your server.

If you want to restore the dump, you have to execute yourbackup_DataStructure.sql first because it contains the table structure. After that, you can execute the other .sql files, as they contain the data for the now-existing tables. Using SSH, cd to your directory again and make sure to send this first:

mysql -u db_user -p db_name < yourbackup_DataStructure.sql

Then your splits:
mysql -u db_user -p db_name < yourbackup_1.sql
mysql -u db_user -p db_name < yourbackup_2.sql

etc…
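
If your splitter produced more than a couple of numbered pieces, a small shell loop saves retyping. This is just a sketch, so adjust the file names to whatever your splitter actually wrote; -p will prompt for the password once per file, and with ten or more pieces double-check that the shell’s glob order matches the numbering:

# Structure first, then each numbered data split in turn.
mysql -u db_user -p db_name < yourbackup_DataStructure.sql
for f in yourbackup_[0-9]*.sql; do
  mysql -u db_user -p db_name < "$f"
done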

I would only recommend this if your SQL file is easily split into just a few sections. Obviously if it’s going to take more than a handful, this technique can be time-consuming. I know, because I tried it before I discovered one other method.

Method 3: Use a Script that will stagger the import process

This is my favorite solution, as it was the only one that worked smoothly. I used a script called BigDump.php, a staggered SQL dump importer. It executes only a small part of the huge dump and restarts itself. The next session starts where the last was stopped to keep you from running into your server’s limits. Instructions for using it are on the BigDump website. Basically you place your SQL file in a folder on your server, along with the bigdump.php file. You edit that file with your database information and then visit the page on your server and set the import to go. This is a fairly quick process and will save you a lot of time.
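
As a rough sketch of that setup (the paths here are only examples, and the database credentials still get edited inside bigdump.php itself):

# Put the dump and the BigDump script side by side in a web-accessible folder.
cd /var/www/vhosts/yourdomain.com/httpdocs
mkdir dump
cp /path/to/mybackup.sql /path/to/bigdump.php dump/
# Then open http://yourdomain.com/dump/bigdump.php in a browser and start the import.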

There are some smaller things that you can do to help reduce the size of your database, which we outlined in an article called 8 Tips For Keeping a Squeaky Clean WordPress Database. Back up your database and look into some of those to trim it down.
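
As one hedged example of the kind of trimming that article covers, deleting stored post revisions can shrink the posts table considerably. This assumes the default wp_ table prefix (multisite blogs use per-blog prefixes like wp_2_posts), so check yours first and take a backup before running it:

# Remove stored post revisions; related postmeta rows can be cleaned up separately.
mysql -u db_user -p your_db -e "DELETE FROM wp_posts WHERE post_type = 'revision';"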

I’m sure there are other tricks to try to get past the limits for large files, so please feel free to post in the comments if you know of anything else that I’m forgetting or might help other WordPress users.

Comments

  1. It’s all true, but if you’ve got a database over 1GB you’ll have quite a challenge backing it up ;)

    Yeah, I know, MultiDB. But that also doesn’t help much: the overall size remains the same, and the server is busy with backup and not with serving visitors.

    I still haven’t found a good way out, but I keep looking for it.
    BTW, the promising mysqlhotcopy wasn’t of any help either.

  2. I think you have an error on your example query:

    “Ordinarily I would just use phpMyAdmin to import the file. Because it’s so large, the first thing you should do is try importing via SSH:

    mysql -u your_database -p db_user < mybackup.sql"

    Isn't it -u username -p database?

  3. But you have it in a different order in examples #1 and #2, see:

    mysql -u your_database -p db_user < mybackup.sql

    and then

    mysql -u db_user -p db_name < yourbackup_DataStructure.sql

    See what I mean?

  4. I’ve been using BigDump for years and feel good with that. But you can also use the traditional tool, phpMyAdmin: the newer versions can automatically divide the SQL file into parts and import them if the file is too big.

    And one more thing: I often get errors when importing (such as “duplicate row”), so before backing up the database I have to fix it with the “repair” command :)

  5. Adrian is quite right here. There is an error:

    “Ordinarily I would just use phpMyAdmin to import the file. Because it’s so large, the first thing you should do is try importing via SSH:
    mysql -u your_database -p db_user < mybackup.sql"

    The answer to Adrian is incorrect too:

    "Adrian – The first one is a Backup. The second one is an Import."

    Both things are imports, since mysql can not do a dump (export, back-up), it only imports.

    So once again what Adrian says:

    "But you have it in a different order in examples #1 and #2, see:
    mysql -u your_database -p db_user < mybackup.sql
    and then
    mysql -u db_user -p db_name < yourbackup_DataStructure.sql
    See what I mean?"

    Reference here:
    dev.mysql.com/doc/refman/5.0/en/mysql-command-options.html#option_mysql_user

  6. I had a similar scenario. I needed to back up a large MySQL database but I didn’t have access to the server to run the mysqldump command or use the BigDump script. I ended up using a backup tool, MySQLBackupFTP (http://mysqlbackupftp.com). I used the free version to connect to MySQL through phpMyAdmin. The tool also compresses the backup and can send it to a remote FTP server. It’s useful when you don’t have privileges on the server.

    By the way, you mentioned another tool (SQLDumpSplitter2) to split the backup file, but it would be easier to use 7zip and split it into several .7z files.

    • I also discovered Sypex Dumper. It’s a PHP script that dumps very large databases. I’ve tested it (not on a WordPress installation) up to several GB. Of course the Lite version (free) is somewhat slower than mysqldump. They say the full version is almost equal to it, if not faster, but I never tried it, since the Lite version is enough for me.

  7. Hey, very nice article. Just what I needed today! And thanks to Joseph Ugoretz for the tip about a Mac version! My client’s site got hacked yesterday, but yes, I had a copy of the SQL file, which of course is too big to import…

  8. There is another great solution to back up your website. It allows incremental and full file backups, and database backups without mysqldump that don’t lock the database during the backup. The backups can be encrypted, which is a good way to protect WordPress’s low-security MD5 password hashes. The archives can be sent by email, to Amazon S3, or stored locally, and Dropbox support is in development. It includes a restoration system that needs no manual file manipulation and automatically downloads backups from storage.

    http://touchwebsitesolutions.com

  9. When importing into Sequel Pro, I get errors. Probably because the settings in the database are specific to the remote server. I’m trying to pull this WordPress DB down and work on it locally. Any ideas?
