Recently we have been doing some work with exporting and importing data between MySQL databases. Our customer wants to be able to export configuration data from one MySQL instance and import it into another.
The obvious tool to use is mysqldump. But since we need to edit the exported data (just replace some codes) before importing it into the other instance, we use the --tab option. This means that instead of one big file (or rather, redirected standard output) we get two files per table: one with the CREATE TABLE statement and one with the data in tab-separated format.
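As a concrete sketch, the export step could look like this. The database name confdb and the target directory are made-up placeholders, and the command is echoed rather than executed so the sketch runs without a live server. Note that the .txt files are written by the server process itself, so the target directory must be writable by mysqld.

```shell
#!/bin/sh
# Hypothetical export: for each table, mysqldump --tab writes one .sql
# file (the CREATE TABLE statement) and the server writes one .txt file
# (tab-separated data) into DUMP_DIR.
DUMP_DIR=/tmp/export      # placeholder; must be writable by mysqld
DB=confdb                 # made-up database name

CMD="mysqldump --tab=$DUMP_DIR $DB"
echo "$CMD"               # echoed instead of run, so no server is needed
```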
In the next step we run some scripts to modify the data. This is quite easy, since the data is in plain text files with one record per line.
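For example, a sed replacement per file is enough for the kind of code substitution mentioned above. The code values OLD1/NEW1 and the demo file are made up; a real run would point DUMP_DIR at the mysqldump --tab output directory instead.

```shell
#!/bin/sh
# Demo of the edit step on a stand-in export file. Fields are
# tab-separated with one record per line, so a plain text replace
# works as long as the code value cannot match anything else.
DUMP_DIR=$(mktemp -d)                        # stand-in for the export dir
printf 'OLD1\tsome value\n' > "$DUMP_DIR/codes.txt"

for f in "$DUMP_DIR"/*.txt; do
    sed -i 's/OLD1/NEW1/g' "$f"              # replace the code in place
done

cat "$DUMP_DIR/codes.txt"
```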
Finally we import the data into the new instance. While the default mode of mysqldump produces output you can feed directly to the command line client, mysqldump --tab does not. You have to write that script yourself, but it is not very complicated. You need the following statements for every table:
LOCK TABLES [tablename] WRITE;
ALTER TABLE [tablename] DISABLE KEYS;
LOAD DATA INFILE '[path][tablename].txt' INTO TABLE [tablename];
ALTER TABLE [tablename] ENABLE KEYS;
UNLOCK TABLES;
The ALTER TABLE and LOCK/UNLOCK TABLES statements are not strictly necessary, but they speed things up: DISABLE KEYS postpones rebuilding the non-unique indexes until all rows are loaded. We have not measured the speed of this approach, but the general impression is that it is fast.
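The per-table statements above can be generated with a small script, since the exported .txt file names are the table names. The demo export directory and its single table are made up; the generated file would then be fed to the command line client, e.g. mysql [dbname] < import.sql.

```shell
#!/bin/sh
# Generate an import script with one LOCK/LOAD DATA block per table,
# deriving the table name from each exported .txt file name.
DUMP_DIR=$(mktemp -d)                        # stand-in for the export dir
printf '1\tA\n' > "$DUMP_DIR/table1.txt"     # fake exported data

OUT="$DUMP_DIR/import.sql"
: > "$OUT"                                   # truncate/create the output
for f in "$DUMP_DIR"/*.txt; do
    t=$(basename "$f" .txt)                  # table name = file name
    cat >> "$OUT" <<EOF
LOCK TABLES $t WRITE;
ALTER TABLE $t DISABLE KEYS;
LOAD DATA INFILE '$f' INTO TABLE $t;
ALTER TABLE $t ENABLE KEYS;
UNLOCK TABLES;
EOF
done
cat "$OUT"
```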
This method is probably faster than the default mode of mysqldump, because the data does not pass through the SQL parser, where there have been some performance problems. LOAD DATA INFILE is in most cases very fast.