MySQL import database but ignore specific table

I have a large SQL file with one database and about 150 tables. I would like to use mysqlimport to import that database; however, I would like the import process to ignore or skip over a couple of tables. What is the proper syntax to import all tables but ignore some of them? Thank you.

mysqlimport is not the right tool for importing SQL statements; it is meant for loading formatted text files such as CSV. What you want to do is feed your SQL dump directly to the mysql client with a command like this one:

mysql -D your_database < your_sql_dump.sql

Neither mysql nor mysqlimport provides the feature you need. Your best option would be importing the whole dump, then dropping the tables you do not want.

If you have access to the server where the dump comes from, then you could create a new dump with mysqldump --ignore-table=database.table_you_dont_want1 --ignore-table=database.table_you_dont_want2 ....


Check out this answer for a workaround to skip importing some tables.

The accepted answer by RandomSeed could take a long time! Importing the table (just to drop it later) could be very wasteful depending on size.

For a file created using

mysqldump -u user -ppasswd --opt --routines DBname > DBdump.sql

I currently get a file about 7GB, 6GB of which is data for a log table that I don't 'need' to be there; reloading this file takes a couple of hours. If I need to reload (for development purposes, or if ever required for a live recovery) I skim the file thus:

sed '/INSERT INTO `TABLE_TO_SKIP`/d' DBdump.sql > reduced.sql

And reload with:

mysql -u user -ppasswd DBname < reduced.sql

This gives me a complete database, with the "unwanted" table created but empty. If you really don't want the tables at all, simply drop the empty tables after the load finishes.
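If you do want to clean up afterwards, the final drop can be scripted as well. A minimal sketch (the table names are the placeholders from above; the generated file is fed to mysql once the import finishes):

```shell
# Generate DROP statements for the tables that were skipped
# (placeholder names; substitute your own).
for t in TABLE1_TO_SKIP TABLE2_TO_SKIP; do
    printf 'DROP TABLE IF EXISTS `%s`;\n' "$t"
done > drop_skipped.sql

# Then run it against the restored database:
# mysql -u user -ppasswd DBname < drop_skipped.sql
```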

For multiple tables you could do something like this:

sed '/INSERT INTO `TABLE1_TO_SKIP`/d' DBdump.sql | \
sed '/INSERT INTO `TABLE2_TO_SKIP`/d' | \
sed '/INSERT INTO `TABLE3_TO_SKIP`/d' > reduced.sql

There IS a 'gotcha' - watch out for procedures in your dump that might contain "INSERT INTO TABLE_TO_SKIP".
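The pitfall is easy to demonstrate with a tiny mock dump (names are illustrative): the sed delete also removes the INSERT line inside the routine body, so the reloaded procedure silently loses that statement.

```shell
# Tiny mock dump: one plain INSERT plus a procedure that also inserts.
cat > demo_dump.sql <<'EOF'
INSERT INTO `TABLE_TO_SKIP` VALUES (1);
CREATE PROCEDURE log_it() BEGIN
INSERT INTO `TABLE_TO_SKIP` VALUES (2);
END
EOF

# The filter deletes BOTH matching lines, including the one inside the
# procedure body, leaving a corrupted routine definition.
sed '/INSERT INTO `TABLE_TO_SKIP`/d' demo_dump.sql
```

Grepping the reduced file for the skipped table name before reloading is a cheap way to catch this.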

For anyone working with .sql.gz files; I found the following solution to be very useful. Our database was 25GB+ and I had to remove the log tables.

gzip -cd "./mydb.sql.gz" | sed -r '/INSERT INTO `(log_table_1|log_table_2|log_table_3|log_table_4)`/d' | gzip > "./mydb2.sql.gz"

Thanks to Don's answer, Xosofox's comment, and this related post: Use zcat and sed or awk to edit compressed .gz text file.

If desired, you can do this one table at a time:

mysqldump -p sourceDatabase tableName > tableName.sql
mysql -p -D targetDatabase < tableName.sql

Here is my script to exclude some tables from a MySQL dump. I use it to restore a DB when I need to keep existing orders and payments data:

exclude_tables_from_dump.sh

#!/bin/bash

if [ ! -f "$1" ]; then
    echo "Usage: $0 mysql_dump.sql"
    exit 1
fi

declare -a TABLES=(
user
order
order_product
order_status
payments
)

# Build a pipeline that comments out the DROP, makes the CREATE
# conditional, and deletes the INSERT statements for each listed table.
CMD="cat \"$1\""
for TBL in "${TABLES[@]}"; do
    CMD+="|sed 's/DROP TABLE IF EXISTS \`${TBL}\`/# DROP TABLE IF EXISTS \`${TBL}\`/g'"
    CMD+="|sed 's/CREATE TABLE \`${TBL}\`/CREATE TABLE IF NOT EXISTS \`${TBL}\`/g'"
    CMD+="|sed -r '/INSERT INTO \`${TBL}\`/d'"
done
# Strip all routines stored between DELIMITER ;; and DELIMITER ;
CMD+="|sed '/DELIMITER ;;/,/DELIMITER ;/d'"

eval "$CMD"

It avoids dropping and re-creating those tables and skips inserting data into them. It also strips all functions and procedures stored between DELIMITER ;; and DELIMITER ;.

Unless you ignored the tables during the dump with mysqldump --ignore-table=database.unwanted_table, you have to filter out the data you don't want to import with some script or tool before passing the dump file to the mysql client.
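Such an on-the-fly filter could be sketched as a small shell function (a minimal example; the function name and usage are illustrative, not from any of the answers above):

```shell
# skip_tables: delete the INSERT statements for every table named as an
# argument, reading the dump on stdin and writing the result to stdout.
skip_tables() {
    local expr="" tbl
    for tbl in "$@"; do
        expr+="/INSERT INTO \`${tbl}\`/d;"
    done
    sed "${expr%;}"    # strip the trailing ';' for portability
}

# Example usage (illustrative):
#   zcat mydb.sql.gz | skip_tables log_table_1 log_table_2 | mysql -u user -p DBname
```

Like the sed one-liners above, this leaves the CREATE TABLE statements intact and only drops the data.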

Comments
  • I don't know if that is possible. Can you create a backup of the .sql, seek those tables out in the text and clobber them, or is this something you are going to be doing again and again? You could CTRL-H to find and replace them with a REM stmt, or at the end run a little "delete from table1;" "delete from table2;" sort of script.
  • Are there a lot that need to be omitted? And how are they identified? Maybe consider importing the whole thing, then run a delete table on the ones you don't want to keep.
  • I think that there may be a problem with one of the tables in the SQL file, so my import process does not complete. Do you know whether using the import process from the command line is different from using import utility from MySQL Workbench?
  • Thanks for the comment. My problem seems to be that when I use the import command from within MySQL Workbench, this process works but only imports about a third of my tables and then hangs on a specific table. I can't get all the tables imported because of the problem table.
  • Try to import your dump directly from the command line. Are you getting the same issue? At least you might be able to see some error message.
  • This is genius, 1 hour down to 30 seconds.
  • wow!!! awesome, i've just reduced a backup from 1GB to 72MB, save me a lot of time, thanks
  • Consider the sed -r option and a RegExp in case you want to eliminate more tables in one go, like: sed -r '/INSERT INTO `(TABLE1_TO_SKIP|TABLE2_TO_SKIP)`/d' DBdump.sql > reduced.sql
  • Just FYI: TABLE_TO_SKIP is case-sensitive; it's usually table_to_skip :)
  • For Windows, search for "sed for Windows" - it can currently be found at gnuwin32.sourceforge.net/packages/sed.htm - and use double quotes instead of single quotes for the command.