It’s easy to load multiple rows of data from a file into a MySQL table with a single
LOAD DATA INFILE command, but when the file is very large you may want to load it in multiple chunks to keep each transaction to a manageable size. When I have a file with millions of lines I typically use
pt-fifo-split to separate the file into multiple chunks in order to limit the size of each transaction. Unlike most tools in Percona Toolkit,
pt-fifo-split doesn’t actually interact with your database, but this is one MySQL use case where it’s very helpful.
Here’s a quick example in bash that illustrates how I can take a large tab-delimited file, break it into multiple 100,000 line chunks, and load the chunks into a table:
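A sketch of the approach (the file name `bigfile.tsv`, database `test`, and table `my_table` are placeholders; `/tmp/pt-fifo-split` is pt-fifo-split's default FIFO path, and `--lines` sets the chunk size):

```shell
# Split bigfile.tsv into 100,000-line chunks, served one at a time
# through a FIFO. pt-fifo-split recreates the FIFO for each chunk
# until the input file is exhausted.
pt-fifo-split --lines 100000 bigfile.tsv &

# Give pt-fifo-split a moment to create the FIFO before polling for it.
sleep 1

# Each pass through the loop consumes exactly one chunk, so every
# chunk is loaded in its own transaction.
while [ -e /tmp/pt-fifo-split ]; do
  mysql --database=test \
    -e "LOAD DATA LOCAL INFILE '/tmp/pt-fifo-split'
        INTO TABLE my_table
        FIELDS TERMINATED BY '\t'
        LINES TERMINATED BY '\n'"
done
```

When the input is exhausted, pt-fifo-split removes the FIFO and the loop exits on its own.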
YMMV, so you should do some testing to determine the optimal chunk size based on your schema, storage engine, replication setup, the complexity of your
LOAD DATA INFILE statement, etc.