I’m trying to import a large .sql file (270 MB) using MySQL Workbench.
I’m exporting from phpMyAdmin v3.4.10.1: structure and data, all statements, with extended-insert syntax: INSERT INTO tbl_name (col_A,col_B,col_C) VALUES (1,2,3), (4,5,6), (7,8,9)
Does anyone know how I can import it without getting this error?
“The MySQL Workbench SQL Editor is by design not able to load such big scripts. The upper limit is around 100MB, depending on available RAM. A new tool will soon be available to allow executing large scripts (without source code support, like syntax highlighting, or edit abilities).”
So it looks like you will have to manually break this into several files, each less than 100 MB.
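If you do end up splitting it, one low-effort way (a sketch; the file names are placeholders) is GNU split’s -C option, which caps each piece at a byte size while keeping whole lines intact, so no INSERT statement gets cut in half (phpMyAdmin writes one statement per line):

    # Split dump.sql into pieces of at most ~90 MB without breaking lines.
    # Output is dump_part_aa, dump_part_ab, ...; import them in order so the
    # CREATE TABLE statements at the top of the dump run first.
    split -C 90m dump.sql dump_part_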
What language is being used server-side at the destination? One option, provided you’re careful, would be to have the server-side language work its way through the file and insert the rows in, say, batches of 50.
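To make that concrete, here is a rough sketch of the batching idea in bash (the thread already uses bash; the real thing would live in whatever your server-side language is). The credentials, database name, and batch size are placeholder assumptions, and it assumes one statement per line, which is how phpMyAdmin writes extended inserts:

    #!/usr/bin/env bash
    # Sketch: replay dump.sql against MySQL in batches of 50 statements.
    # user, secret, and dbname are placeholders; adjust BATCH to taste.
    BATCH=50
    count=0
    buffer=""
    while IFS= read -r line; do
      buffer+="$line"$'\n'
      count=$((count + 1))
      if [ $((count % BATCH)) -eq 0 ]; then
        printf '%s' "$buffer" | mysql -u user -psecret dbname
        buffer=""
      fi
    done < dump.sql
    # Flush any statements left over after the last full batch.
    [ -n "$buffer" ] && printf '%s' "$buffer" | mysql -u user -psecret dbname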
You can try the mysql command-line utility to see if that works. According to the bug report, the limit seems to be isolated to MySQL Workbench, so it is always worth a try - you never know, it might work!
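For reference, the basic invocation is the one below (user and dbname are placeholders). The client streams the file straight to the server, so the Workbench editor’s size limit never comes into play:

    # Pipe the dump directly into the server; -p prompts for the password.
    mysql -u user -p dbname < dump.sql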
I’ve confirmed that the CSVs from last year’s project were >= 140 MB. I ran a bash script daily that first did a sqlplus dump from Oracle into a CSV and then loaded it into my local MySQL DB; I never saw an issue other than some isolated data-type problems.
If you can connect to MySQL with the command-line utility, you should really look at using that. I think you will save more time using the command-line utility than attempting to split the file up.