I ran into a problem at a previous job: importing a large SQL file, about 2 GB, containing both the table structure and the data. At the time I tried the command line, Workbench, and other tools, but they all failed with various errors. I was in a hurry, so in the end I had no choice but to manually delete a lot of records and then import the rest.
I'd like to ask: when you import large SQL files, what method do you use? Either a file-based or a command-line approach is fine.
For 6 GB SQL files, I directly use source in mysql, which takes about half an hour.
Save the world via command line
mysql -h hostname -u username -p databasename < sqlfile.sql
Try splitting the file into several parts
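One way to split the dump is with the standard `split` utility. A minimal sketch, assuming the dump keeps one statement per line (multi-line CREATE TABLE statements or extended INSERTs that span a split boundary would need manual care); the file names here are hypothetical:

```shell
# Create a small dummy dump just to demonstrate (hypothetical file name).
seq 1 250 | sed 's/.*/INSERT INTO t VALUES (&);/' > bigdump.sql

# Split into chunks of 100 lines each; pieces are named chunk_aa, chunk_ab, ...
split -l 100 bigdump.sql chunk_

# Each chunk can then be imported in order, e.g.:
# for f in chunk_*; do mysql -h hostname -u username -p databasename < "$f"; done
ls chunk_* | wc -l
```

Splitting keeps each individual import small, so a single bad statement only forces you to re-run one chunk instead of the whole 2 GB file.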
I handled a similar case in the first half of the year, and the SQL import kept throwing errors.
The final solution was to convert the data into an XML or JSON file, and then import it with Navicat.
I put the SQL file on the server and run source there. As long as there are no errors in the SQL, it is fast every time.
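The server-side source approach can also be sped up by relaxing integrity checks during the bulk load. A hedged sketch of such a session, assuming a mysql client on the server and a hypothetical file path; remember to restore the settings afterwards:

```sql
-- Relax checks for bulk loading (session-level settings).
SET foreign_key_checks = 0;
SET unique_checks = 0;
SET autocommit = 0;

-- /tmp/sqlfile.sql is a hypothetical path to the dump on the server.
source /tmp/sqlfile.sql

-- Commit the batch and restore the defaults.
COMMIT;
SET autocommit = 1;
SET unique_checks = 1;
SET foreign_key_checks = 1;
```

Running source on the server also avoids pushing gigabytes over a client connection, which is often where timeouts and packet-size errors come from.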