I spent the entire day trying to import a 1GB geospatial raw text file into various databases so I could trim it down to what I actually need. It seems that, when it comes to files this large, there's no easy solution. The approach that worked best was MySQL's command-line import tool, mysqlimport:

c:\xampp\mysql\bin>mysqlimport --fields-terminated-by="\t" --user=webadmin --password=pw DB_NAME TEXT_FILE_NAME.txt

Here, TEXT_FILE_NAME must match the name of the target table, and the TEXT_FILE_NAME.txt file must be placed in the database's data directory: \mysql\data\DB_NAME
I managed to import 7 million rows, each having 26 fields, in about a minute.
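If you'd rather stay inside the mysql client, a roughly equivalent statement is LOAD DATA INFILE. This is just a sketch, assuming the target table already exists and the file is somewhere the MySQL server is allowed to read (the column names and schema are up to you):

LOAD DATA INFILE 'TEXT_FILE_NAME.txt'
INTO TABLE TEXT_FILE_NAME
FIELDS TERMINATED BY '\t';

Like mysqlimport, this loads one row per line, so a tab-delimited file with one record per line maps straight onto the table's columns.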

Additionally, to export the database afterwards, use mysqldump:

c:\xampp\mysql\bin>mysqldump -u webadmin -pPassword DB_NAME > DUMP_FILE_NAME.sql
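And the reverse direction, restoring that dump into a database with the mysql client. A sketch, assuming the target database (here reusing DB_NAME) already exists:

c:\xampp\mysql\bin>mysql -u webadmin -pPassword DB_NAME < DUMP_FILE_NAME.sql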
