Here is what I would do:
1] Truncate the log tables.
Create a script and put it in shell/housekeeping.php
<?php
doSomeHouseKeeping();

function doSomeHouseKeeping()
{
    // Read the DB credentials from Magento's local.xml
    $xml = simplexml_load_file(dirname(__FILE__) . '/../app/etc/local.xml', NULL, LIBXML_NOCDATA);
    if (is_object($xml)) {
        $db['host'] = $xml->global->resources->default_setup->connection->host;
        $db['name'] = $xml->global->resources->default_setup->connection->dbname;
        $db['user'] = $xml->global->resources->default_setup->connection->username;
        $db['pass'] = $xml->global->resources->default_setup->connection->password;
        $db['pref'] = $xml->global->resources->db->table_prefix;

        // Log, index and report tables that are safe to truncate
        $tables = array(
            'dataflow_batch_export',
            'dataflow_batch_import',
            'log_customer',
            'log_quote',
            'log_summary',
            'log_summary_type',
            'log_url',
            'log_url_info',
            'log_visitor',
            'log_visitor_info',
            'log_visitor_online',
            'index_event',
            'report_event',
            'report_viewed_product_index',
            'report_compared_product_index',
            'catalog_compare_item',
            'catalogindex_aggregation',
            'catalogindex_aggregation_tag',
            'catalogindex_aggregation_to_tag'
        );

        mysql_connect($db['host'], $db['user'], $db['pass']) or die(mysql_error());
        mysql_select_db($db['name']) or die(mysql_error());

        foreach ($tables as $table) {
            @mysql_query('TRUNCATE `' . $db['pref'] . $table . '`');
        }
    } else {
        exit('Unable to load local.xml file');
    }
}
And run it as:
php -f shell/housekeeping.php
Note: You could also use the n98-magerun tool for this, but I wouldn't recommend it in a production environment.
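Before running anything destructive, it can be worth reviewing the exact statements the script will issue. A small shell dry-run sketch (the mage_ prefix and the shortened table list here are just placeholders; in the real script the prefix comes from local.xml):

```shell
# Dry run: print the TRUNCATE statements the housekeeping script would
# issue, so you can eyeball them (or pipe them into mysql yourself).
PREFIX="mage_"   # table prefix from app/etc/local.xml; often empty
for t in log_customer log_visitor log_url; do
    printf 'TRUNCATE `%s%s`;\n' "$PREFIX" "$t"
done
# prints: TRUNCATE `mage_log_customer`; (and one line per table)
```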
2] Dump the mysql using gzip compression:
mysqldump -u [user] -p[pass] [database] | gzip > [dump-file-name].sql.gz
3] Import the dumped file to your destination server:
gunzip < [dump-file-name].sql.gz | mysql -u [user] -p[pass] [database]
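The gunzip pipe above can be sanity-checked without a MySQL server by substituting `cat` for the mysql client (demo-dump.sql is just a throwaway file name):

```shell
# Round-trip a tiny fake dump through gzip, then stream it through a
# pipe exactly as in step 3 -- `cat` stands in for the mysql client.
printf 'SELECT 1;\n' > demo-dump.sql
gzip -f demo-dump.sql              # produces demo-dump.sql.gz
gunzip < demo-dump.sql.gz | cat    # prints: SELECT 1;
rm -f demo-dump.sql.gz
```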
To see which tables are taking up the most space:

mysql -e "SELECT CONCAT(TABLE_SCHEMA, '.', TABLE_NAME) AS tbl, TABLE_ROWS AS nrows, ((DATA_LENGTH + INDEX_LENGTH)/1024/1024) AS MiB FROM TABLES ORDER BY MiB DESC LIMIT 10;" information_schema

The sales_quote tables can also fill up if it's a very active store: github.com/fbrnc/Aoe_QuoteCleaner