Last updated June 4, 2008. Created on June 4, 2008.
Edited by renaud.richardet.

If you have to migrate a large SQL batch file from your old server, chances are that the import in phpMyAdmin will time out and only part of your database will be migrated. The script below splits an SQL batch file into smaller chunks that can each be imported separately. It is intended to be run on your local web server (or from the command line).

<?php
// Split a large SQL dump into smaller files, so that each chunk can be
// imported without hitting the phpMyAdmin script timeout.

$splitEvery = 840000;              // approximate chunk size in bytes
$base = "/home/ren/dev/ms/";       // directory containing the dump
$ext  = '.sql';
$file_name = $base . 'db' . $ext;  // input file

// open the first output file
$outCnt = 1;
$out = fopen($base . 'db_' . $outCnt . $ext, "w");

// read the input file line by line
$strLen = 0;
$fp = fopen($file_name, "r");
while ($line = fgets($fp)) {

  $strLen += strlen($line);

  // once the current chunk exceeds $splitEvery bytes, start a new file
  // at the next empty line, so a statement is never split in the middle
  if ($strLen > $splitEvery && preg_match("/(^[\r\n]*|[\r\n]+)[\s\t]*[\r\n]+/", $line)) {
    print 'empty line after ' . $strLen . '<br>';
    $strLen = 0;

    // new output file
    fclose($out);
    $outCnt++;
    $out = fopen($base . 'db_' . $outCnt . $ext, "w");
  }

  // write to the current output file
  fwrite($out, $line);
}

fclose($fp);
fclose($out);

print "done";
?>


Comments

jdwalling:

How to prevent script timeout in phpMyAdmin
http://www.deuxcode.com/articles/091/how-to-prevent-script-timeout-in-ph...

BigDump: Staggered MySQL Dump Importer
http://www.ozerov.de/bigdump.php