Hi,
I'm trying to optimise my setup of a Drupal Batch process.
I am reading in thousands of rows of data from an Excel spreadsheet, across multiple worksheets, and creating nodes from the rows. My import functions work fine, and I can create a batch, but at present I am creating a batch operation for each row. This means the batch is very slow (because Drupal is effectively bootstrapped for every row).
What I can't seem to figure out is a way to set up my batch to deal with 'chunks' of rows rather than individual rows.
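Roughly, what I think I want is one batch operation per chunk of rows instead of one per row. Below is an untested sketch of the idea (the names governbim_coordinate_rows(), governbim_coordinate_chunk() and governbim_coordinate_batch_finished(), and the chunk size of 50, are placeholders I've made up, not existing code):

function governbim_coordinate_batch_setup($file) {
  // Read the worksheet once up front, then split the rows into chunks.
  $rows = governbim_coordinate_rows($file); // placeholder: would return $result['Coordinate']
  $operations = array();
  foreach (array_chunk($rows, 50) as $chunk) {
    // One batch operation per chunk of 50 rows rather than one per row.
    $operations[] = array('governbim_coordinate_chunk', array($chunk));
  }
  $batch = array(
    'operations' => $operations,
    'finished' => 'governbim_coordinate_batch_finished',
    'title' => t('Importing coordinates'),
  );
  batch_set($batch);
}

function governbim_coordinate_chunk($chunk, &$context) {
  foreach ($chunk as $coordinate) {
    // Create or update the node for this row, as governbim_coordinate() does now.
  }
  $context['message'] = t('Processed @count rows', array('@count' => count($chunk)));
}

What I can't see is how to fit the PHPExcel read into that structure without loading the whole file in every operation, or whether the Batch API has a better way of handling this.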
An example of one of my import functions (there are 18 of them in total) is:
function governbim_coordinate($value) {
  module_load_include('inc', 'phpexcel');
  global $user;
  $project = $_SESSION['governbim']['projnid'];
  // Load the required worksheet from the Excel file passed to the function as $value.
  $result = phpexcel_import($value, TRUE, TRUE, array('setLoadSheetsOnly' => array('Coordinate')));
  // Define some common values for our entity.
  $values = array(
    'type' => 'coordinate',
    'uid' => $user->uid,
    'status' => 1,
    'comment' => 0,
    'promote' => 0,
  );
  // Step through each row in the worksheet.
  foreach ($result['Coordinate'] as $coordinate) {
    $title = $coordinate['Name'];
    // Check whether this is a new row or we are updating an existing entity.
    $action = governbim_node_actions_node_check('assembly', $title);
    switch ($action['op']) {