I added a serial field to a node type that had a few hundred existing nodes, and the field was populated on all of those nodes with no problem. I then added a serial field to another node type with 1,500 nodes, but it only updated 518 of them.

In the first case, inspection of the DB shows the field in the node type's DB table properly filled in. There is also a second table correlating NID and SID, which is fine as well. In the second case, however, the NID/SID table was filled out (but only to 518 records), and the corresponding field in the node type's DB table was filled with zeroes (not NULL), even for the 518 records that do appear in the NID/SID table.

I'd also recommend adding an option to not populate existing nodes, and an option to start at a value other than zero.

Attachments:
#13: serial.zip (2.88 KB) by dusov
#8: serial.zip (11.96 KB) by kirsh

Comments

kirsh’s picture

You might have hit either the 'memory_limit' or the 'max_execution_time' limit.
Please try increasing these values in the settings.php file, then remove and re-add the serial field.
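
For reference, a minimal sketch of what that might look like in settings.php (the values here are only examples; pick what your server can afford):

// In sites/default/settings.php. Example values only.
ini_set('memory_limit', '512M');
ini_set('max_execution_time', 300);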

druplicate’s picture

Status: Closed (fixed) » Active

I'm already at 1GB PHP memory and 5,000 seconds execution time. I'm lucky I own the server 'cause ya can't do that on a rented box!

I only had to deal with 1,500 records. Anyone with a big site would not be able to use the serial field at all because of this issue.

BTW, I got no error, and it did not take very long to process the 518 records, so I suspect it did not hit any PHP limits. Try populating your test server with some fake records using the devel module and see what happens.

I recommend adding the ability to turn off auto-populating of existing nodes or allowing it to be done in batches.

In my case I really need to start at a base other than zero, like 1,000. Maybe I can do this with the CCK calculated field, but it's an added pain. Should be a standard feature.

EDIT: I take that back - I was running into the PHP memory limit. I then upped it to 3GB, and it failed after an hour and 1,375 records. I will up it again to 4GB and rerun. This DB table has lots of fields, so it takes a while to save the form.

EDIT2: I was finally able to get all records updated with the serial field. This needs to be fixed, as the vast majority of people cannot get the server PHP memory resources needed to update this many nodes. A similar problem exists for the Migrate module when importing large numbers of nodes. One solution people use is to copy the DB to a rented cloud server, do the job there, and then move it back. A pain, but at least it won't bring your server to its knees. This also means freezing your live site so no serial fields get created until you move the DB back, since the serial field cannot incrementally update those new nodes.

UPDATE: The Migrate module added code that senses when 80% of the PHP memory limit has been reached and restarts the process with another batch. Can this ability be included here (read: steal the code)? For those of us with lots of nodes, this is a necessity.
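
For anyone searching later, that check boils down to something like the following (a rough sketch; the function name is made up here, and this is not the Migrate module's actual code):

function _serial_memory_limit_reached() {
  // parse_size() is Drupal's helper for strings like '128M'.
  $limit = parse_size(ini_get('memory_limit'));
  // Hand off to a fresh request once usage passes 80% of the limit.
  return $limit > 0 && memory_get_usage() >= 0.8 * $limit;
}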

asb’s picture

subscribing

beckyjohnson’s picture

I wonder if I am having a similar issue. I have 120MB of PHP memory and got no warning from Drupal about needing to increase it. However, I'm only populating 150 nodes at the most. Here is my post:
http://drupal.org/node/724644

Becky

Jonasvh’s picture

Status: Active » Fixed

SOLUTION
Increase the PHP execution time.

Add this to the .htaccess file:

php_value max_execution_time 200

200 is the time in seconds.

I just removed the serial field from my content type -> saved the content type -> added the serial field again -> saved the content type -> waited so that the module could finish -> done.

See an example: www.geografica.be

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.

kirsh’s picture

Status: Active » Closed (works as designed)
kirsh’s picture

Version: 6.x-1.0-rc1 » 6.x-1.0
Assigned: Unassigned » kirsh
Status: Closed (works as designed) » Needs review
File: serial.zip (11.96 KB)

Attached a new version that uses the batch API to solve this issue. Feedback is welcome.

Note: update.php has to be run because this new version changes the schema.
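
For reviewers without the attachment handy, the general shape of a Batch API conversion in Drupal 6 is roughly the following (an illustrative sketch; the function names and the per-pass limit are placeholders, not a verbatim excerpt of the attached code):

function _serial_start_batch($node_type, $field_name) {
  batch_set(array(
    'title' => t('Assigning serial values to existing nodes'),
    'operations' => array(
      array('_serial_batch_worker', array($node_type, $field_name)),
    ),
  ));
}

function _serial_batch_worker($node_type, $field_name, &$context) {
  if (!isset($context['sandbox']['last_nid'])) {
    $context['sandbox']['last_nid'] = 0;
    $context['sandbox']['done'] = 0;
    $context['sandbox']['total'] = db_result(db_query(
      "SELECT COUNT(nid) FROM {node} WHERE type = '%s'", $node_type));
  }
  // Process a small slice per HTTP request to stay under the PHP limits.
  $result = db_query_range(
    "SELECT nid FROM {node} WHERE type = '%s' AND nid > %d ORDER BY nid",
    $node_type, $context['sandbox']['last_nid'], 0, 50);
  $count = 0;
  while ($row = db_fetch_object($result)) {
    // Assign the next serial value to node $row->nid here.
    $context['sandbox']['last_nid'] = $row->nid;
    $context['sandbox']['done']++;
    $count++;
  }
  // An empty slice means every node has been handled.
  $context['finished'] = ($count == 0 || !$context['sandbox']['total'])
    ? 1
    : $context['sandbox']['done'] / $context['sandbox']['total'];
}

Because the Batch API spreads the work across many HTTP requests, each pass only has to hold a 50-node slice in memory instead of all 1,500 nodes at once.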

tcowin’s picture

This new version worked for me on a Drupal 6.14 installation. I had a content type with 3,500+ distinct nodes, and the public version of Serial died after perhaps 500 or so (I forget now); it timed out after running for 30 seconds. I backed out of that, used this new code, and doing it in batch mode worked well. This seems like a serious enough issue that the module's front page should carry a caveat or disclaimer.

colan’s picture

Version: 6.x-1.0 » 7.x-1.x-dev
Assigned: kirsh » Unassigned
Status: Needs review » Needs work

Needs a D7 patch first.

lukus’s picture

@colan

I would be happy to create the Drupal 7 version. Would you like me to go ahead?

colan’s picture

Title: Cannot populate large numbers of existing nodes » Add support for the Batch API
Category: bug » feature

@lukus: No need to ask my permission! Please assign it to yourself and go ahead. ;)

dusov’s picture

Version: 7.x-1.x-dev » 7.x-1.2
File: serial.zip (2.88 KB)

For D7, based on the batch example from drupal.org/project/examples.
Just replace serial.inc.
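
Roughly, the worker follows the pattern from the batch example, rewritten against Drupal 7's database API (sketched below with illustrative names; see the attached serial.inc for the real thing):

function _serial_batch_worker_d7($node_type, $field_name, &$context) {
  if (empty($context['sandbox'])) {
    $context['sandbox']['last_nid'] = 0;
    $context['sandbox']['done'] = 0;
    $context['sandbox']['total'] = db_query(
      'SELECT COUNT(nid) FROM {node} WHERE type = :type',
      array(':type' => $node_type))->fetchField();
  }
  // Grab the next slice of 50 node IDs past the last one processed.
  $nids = db_query_range(
    'SELECT nid FROM {node} WHERE type = :type AND nid > :last ORDER BY nid',
    0, 50,
    array(':type' => $node_type,
          ':last' => $context['sandbox']['last_nid']))->fetchCol();
  foreach ($nids as $nid) {
    // Assign the next serial value to node $nid here.
    $context['sandbox']['last_nid'] = $nid;
    $context['sandbox']['done']++;
  }
  // An empty slice means the conversion is complete.
  $context['finished'] = empty($nids)
    ? 1
    : $context['sandbox']['done'] / max(1, $context['sandbox']['total']);
}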

colan’s picture

Version: 7.x-1.2 » 7.x-1.x-dev

@dusov: Please provide code in patch format. Also, new features go into the latest dev branch.

MustangGB’s picture

Issue summary: View changes

Just noting here (for searchability) that you can hackishly disable the automatic conversion with something like this:

 function _serial_init_old_entities($entity_type, $bundle, $field_name) {
+  // @todo This function times out when there are too many nodes to
+  // convert, so convert them manually instead.
+  return 0;

Of course you must then manually convert them, as the comment indicates, or better yet write a patch so we can all benefit from Batch API support.