I added a serial field to a node type that had a few hundred existing nodes, and the field was populated for all of those nodes without a problem. I then added a serial field to another node type with 1,500 nodes, but it only updated 518 of them.
In the first case, inspecting the DB shows the field in the node type's table properly filled in. There is also a second table correlating NID and SID, which is fine as well. In the second case, however, the NID/SID table was populated (but only with 518 records), and the corresponding field in the node type's table was filled with zeroes (not NULL), even for the 518 records that do appear in the NID/SID table.
I'd also recommend adding an option to skip populating existing nodes, and an option to start the sequence at a value other than zero.
Comment | File | Size | Author |
---|---|---|---|
#13 | serial.zip | 2.88 KB | dusov |
#8 | serial.zip | 11.96 KB | kirsh |
Comments
Comment #1
kirsh CreditAttribution: kirsh commented
You may have hit either the 'memory_limit' or the 'max_execution_time' limit.
Please try increasing these values in the settings.php file and then remove and add again the serial field.
Comment #2
druplicate CreditAttribution: druplicate commented
I'm already at 1GB PHP memory and 5,000 seconds execution time. I'm lucky I own the server 'cause ya can't do that on a rented box!
I only had to deal with 1,500 records. Anyone with a big site would not be able to use serial field at all because of this issue.
BTW, I got no error and it did not take very long to process the 518 records, so I suspect it did not hit any PHP limits. Try populating your test server with some fake records using the devel module and see what happens.
I recommend adding the ability to turn off auto-populating of existing nodes or allowing it to be done in batches.
In my case I really need to start at a base other than zero, like 1,000. Maybe I can do this with the CCK calculated field, but it's an added pain. Should be a standard feature.
EDIT: I take that back - I was running into the PHP memory limit. I then upped it to 3GB and it failed after an hour and 1,375 records. I will up it again to 4GB and rerun. This DB table has lots of fields, so it takes a while to save the form.
EDIT2: I was finally able to get all records updated with the serial field. This needs to be fixed, as the vast majority of people cannot give PHP the server memory needed to update this many fields. A similar problem exists for the Migrate module when importing large numbers of nodes. One solution people use is to copy the DB to a rented cloud server to do the job, and then move it back. A pain, but at least it won't bring your server to its knees. It also means freezing your live site so no serial fields get created until you move the DB back, since the serial field cannot incrementally update those new fields.
UPDATE: The Migrate module added code that senses when 80% of the PHP memory limit is reached and restarts the process with another batch. Can this ability be included here (read: steal the code)? For those of us with lots of nodes, this is a necessity.
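For reference, the Migrate-style guard boils down to comparing memory_get_usage() against the parsed value of ini_get('memory_limit'). A minimal sketch follows; the function name and the 80% threshold are illustrative, not part of either module's actual API:

```php
<?php

/**
 * Hedged sketch only: detect that the process is approaching the PHP
 * memory limit, the way the Migrate module does, so the batch can be
 * resumed in a fresh request.
 */
function _serial_memory_exceeded() {
  $limit = trim(ini_get('memory_limit'));
  if ($limit == '-1') {
    return FALSE;  // No memory limit configured.
  }
  // Convert shorthand notation such as "512M" or "1G" to bytes.
  $bytes = (float) $limit;
  switch (strtoupper(substr($limit, -1))) {
    case 'G': $bytes *= 1024;  // Fall through to multiply by 1024^3.
    case 'M': $bytes *= 1024;
    case 'K': $bytes *= 1024;
  }
  // Signal the caller to stop this pass once 80% of the limit is used.
  return memory_get_usage() >= 0.8 * $bytes;
}
```

The caller would check this between nodes and, when it returns TRUE, record its position and let the Batch API start a new request with fresh memory.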
Comment #3
asb CreditAttribution: asb commented
Subscribing.
Comment #4
beckyjohnson CreditAttribution: beckyjohnson commented
I wonder if I am having a similar issue. I have 120MB of PHP memory and no warning from Drupal about needing to increase it. However, I'm only populating 150 nodes at the most. Here is my post:
http://drupal.org/node/724644
Becky
Comment #5
Jonasvh CreditAttribution: Jonasvh commented
SOLUTION
Increase the PHP Execution time
Add to the .htaccess file:
php_value max_execution_time 200
200 is the time in seconds.
I just removed the serial field from my content type -> saved the content type -> added the serial field again -> saved the content type -> waited for the module to finish executing -> done.
See an example: www.geografica.be
Comment #7
kirsh CreditAttribution: kirsh commented
Comment #8
kirsh CreditAttribution: kirsh commented
Attached a new version that uses the Batch API to solve this issue. Feedback is welcome.
Note: update.php has to be run because this new version changes the schema.
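For readers unfamiliar with the approach, a Batch API conversion typically processes a small slice of nodes per HTTP request and tracks its position in a sandbox. A hedged Drupal 6 sketch follows; both function names and the helper _serial_assign_value() are illustrative and not necessarily what the attached patch uses:

```php
<?php

// Hedged sketch of a Batch API conversion; names are illustrative.
function serial_start_batch($field_name, $type_name) {
  $batch = array(
    'title' => t('Assigning serial values'),
    'operations' => array(
      array('_serial_batch_process', array($field_name, $type_name)),
    ),
  );
  batch_set($batch);
}

function _serial_batch_process($field_name, $type_name, &$context) {
  if (empty($context['sandbox'])) {
    $context['sandbox']['last_nid'] = 0;
    $context['sandbox']['done'] = 0;
    $context['sandbox']['total'] = db_result(db_query(
      "SELECT COUNT(nid) FROM {node} WHERE type = '%s'", $type_name));
  }
  // Handle only a small slice per HTTP request so neither memory_limit
  // nor max_execution_time is ever hit.
  $result = db_query_range(
    "SELECT nid FROM {node} WHERE type = '%s' AND nid > %d ORDER BY nid",
    $type_name, $context['sandbox']['last_nid'], 0, 50);
  $fetched = 0;
  while ($row = db_fetch_object($result)) {
    $node = node_load($row->nid);
    _serial_assign_value($node, $field_name);  // Illustrative helper.
    $context['sandbox']['last_nid'] = $row->nid;
    $context['sandbox']['done']++;
    $fetched++;
  }
  // Report progress to the Batch API; 1 means finished.
  if (!$fetched || !$context['sandbox']['total']) {
    $context['finished'] = 1;
  }
  else {
    $context['finished'] =
      $context['sandbox']['done'] / $context['sandbox']['total'];
  }
}
```

Because progress lives in $context['sandbox'] rather than in PHP variables held across the whole run, each request starts with fresh memory, which is what makes large node counts feasible.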
Comment #9
tcowin CreditAttribution: tcowin commented
This new version worked for me on a 6.14 installation. I had a content type with 3,500+ actual distinct nodes, and the public version of serial died after perhaps 500 or so (I forget now) -- it timed out after running for 30 seconds. I backed out of that, used this new code, and doing it in batch mode worked well. This seems like a serious enough issue that the module's front page should carry a caveat or disclaimer.
Comment #10
colan
Needs a D7 patch first.
Comment #11
lukus
@colan
I would be happy to create the Drupal 7 version. Would you like me to go ahead?
Comment #12
colan
@lukus: No need to ask my permission! Please assign it to yourself and go ahead. ;)
Comment #13
dusov CreditAttribution: dusov commented
For D7, based on the batch example at drupal.org/project/examples.
Just replace serial.inc.
Comment #14
colan
@dusov: Please provide code in patch format. Also, new features go into the latest dev branch.
Comment #15
MustangGB CreditAttribution: MustangGB commented
Just noting here (for searchability) that you can hackishly disable the automatic conversion with something like this:
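(The snippet itself did not survive in this copy of the thread; what follows is a hedged reconstruction of the kind of hack meant. The helper name is a guess, so check the module's serial.inc for the real call.)

```php
<?php
// In the module code that reacts to a serial field being attached to
// a content type, comment out the call that auto-populates
// pre-existing nodes (function name is hypothetical):
//
// _serial_init_old_nodes($type_name, $field_name);
//
// Existing nodes will then have no serial value at all until they are
// converted manually, e.g. by re-saving each one.
```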
Of course you must then manually convert them, as the comment indicates, or even better write a patch so we can all benefit from Batch API support.