FeedsProcessor::clean() doesn't process deletions in batches, even when a large number of items needs to be deleted. This caused deadlocks on one of our servers, and I found that batching had already been suggested for this method. I implemented a working version using the Batch API, and the import now completes correctly.
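The idea behind the patch can be sketched like this: instead of deleting every stale item in one go, clean() removes one chunk per invocation and reports fractional progress so Feeds keeps re-running the step until the list is empty. This is only an illustrative sketch, not the actual patch; member names such as `$state->removeList`, `$state->total`, and the `$state->progress()` call are modeled on the Feeds 7.x-2.x API, and `entity_delete()` is a hypothetical deletion helper.

```php
/**
 * Sketch of a batched clean(): delete at most $limit items per invocation
 * and report progress so Feeds re-invokes this step until everything
 * has been removed. Illustrative only; see the attached patch for the
 * real implementation.
 */
protected function clean(FeedsState $state) {
  $limit = variable_get('feeds_process_limit', FEEDS_PROCESS_LIMIT);

  if (!isset($state->total)) {
    // First pass: remember how many items need to be removed in total.
    $state->total = count($state->removeList);
  }

  // Take one chunk off the list and delete it.
  $chunk = array_splice($state->removeList, 0, $limit);
  foreach ($chunk as $entity_id) {
    // Hypothetical helper; the real code would delegate to the processor's
    // own entity deletion logic.
    entity_delete($this->entityType(), $entity_id);
  }

  // Report progress as a fraction of the total; reaching 1.0
  // (FEEDS_BATCH_COMPLETE) ends the batch loop.
  $state->progress($state->total, $state->total - count($state->removeList));
}
```

Reporting progress as a computed fraction of the remaining list, rather than a fixed constant, is what lets the Batch API decide when the step is finished.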
Comment | File | Size | Author |
---|---|---|---|
#2 | feeds_clean_batch_2731437-1.patch | 1.58 KB | ZeiP |
Comments
Comment #2
ZeiP CreditAttribution: ZeiP at Avoltus Oy commented

Attached is a working patch, though it still requires some work to fit in properly.
Comment #3
MegaChriz CreditAttribution: MegaChriz as a volunteer commented

Great! Thanks for working on this. I know, though, that this issue has been reported before (but without patches, if I remember well). If you can find the other issue, you could post your patch there and close this one as a duplicate.

I've yet to try out the patch, but I wondered why you hard-coded `$parser->progress` to be 0.53?

It would be great to have an automated test for this. Also, batch deleting could be implemented for deleting items from the 'Delete items' page (import/x/delete).
Comment #4
MegaChriz CreditAttribution: MegaChriz as a volunteer commented

Duplicate of #2571767: Batch processing for FeedsProcessor::clean() (unpublish/delete non-existent nodes times out with large number of records).