I have been using Feeds on a site for the last few months and have set up 33 incoming feeds, both CSV and XML.

When I run these through the UI they run perfectly and complete as expected - creating/updating as required.

I have them scheduled to run automatically once a day through cron. Cron is called twice an hour.

When everything is reset, some of the feeds complete, but around nine hang and never finish, blocking any future runs of feeds in the queue.

The feeds_source table contains serialized values for state and fetcher_result for the stuck feeds (the fetcher_result field displays as a binary/image blob, so its contents cannot be read directly).

There is nothing in the error log, the feeds log, or watchdog to indicate why they failed.

When I reset the fields by running :-

UPDATE feeds_source SET state = 'b:0;';
UPDATE feeds_source SET fetcher_result = 'b:0;';

The feeds begin again and then eventually stall again.
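For what it's worth, 'b:0;' is PHP's serialized representation of FALSE, so the reset above simply clears any saved batch state. Before blanket-resetting everything, a query like the following (a sketch assuming the standard Drupal 7 feeds_source schema; column names may differ on your install) can show which sources are actually stuck:

```sql
-- List sources that still hold serialized batch state, i.e. did not finish.
-- 'b:0;' is serialize(FALSE), meaning "no state saved".
SELECT id, feed_nid, imported
FROM feeds_source
WHERE state <> 'b:0;' OR fetcher_result <> 'b:0;';
```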

Can anyone help shed light onto what the problem might be? Are there batch settings I could play with?


rudyard55’s picture

Similar issue. Subscribe

bibo’s picture

This issue might be related to #1553190: data not import during cron run, which has a patch (#7). You should probably try it.

geefin’s picture


Thanks, I already had that patch in place after searching (I think!) exhaustively. I am still having the issue: I get no error messages in the PHP error log, but running cron manually and checking watchdog I see these entries (in order) :-

Warning: fgets(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 88 of /var/www/vhosts/mydomain.com/httpdocs/modules/feeds/plugins/FeedsFetcher.inc).

Warning: fclose(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 89 of /var/www/vhosts/mydomain.com/httpdocs/modules/feeds/plugins/FeedsFetcher.inc).

Warning: fopen(public://feeds/FeedsHTTPFetcherResult1354905901) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in ParserCSVIterator->__construct() (line 19 of /var/www/vhosts/mydomain.com/httpdocs/modules/feeds/libraries/ParserCSV.inc).

Warning: fopen(public://feeds/FeedsHTTPFetcherResult1354905901) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in FeedsFetcherResult->sanitizeFile() (line 87 of /var/www/vhosts/mydomain.com/httpdocs/modules/feeds/plugins/FeedsFetcher.inc).

Recoverable fatal error: Argument 2 passed to feeds_tamper_feeds_after_parse() must be an instance of FeedsParserResult, null given in feeds_tamper_feeds_after_parse() (line 19 of /var/www/vhosts/mydomain.com/httpdocs/modules/feeds_tamper/feeds_tamper.module).

Cron errors out and the feeds are hung until I clear the feeds_source fields mentioned earlier.

Running the feeds manually through the /import UI, every single one completes as expected without error; running them scheduled with cron, a few get through and then they hang :( Really would like to get to the bottom of this one.


geefin’s picture

I found this issue :-

It mentions a 'life' for temporary feed files: Drupal core gives them a lifespan of six hours. So if your feed is batched and very large, and the import takes more than six hours to finish (say, for example, you are running cron once a day, or the import needs many hourly runs), the temporary file may no longer exist when the process comes back to resume it.
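For context, this is Drupal 7 core's temporary-file cleanup during cron, paraphrased below as a sketch (not the exact core code): files whose status is not permanent are deleted once they are older than DRUPAL_MAXIMUM_TEMP_FILE_AGE, which is 21600 seconds, i.e. six hours.

```php
<?php
// Sketch of the relevant part of Drupal 7's system_cron() (paraphrased).
// DRUPAL_MAXIMUM_TEMP_FILE_AGE is 21600 seconds = 6 hours.
$result = db_query(
  'SELECT fid FROM {file_managed}
   WHERE status <> :permanent AND timestamp < :timestamp',
  array(
    ':permanent' => FILE_STATUS_PERMANENT,
    ':timestamp' => REQUEST_TIME - DRUPAL_MAXIMUM_TEMP_FILE_AGE,
  )
);
// Each matching file -- including Feeds' cached fetcher result -- is then
// deleted, which is why a long-running batched import can lose its source
// file between cron runs.
```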

There is a simple patch here :- http://drupal.org/node/1029102#comment-6565716

Implementing this appears, at first glance, to have freed up the scheduling again. Will see how it goes over the next 48 hours and report back...

geefin’s picture

Status: Active » Closed (fixed)

Feeds haven't been hanging since that patch was implemented.

So you can either increase the frequency of cron or implement the patch mentioned above.

hkovacs’s picture

I encountered the same errors and used the import/unlock tab to release the feed importer. After that the importer resumed running on cron without error.

Also, the Feeds import preview module was very helpful.