My CSV file is quite large, and the import stops at 21% after about an hour. I went to the product list and saw that exactly 99999 products had been imported (?).

I got no errors, no logs, nothing: just the dreaded Ajax error page with the "proceed to the error page" link, which leads back to the import page with the grey/disabled "Importing - (21%)" button.

I also tried with the alpha6 (dev) version.

The issue at http://drupal.org/node/1029102 seems unrelated to this one, as my file is already on the server. A database "hack" was suggested there, but the import might then just stop at 21% again after all the time it takes to get there.


Dennis Cohn

I also have problems with a large .csv:

Warning: fopen(public://feeds/FeedsHTTPFetcherResult1351426570) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in ParserCSVIterator->__construct() (line 21 of /home/cohnreis/domains/***/public_html/sites/all/modules/feeds/libraries/ParserCSV.inc).

Dennis Cohn

I found this patch: http://drupal.org/node/1801680#comment-6551192
I've applied it, and the import now works without any errors.

ari-meetai

That seems to be the same as http://drupal.org/node/1029102, which is unrelated to the problem I found (the file is already on the server).

ari-meetai

Issue summary: View changes

English revised

MegaChriz

Issue summary: View changes
Status: Active » Closed (duplicate)

This is in fact a duplicate of #1029102: Importing Large CSV document (downloaded and processed in the background). When the HTTP Fetcher is used and the import needs multiple cron runs to complete, Feeds makes a copy of the file in public://feeds and names it something like FeedsHTTPFetcherResult12345678. As long as the import is not completed, Feeds reads from this copy, which prevents import issues if the original file changes in the meantime.

But that fetcher result file is marked as a temporary file, and by default Drupal deletes temporary files after six hours. If the import hasn't completed by then, the file being read from gets deleted and the import process breaks.
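For illustration, one way this kind of cleanup can be prevented in Drupal 7 is to mark the fetcher result file as permanent before it is saved, so that system_cron() skips it when purging temporary files older than DRUPAL_MAXIMUM_TEMP_FILE_AGE (six hours). This is only a sketch of a workaround, not the actual fix from the patch linked above; the module name "feeds_keep" and the URI check are assumptions.

```php
<?php

/**
 * Implements hook_file_presave().
 *
 * Hypothetical workaround sketch: if the file being saved is a Feeds
 * HTTP fetcher result in public://feeds, mark it permanent so Drupal's
 * cron does not delete it mid-import. The "feeds_keep" module name is
 * made up for this example.
 */
function feeds_keep_file_presave($file) {
  if (strpos($file->uri, 'public://feeds/FeedsHTTPFetcherResult') === 0) {
    // FILE_STATUS_PERMANENT exempts the file from temporary-file cleanup.
    $file->status = FILE_STATUS_PERMANENT;
  }
}
```

The trade-off is that permanent files are never cleaned up automatically, so something (e.g. a small hook_cron() implementation) would still need to delete the fetcher result once the import finishes.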