I was attempting an export to get a model of the XML data format for import (I will be importing similar types of nodes).

After selecting XML export and leaving all the fields at their defaults, I clicked Export and got

Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 46 bytes) in [drupal dir]/includes/database.mysql.inc on line 136

I wasn't surprised - there's a LOT of data - but it seems like the module needs to at least be able to exit gracefully, and perhaps allow exporting in chunks when there's too much data for memory to handle.

(I know I could try bumping up memory, but I don't think everyone has that option, depending on their hosting service.)
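
For what it's worth, if your host does allow raising the limit, the usual workaround I've seen (the value below is only an example, not something the module requires) is either editing memory_limit in php.ini or overriding it per site in settings.php:

<?php
// In sites/default/settings.php - raise PHP's memory limit for this site only.
// 64M is just an example value; adjust to whatever your host permits.
ini_set('memory_limit', '64M');
?>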

Thanks!

Kristi

Comments

Jaza’s picture

Title: fatal error on large export » Better handling of large exports needed
Category: bug » task
Priority: Critical » Normal

Thanks for pointing this out, Kristi. However, this is not a bug, it's simply a limitation of the current UI (and, to some extent, the API) in not allowing for splitting up exports into chunks. Better handling of large exports is certainly something that I'll be looking into when I have the time.

neofactor’s picture

Could the export be saved to a text file instead of in a form field?
Incremental dumps to a text file may cut down on the memory issue.

This could be tricky because of the nature of XML, since you need to know all of the elements before you can start saving... so I guess this might not be a fix for that... but saving to a file would be a great option to have.
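
Something like this rough sketch is what I mean by incremental dumps - write each node's XML to the file as it is loaded, instead of building one big string in memory. All function names below are made up for illustration; they are not the module's actual API:

<?php
// Hypothetical sketch only: stream the export to a file one node at a time.
function example_export_nodes_to_file($filename) {
  $fp = fopen($filename, 'w');
  fwrite($fp, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<nodes>\n");
  $result = db_query("SELECT nid FROM {node} ORDER BY nid");
  while ($row = db_fetch_object($result)) {
    $node = node_load($row->nid);
    // example_node_to_xml() stands in for whatever serializes one node.
    fwrite($fp, example_node_to_xml($node));
    // Only one node is held in memory at a time.
    unset($node);
  }
  fwrite($fp, "</nodes>\n");
  fclose($fp);
}
?>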

jamesJonas’s picture

Thanks for the module.

Just installed. Performed several simple exports (book - small, and tax - medium). Worked great. Failed on a large export of pages (800-ish).

PHP.ini
memory_limit = 80M
Fatal error: Allowed memory size of 83886080 bytes exhausted (tried to allocate 79 bytes) in /var/www/html/drupal/modules/importexportapi/engines/importexportapi_db_get.inc on line 439

Increased memory_limit to 160M
Fatal error: Allowed memory size of 167772160 bytes exhausted (tried to allocate 256 bytes) in /var/www/html/drupal/includes/form.inc on line 42

Yep, that's 160M.

(1) You might take a look at using an XML index file, as Google has done with sitemaps. It uses a single XML index file that references a set of other XML files. You may also consider allowing gzip files for both import and export.

Google:
Using Sitemap index files (to group multiple sitemap files)
https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

Drupal Module Code:
gsitemaps.module: http://drupal.org/project/gsitemap
sitemap index patch: This seems to be working on my site.
http://drupal.org/node/79583

This type of strategy could work for large imports and exports.
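
For illustration only (element and file names below are made up, loosely mirroring Google's sitemap index format), the index could look like:

<?xml version="1.0" encoding="UTF-8"?>
<exportindex>
  <file>export-nodes-0001.xml.gz</file>
  <file>export-nodes-0002.xml.gz</file>
  <file>export-nodes-0003.xml.gz</file>
</exportindex>

and each chunk could be written gzipped with PHP's standard zlib functions, something like:

<?php
// Hypothetical: write one already-built XML chunk ($chunk_xml) as a .gz file.
$gz = gzopen('export-nodes-0001.xml.gz', 'w9');  // 'w9' = write, max compression
gzwrite($gz, $chunk_xml);
gzclose($gz);
?>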

sun’s picture

Title: Better handling of large exports needed » Memory usage & timeout issues / Better handling of large data needed
Category: task » feature

http://drupal.org/node/123380
http://drupal.org/node/130753
http://drupal.org/node/172372
http://drupal.org/node/141906
have been marked as duplicates of this issue.

Handling a lot of data is a feature, not a bug.

ssdhaliwal’s picture

Project: Import / Export API »
Version: master » 5.x-1.x-dev
Category: feature » bug
Priority: Normal » Critical

First, I was not sure whether my priority is the same as everyone else's on this issue, but to me it is critical :), because the site cannot install any more modules.

I am trying to install CCK so I can create views and custom content, and I am getting a timeout issue. Below is the log output; I updated the PHP.INI as shown below.

>> log
[03-Feb-2008 08:17:55] PHP Fatal error: Maximum execution time of 30 seconds exceeded in E:\Inetpub\www\mko\includes\file.inc on line 646

>> PHP.INI
max_execution_time = 12000 ; Maximum execution time of each script, in seconds
max_input_time = 12000 ; Maximum amount of time each script may spend parsing request data
memory_limit = 64M ; Maximum amount of memory a script may consume (8MB)

I tried to look for 30 in all of my configs and could not find one (all of my default timeouts are 60). The problem is that now I cannot even install any more modules - the site is frozen.

I am running WAMP5 (PHP 5.2.5.5, Apache 2.2.6.0) and have checked the entire drive for duplicate php.ini files (one in PHP and one in Apache\bin) - I made them the same... it is still not working.
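
In case it helps anyone hitting the same 30-second limit (PHP's default max_execution_time is 30, so my guess is that the php.ini I edited is not the one Apache actually loads), this is what I am using to check, and to override the limits per site - the values are just what I am testing with:

<?php
// Run phpinfo() on any test page and check "Loaded Configuration File"
// to see which php.ini Apache really reads.
phpinfo();

// Or override per site in sites/default/settings.php:
ini_set('max_execution_time', 300);
ini_set('memory_limit', '128M');
?>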

sun’s picture

Project: » Import / Export API
Version: 5.x-1.x-dev » master
Category: bug » feature
Priority: Critical » Normal

@ssdhaliwal: Please do not move issues to other queues unless a follow-up clearly indicates that an issue is assigned to the wrong module. My previous follow-up listed several issues from Import / Export API's queue that have been marked as duplicates of this issue, so this issue is important for Import / Export API's users. Please create a new bug report in the issue queue of Module Installer.

xjm’s picture

Any action on this? In the meantime, can anyone recommend a workaround or hack for exporting nodes from content types that include a large number of nodes?

Anonymous’s picture

Version: master » 6.x-1.x-dev
Category: feature » task

@xjm: Nothing yet. Perhaps batch processing with a specified number of rows per pass, appending to the export file in each batch cycle, would work. Do you care to look at creating a patch?
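
Roughly what I have in mind, as an untested sketch using core's Batch API (all "example_" names are placeholders, not the module's real functions):

<?php
// Untested sketch: export nodes in small batches, appending each pass to one
// file so no single request has to hold the whole export in memory.
function example_export_batch_start($filename) {
  file_put_contents($filename, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<nodes>\n");
  $batch = array(
    'title' => t('Exporting nodes'),
    'operations' => array(
      array('example_export_batch_op', array($filename)),
    ),
    'finished' => 'example_export_batch_finished',
  );
  batch_set($batch);
}

function example_export_batch_op($filename, &$context) {
  if (!isset($context['sandbox']['progress'])) {
    $context['sandbox']['progress'] = 0;
    $context['sandbox']['max'] = db_result(db_query('SELECT COUNT(nid) FROM {node}'));
  }
  $limit = 50; // rows per pass; keeps per-request memory small
  $result = db_query_range('SELECT nid FROM {node} ORDER BY nid',
    $context['sandbox']['progress'], $limit);
  while ($row = db_fetch_object($result)) {
    $node = node_load($row->nid);
    // example_node_to_xml() stands in for whatever serializes one node.
    file_put_contents($filename, example_node_to_xml($node), FILE_APPEND);
    $context['sandbox']['progress']++;
  }
  if ($context['sandbox']['progress'] >= $context['sandbox']['max']) {
    file_put_contents($filename, "</nodes>\n", FILE_APPEND);
    $context['finished'] = 1;
  }
  else {
    $context['finished'] = $context['sandbox']['progress'] / $context['sandbox']['max'];
  }
}

function example_export_batch_finished($success, $results, $operations) {
  drupal_set_message(t('Export finished.'));
}
?>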

DamienMcKenna’s picture

Could it also be the memory leak in PHP where it doesn't flush correctly when using stdClass (#664940: investigate general php memory consumption fixes)?