I am in the process of building a website, so it doesn't have any content yet.

I have a taxonomy vocabulary consisting of approximately 50,000 terms, nested about eight levels deep.

As far as I was informed, Drupal can handle any number of terms in a vocabulary, but to my surprise the site got slower the more terms I added, until I finally got the error: "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 262144 bytes) in /usr/www/users/magicq/modules/taxonomy/taxonomy.module on line 866"

I know that I can increase the memory limit to make it work; I tested this, and it worked when I raised the limit to 256 MB. But that is not a solution: the site is still very slow, and no website should require a 256 MB limit to run.

I have not implemented anything fancy yet, such as taxonomy menus; in fact, the terms are not accessible on the site at all, so there should be no reason for the taxonomy to have any influence on the site. But for some reason, it does.

I installed the “Taxonomy Manager” module to manage the vocabulary.

I use a RocketTheme theme for the website. I am not sure whether the theme might be the culprit, loading the taxonomy vocabulary for no reason; your advice would be appreciated.

After doing some research, I found that this problem might be caused by the taxonomy_get_tree() function. I am aware that the system will be slow if I use a drop-down for selecting terms when creating nodes, which is why I installed the "Hierarchical Select" module to reduce the load.

Is there a way to prevent the taxonomy_get_tree() function from running? Alternatively, can the function be tweaked to run faster or use less memory?

Will it help to cache the taxonomy?

What is my best play?

Any suggestions / pointers will be greatly appreciated.

MySQL database 5.0.51a
PHP 5.2.6-1+lenny9
PHP memory limit 128M
PHP register globals Disabled
Unicode library PHP Mbstring Extension
Web server Apache/2.2.9 (Debian) mod_ssl/2.2.9 OpenSSL/0.9.8g mod_perl/2.0.4 Perl/v5.10.0

Comments

cafewebmaster’s picture

Priority: Normal » Critical

We have the same problem even with 1GB ram.

It appears under many different paths, such as "/admin/content/node" and "admin/settings/performance" -> clear cache.

Alessandraaa’s picture

Same issue here

subscribing

jem500’s picture

Same issue.

Subscribing.

Eric_A’s picture

Is there a way to prevent the “taxonomy_get_tree()” function from running? Alternatively, tweak the function to run faster / less memory intensive?

Core D6 taxonomy does provide a mechanism. You may or may not be better off with D7, where on the one hand terms were turned into full-blown entities, but on the other hand many taxonomy performance issues were fixed. See for example #693362: taxonomy_form_all() is dangerous.

This is from D6.

// taxonomy_get_tree and taxonomy_get_parents may contain large numbers of
// items so we check for taxonomy_override_selector before loading the
// full vocabulary. Contrib modules can then intercept before
// hook_form_alter to provide scalable alternatives.

See http://api.drupal.org/api/drupal/modules--taxonomy--taxonomy.admin.inc/f...
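To illustrate the mechanism: a contrib module could set that variable and then swap in a lighter widget from hook_form_alter, so taxonomy_get_tree() never runs for the node form. A minimal, untested sketch for D6 follows; the module name `mymodule`, the form ID, and vocabulary ID 1 are illustrative assumptions, not from this thread.

```php
<?php
// Tell core taxonomy not to build the full term tree for the term
// selector; D6 checks this variable before calling taxonomy_get_tree().
variable_set('taxonomy_override_selector', TRUE);

/**
 * Implements hook_form_alter().
 *
 * With the override in place, core leaves the taxonomy widget empty.
 * Replace it with an autocomplete textfield, which only queries the
 * terms matching the typed string instead of loading all 50,000.
 */
function mymodule_form_alter(&$form, $form_state, $form_id) {
  // Hypothetical content type form and vocabulary ID 1.
  if ($form_id == 'mynodetype_node_form' && isset($form['taxonomy'][1])) {
    $form['taxonomy'][1] = array(
      '#type' => 'textfield',
      '#title' => t('Terms'),
      '#autocomplete_path' => 'taxonomy/autocomplete/1',
    );
  }
}
```

Modules like Hierarchical Select and Active Tags intercept at roughly this point; whether a given one actually avoids the full tree load is worth verifying case by case.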

Whether any of the mentioned contrib projects improve performance or make things worse, I do not know.

Eric_A’s picture

This issue may or may not be a duplicate of #556842: taxonomy_get_tree() memory issues...

molave’s picture

Same issue for me on Drupal 7. I have a new (test) site, on a shared hosting server, with very little content (maybe 3 or 4 nodes only).

I successfully uploaded a new taxonomy vocabulary (using Taxonomy CSV) of 450 terms, which is 6 levels deep. But now, whenever I try to list the taxonomy terms, I get an "Allowed memory size exceeded" error.

According to the Status Report, my current PHP memory limit is 128 MB. I was unsuccessful in increasing the memory limit via settings.php, and was also unable to adjust it using the .htaccess file. I have not yet tried asking my web host to adjust the server settings.
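For reference, these are the usual places to raise the limit; the 256M value is an example, and whether a shared host honours each method varies (some hosts lock memory_limit entirely, which would explain the failed attempts above):

```php
<?php
// In sites/default/settings.php (Drupal evaluates this on every request):
ini_set('memory_limit', '256M');

// Alternatively, in .htaccess (only effective when PHP runs as mod_php,
// not as CGI/FastCGI):
//   php_value memory_limit 256M

// Or in a custom php.ini, if the host allows one:
//   memory_limit = 256M
```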

Interesting note: after uploading another vocabulary of only 100 taxonomy terms, I had no problem listing the terms of that second vocabulary. The larger vocabulary still returns errors.

I'm wondering if it's not just the size, but also the depth of the first taxonomy (6 levels of parent-child relationships), that plays a part in the memory problem.

Thanks.

wojtha’s picture

Status: Active » Closed (duplicate)

This is a duplicate of #556842: taxonomy_get_tree() memory issues, which is fixed in D7 and D8; a patch for D6 also exists.