This forum is for less technical discussions about the Drupal project, not for support questions.

Error: "An HTTP error undefined occured." with autocomplete

I get

An HTTP error undefined occured.
/drupal/taxonomy/autocomplete/4

when I try to type in the categories autocomplete box on the node edit form.

Any idea why this occurs? Has anyone got autocomplete working, seen a similar error, or found a fix?

It's on 4.7b3.

event.theme cannot be found?

I edited my event.theme file, but made a backup first. When I switched to the backup copy, my site could no longer see the event.theme file. The name, ownership, and permissions are all correct, but I get this error:

main(): Failed opening 'modules/event/event.theme' for inclusion (include_path='.:/usr/local/lib/php') in /home/coolgolfsite/www/coolgolfsite.com/modules/event/event.module on line 4.

Does anyone have an idea on this?
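
For reference, the path in that error is relative to the Drupal root, so the file has to exist and be readable from there. A rough diagnostic sketch (run from the Drupal root; the relative path is taken straight from the error message):

<?php
// Diagnostic sketch only, not part of the event module itself.
$path = 'modules/event/event.theme';

var_dump(file_exists($path));   // FALSE: wrong name or location (e.g. a leftover .bak copy)
var_dump(is_readable($path));   // FALSE: the web server user cannot read the file
echo realpath($path), "\n";     // shows which file PHP actually resolves, if any
?>

If file_exists() reports FALSE even though the file looks right in an FTP listing, the usual culprit is a stray extension left over from the backup; if it exists but is not readable, check which user the web server runs as.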

Thanks for any help.

Jeustace

Watermark on images

Hi!

Just one question: is there any module for Drupal to put a watermark on every uploaded image? For example, to put my site URL on it.
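
If no ready-made module turns up, the effect itself is straightforward with PHP's GD extension (assuming GD is compiled in). A minimal, untested sketch that stamps a text watermark on a JPEG, with placeholder file names:

<?php
// Watermark sketch using GD. File names are placeholders; error handling omitted.
$image = imagecreatefromjpeg('uploaded-image.jpg');

// Stamp the site URL in white near the bottom-left corner.
$white = imagecolorallocate($image, 255, 255, 255);
imagestring($image, 3, 10, imagesy($image) - 20, 'www.example.com', $white);

imagejpeg($image, 'watermarked-image.jpg', 90);
imagedestroy($image);
?>

A module would only need to run something like this whenever an image is uploaded.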

Regards
Cristian

does flexinode scale?

What if a site gets many thousands of flexinodes? I could imagine the performance of flexinodes becoming an issue, since retrieving each field requires a subquery of the flexinode_field table.

In practice, has anyone found performance penalties in using flexinodes? A sketch of the access pattern I have in mind follows below.
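
To make the concern concrete, here is the shape of that access pattern as a rough sketch in Drupal 4.x style (the column names nid, field_id, and value are just for illustration, not flexinode's actual schema):

<?php
// One query per field: a node with N fields costs N database round trips.
function sketch_load_field($nid, $field_id) {
  $result = db_query('SELECT value FROM {flexinode_field} WHERE nid = %d AND field_id = %d', $nid, $field_id);
  $row = db_fetch_object($result);
  return $row ? $row->value : NULL;
}

// Versus one query that pulls every field for the node at once.
function sketch_load_all_fields($nid) {
  $fields = array();
  $result = db_query('SELECT field_id, value FROM {flexinode_field} WHERE nid = %d', $nid);
  while ($row = db_fetch_object($result)) {
    $fields[$row->field_id] = $row->value;
  }
  return $fields;
}
?>

If flexinode is closer to the first pattern, a listing of M nodes with N fields each turns into roughly M x N queries, which is where the scaling worry comes from.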

epublish (current edition)

Hi all,

Using the e-publish module I have created a current edition and listed it on a new page. My question is: is there any facility to upload a picture for the current edition?

Bandwidth skyrocketing

Hi,

The bandwidth my site consumes has grown rapidly as the site grows, both in the number of stories (nodes) and in the number of taxonomy categories.

Suppose I have 1000 stories and 400 categories, with 60% of the stories falling within two or more categories. I have menu links pointing to all category items. Within 30% of the stories there are links providing a category menu to other related stories (populated by a custom module).

A visiting spider therefore loads the same story under multiple different links: the number of unique pages on the site may be 1000, but the number of links a search engine sees is more like 1000 x 10 = 10,000. If I add a separate taxonomy with the categories commercial/non-commercial, that count could double to 20,000 linked pages, and so on.

When a spider hits the site it loads those 1000 stories under many categories and subcategories, and keeps going until about 540 MB is consumed per visit. Some spiders are brain-dead and pull 1.87 GB per visit!

I am being forced to move to a high-bandwidth hosting company. However, there must be another solution, since ultimately I am only delaying the crisis until my site reaches 50,000 stories. Searching for bandwidth, I found a lot of posts about high bandwidth but little that really addresses this issue. I cannot see how a robots.txt file would help, other than to exclude a spider entirely (as I have done with the 1.87 GB culprit).
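
One middle ground, short of banning a spider outright, would be a robots.txt that keeps well-behaved crawlers out of the taxonomy listing pages while leaving the story pages themselves crawlable. A sketch, assuming Drupal's default taxonomy paths (the second line covers the non-clean-URL form):

# Keep crawlers off the category listings that multiply the link count,
# while /node/NNN story pages stay indexable.
User-agent: *
Disallow: /taxonomy/
Disallow: /?q=taxonomy/

Each story would still be reachable through its canonical /node link, so a spider fetches it once instead of once per category it appears in.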
