Translation files, e.g. from Drupal core or distributions, are getting bigger and bigger. While importing these we run into timeouts.

Some examples are:

The current import processes each file in one go.

Proposed resolution

Each individual file should be imported through the Batch API, split into chunks.

Remaining tasks

API changes

Original report by Gábor Hojtsy

Given #568996: Knowing the file paths, grab latest translations for projects when installing, we will have bigger .po files of projects. We should fix our .po import to allow seek-based parsing of .po files and have a limit on how far we go into a .po file before we leave off and continue in another HTTP request. Currently the .po parsing opens the file itself and cannot be told a seek position. This would go beyond #ldodomination, easing deployment for translations on the greater scale, but it belongs here, since we decided our package format would be one big .po file.


martin_q’s picture

Assigned: Unassigned » martin_q
Gábor Hojtsy’s picture

Status: Active » Needs work
2.26 KB

I'm a "bit" tired but came up with this starter patch. Not sure where to do the seek exactly; it depends on how we need to structure this API. The basic idea is that we'd have a seek limit, below which we parse the file at once. If the filesize is above that, we'd extend the API to do seeking:

- return FALSE on error (no file, cannot read, etc)
- return TRUE on success (all file parsed AKA end of file reached)
- return a number (seek position) on partial parsing

So the function can be called back with the seek position and let it continue the import from there. We can ideally pause the parsing at any place where we are in the COMMENT context, that is, in between translations.

This patch totally needs work obviously.
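The return convention described above could look like this minimal sketch (hypothetical names, not the actual patch; the real parser tracks context states rather than just blank lines):

```php
<?php
// Hypothetical sketch of the proposed API, not the real patch:
// - FALSE on error (no file, cannot read)
// - TRUE on success (end of file reached)
// - an integer seek position on partial parsing
function po_parse_chunk($filepath, $seek = 0, $max_lines = 100) {
  $fd = @fopen($filepath, 'rb');
  if ($fd === FALSE) {
    return FALSE;
  }
  fseek($fd, $seek);
  $lines = 0;
  while (($line = fgets($fd)) !== FALSE) {
    $lines++;
    // ... accumulate msgid/msgstr data and save finished entries here ...
    // A blank line separates entries, so once we are over the line
    // budget it is a safe place to pause (the COMMENT context).
    if ($lines >= $max_lines && trim($line) === '') {
      $position = ftell($fd);
      fclose($fd);
      return $position;
    }
  }
  fclose($fd);
  return TRUE;
}
```

The caller would loop, feeding the returned position back in, until it gets TRUE or FALSE; in a batch, each iteration would become one HTTP request.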

martin_q’s picture

Well, I have a solution which "ought" to work, but I am finding that fseek() is failing (returning -1). Something strange is also happening with the feof() test, and the end result is that only the first 'chunk' of the file gets processed.

Per the second 'Note' on, it appears that fopen() is creating a stream of the kind that cannot use fseek() et al. But we're not using http:// or ftp:// streams, are we? The files are stored locally in sites/default/files/temp and opened from there.

*scratches head*

martin_q’s picture

6.33 KB
martin_q’s picture

5.56 KB

Phew. Well, that was many hours' work for not much change. But the required functionality is there:

The variables locale_import_po_seek_limit and locale_import_po_chunk_size control what size files get broken up into chunks, and how many lines the chunks (approximately) have. Defaults are currently set to 50000 bytes and 100 lines respectively.

At the moment, a while loop processes the iteration through all the chunks, but Gábor states that the intention is for the batch API to handle this.

Gábor Hojtsy’s picture

Project: Drupal core » Localization client
Version: 7.x-dev » 7.x-1.x-dev
Component: language system » Code

Given that this is not going to get into Drupal core anymore (not into D7, that is), I'm moving it to l10n_client. I've posed the question of starting with D6 or D7 first at #568996: Knowing the file paths, grab latest translations for projects when installing, so we'll follow whichever version is decided there and port to the other. I'd imagine l10n_client becoming a multi-module beast, so you can enable/disable each big chunk of functionality separately.

This unfortunately means that we need to fork this code and have our own version, which we would have needed anyway if we go and support D6, so it is not like the sky is falling down on us (hopefully) :)

martin_q’s picture

Assigned: martin_q » Unassigned

OK, I feel out of my depth again (how does one fork??), or at least don't currently feel in a position to contribute to the ongoing development of this, so regret that I must unassign myself at this stage. Hope I can get involved in a more meaningful way in a few months' time.

In the new year I'm going to be focussing very much on developing for D7 so I'd be more interested in getting things to work for that.


Gábor Hojtsy’s picture

Project: Localization client » Localization update
Version: 7.x-1.x-dev »

Moving to newly added l10n_update.

Gábor Hojtsy’s picture

Version: » 6.x-1.x-dev
Issue tags: +localized install

This would be a huge boon to make big imports work on shared hosting :|

Jose Reyero’s picture

This patch needs to be updated for l10n_update project.

I think it should be an easy addition as we are overriding most of it anyway. Not that I have any idea of how this works though...

Gábor Hojtsy’s picture

The concept is that parsing / importing of a .po file can be stopped anywhere in-between two string translation definitions. That is after an msgstr (for non-plural strings) or right before an msgctxt (or msgid if there was no msgctxt right before). We should be able to import a 10MB .po file in a batch process if we can just remember the last read position in the batch and move forward reading and importing items in the file in multiple HTTP requests. That should mitigate the memory and time limit issues.

The patch is about making _locale_import_read_po() not take a file name, open it at the start and read to the end, but instead handle the opening of the file and the parsing separately: remember a seek position, and give the opened file handle and the seek pointer to the parser instead of just the file name. The current patch does not yet implement batch use; it was a proof of concept for the API changes required for seek-based reading. So it needs to be adapted to l10n_update and extended to include batch support.

Gábor Hojtsy’s picture

5.39 KB

Ok, here comes a port of the latest patch to l10n_update. This remains a proof-of-concept patch, but it enables further modification of l10n_update to actually do segmented imports of .po files. Important notes:

1. The seek code change in _l10n_update_locale_import_read_po() is purely demonstrative, to show that this in fact works (I tested that it does :). We should always import a .po file in its entirety (without seeking) if we cannot spawn a batch. If we can spawn a batch, we should break big .po file imports out into a progressive batch operation. This code just demonstrates the seeking in action.

2. The current code reads files bigger than 50000 bytes in chunks of approximately 100 lines. When a msgid block is finished and you are above 100 lines, it moves on to the next seek call. The code currently restarts line counting from 0 in each seek block, so we really don't have good line numbers for error reporting ATM. The function should have either an internal memory (not sure) or an argument by reference that is used to store the overall line numbers.

So theoretically a batch can now be built around _l10n_update_locale_import_parse_po() that reads and parses a huge .po file in smaller chunks in a progressive batch step. The only issue to overcome here is that _l10n_update_locale_import_parse_po() is this deep in the hierarchy: _l10n_update_locale_import_parse_po() is invoked by _l10n_update_locale_import_read_po() which is invoked by _l10n_update_locale_import_po(), which is invoked by l10n_update_import_file(), which is invoked by l10n_update_source_import(), which is invoked by _l10n_update_batch_import(), which is the batch callback. So we are 5 levels away from that function in the batch process :) And _l10n_update_batch_import() would need to manage the progressive batch, remember the file, line number and seek position for the rerun of the batch step until it completes.

I'm not entirely sure how to overcome this indirection chain, and I'm not sure I'd like to replicate all the code for those levels in the batch callback so we can manage the seek level directly, nor that we'd want to pass all that data through the 5 levels. Any better ideas?
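One way around the indirection chain (a sketch only, with hypothetical names, not actual l10n_update code) is to let the batch operation itself keep the seek position in $context['sandbox'], the scratch space Drupal's Batch API preserves between HTTP requests, so the intermediate levels need not pass it around. The driver loop below stands in for the Batch API:

```php
<?php
// Hypothetical progressive batch operation: remembers the seek position
// in $context['sandbox'] so each call (one HTTP request in a real
// batch) resumes where the previous one stopped.
function po_import_batch_op($filepath, &$context) {
  if (!isset($context['sandbox']['seek'])) {
    $context['sandbox']['seek'] = 0;
    $context['sandbox']['total'] = filesize($filepath);
  }
  $fd = fopen($filepath, 'rb');
  fseek($fd, $context['sandbox']['seek']);
  // Read one chunk per call; a real implementation would parse and
  // import complete msgid/msgstr blocks here instead of raw bytes.
  fread($fd, 8192);
  $context['sandbox']['seek'] = ftell($fd);
  fclose($fd);
  // Batch API convention: 'finished' < 1 means "call me again".
  $context['finished'] = $context['sandbox']['total'] > 0
    ? min(1, $context['sandbox']['seek'] / $context['sandbox']['total'])
    : 1;
}

// Stand-in for the Batch API driver (each iteration would be a
// separate HTTP request in Drupal).
function po_import_run($filepath) {
  $context = array('sandbox' => array(), 'finished' => 0);
  $calls = 0;
  while ($context['finished'] < 1) {
    po_import_batch_op($filepath, $context);
    $calls++;
  }
  return $calls;
}
```

With this shape, only the batch callback needs to know about seeking; the inner parser just receives a handle and a position.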

hass’s picture


Gábor Hojtsy’s picture

An actual timeout error report was marked duplicate at #1016982: Upload error.

Gábor Hojtsy’s picture

Another timeout was reported in #1031560: Maximum execution time exceeded and marked as a duplicate of this one. Looks like people experience this issue in the wild.

NPC’s picture

I also get "Maximum execution time of 240 seconds exceeded" error on my localhost site (and it is not a weak PC), when updating the core (this file, or the Ukrainian one).

What surprised me, though, is the value of 240 seconds - where does that come from? I've set max_execution_time to 1200 everywhere I could (php.ini, .htaccess, settings.php). When I output phpinfo, it shows a "local" value of 1200, while the "master" value is 60 - so even if the local value is not used for some reason, the master one is not 240.

Should I apply the patch from #12, or is it a different issue? I am using v7.x-1.0-alpha2 of Locale Updater.

NPC’s picture

Ah, looks like it is overridden in, I've set it to a higher value and the update process went through OK for me. I guess I can revert all my other attempts to set the max execution time very high. Sorry for asking before checking thoroughly.

Gábor Hojtsy’s picture

Uhm, we should at least not set the time limit lower if it is already higher than what we would like to set it to, right? :) Can you do a quick patch for us (since you have the system ready to test it)? Thanks!

NPC’s picture

Actually, I don't have a system for testing this anymore, since I've updated all the languages.

But I guess the change should be from (line 28 of


To something like (check that the current max execution time minus the time already spent in the script is less than 240, if you want to keep the 4 minutes limit):

if((ini_get('max_execution_time') - getrusage()) < 240) {

But I need to emphasise that I haven't tested this change, for my localhost dev site I just changed the value from 240 to 1200, the dirty hack way.

hass’s picture

On many shared hosts users only have 30 or 60 seconds or their PHP scripts get killed... :-(

Gábor Hojtsy’s picture

Yes, I've not seen a host myself where it took over 60 seconds to import a .po file, but I've heard about them. I fully support this being implemented, and I've tried to get attention to this bug from like-minded people in the interest of getting help resolving it.

NPC’s picture

Update to my comment #19 - it turns out that getrusage() is not implemented on the Windows platform, so I can't see whether this works at all (my local testing site is on Win). But from reading further, one should get a specific item - ["ru_utime.tv_sec"] - from the array returned by getrusage() in order to find out the number of seconds already spent by the script.

Still - not a good method, since it makes the module platform-dependent, so I'd advise against it.
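A portable version of the idea could look like this (illustrative only, untested in the thread; getrusage() is guarded because, as noted above, it does not exist on Windows, and ru_utime.tv_sec is CPU time, which only approximates wall-clock time):

```php
<?php
// Illustrative only: decide whether raising the limit to $target
// would actually help. getrusage() is guarded because it does not
// exist on Windows, and ru_utime.tv_sec is CPU time, not wall time.
function po_import_needs_more_time($target = 240) {
  $limit = (int) ini_get('max_execution_time');
  if ($limit === 0) {
    // 0 means unlimited: never lower it.
    return FALSE;
  }
  $spent = 0;
  if (function_exists('getrusage')) {
    $usage = getrusage();
    $spent = $usage['ru_utime.tv_sec'];
  }
  return ($limit - $spent) < $target;
}
```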

David_Rothstein’s picture

Why was this issue moved out of the core queue? It seems like a bug that should be fixed in Drupal core.

Here's a report of someone running into this problem (while installing Drupal) that came in recently: #1024816: Increase time limit when doing batched .po imports

Gábor Hojtsy’s picture

Version: 6.x-1.x-dev » 7.x-1.x-dev

@David_Rothstein: Because it requires API changes. We were hard at work on this and related issues in Paris after the Drupalcon; when code freeze was declared after the code sprint, we moved away from trying to achieve this in core.

The main problem is that both and take a file object to read the .po file from. The latter opens and reads the file until it ends. The problem with big .po files has always been that parsing and reading them from beginning to end takes lots of time on dated / loaded servers.

The approach discussed above opens the file outside the function, and passes a file pointer and a suggested read chunk size to the function. The function would return after it reached that chunk size, reading the file from the given pointer. Then, to reuse this in batch processing, we'd need to pass on the seek position remembered at the end of the last read, plus the file name, to continue reading in the next batch step. So with this suggested method, both the batches that import the .po file(s) and the .po import API itself need a major API change. This will hopefully happen in Drupal 8 once we have worked out the kinks in contrib, I think. Unless of course breaking APIs is not an issue :)

Gábor Hojtsy’s picture

Also, the + Drupal core only experience is pretty frustrating once you start to add more contributed modules (see, so without the l10n_update module, the life of multilingual site builders will be pretty hard I think. So I was trying to get people to use this module even with an integrated install profile, see. There are all these great things we did not manage to resolve, since was growing up from the ground around the same time Drupal 7 was "frozen" (yes, in Paris).

Gábor Hojtsy’s picture

mansspams’s picture

#19 helped me

klonos’s picture

mandreato’s picture

#19 helped me too...

Thib’s picture


The problem seems to come from the Drupal core .po file...
When the error occurs it blocks the update of all other modules.
Is it possible to have a new feature to choose which modules we would like to update translations for?

Thanks a lot, sincerely,


Gábor Hojtsy’s picture

That sounds like a workaround, and you'd still not be able to update the core translations, right? There seem to be two workarounds for now, since nobody has picked up working on the seek import yet. One is to check whether we would actually increase the limit rather than decrease it (even though this is not technically correct, because set_time_limit() restarts the timer too, but it is close):

diff --git a/ b/
index 0d6b0aa..0dd07fe 100644
--- a/
+++ b/
@@ -24,7 +24,9 @@
 function _l10n_update_locale_import_po($file, $langcode, $mode, $group = NULL) {
   // Try to allocate enough time to parse and import the data.
-  drupal_set_time_limit(240);
+  if (ini_get('max_execution_time') < 240) {
+    drupal_set_time_limit(240);
+  }
   // Check if we have the language already in the database.
   if (!db_query("SELECT COUNT(language) FROM {languages} WHERE language = :language", array(':language' => $langcode))->fetchField()) {

(Note that I checked in D7 core, and there are 3 places where the limit is raised: in drupal_cron_run(), in _locale_import_po() (which we mimic) and in node_access_rebuild(), and none of them checks whether it actually increases or decreases the time limit :|)

We could also optionally increase this limit (instead of, or in addition to, checking) to something above 240. Core uses 240 as a standard value in all places; it seems to be enough for cron runs and node access rebuilds, which are pretty expensive.

That still of course does not solve the problem of sites where the limit cannot be increased, and I'm still looking forward to help in driving the above big patch home, which will read and parse the file in chunks and will therefore not hit these limits.

Gábor Hojtsy’s picture

Issue tags: +D8MI

The eventual solution for this needs to filter down to Drupal core improvements in Drupal 8 too, therefore tagging for the Drupal 8 multilingual initiative as well. That should not take focus away from solving it first for Drupal 7 and 6 in l10n_update.

Chi’s picture

Is it possible to divide the core .po files generated by the ldo packaging system into several parts? E.g.,, There is a related problem with downloading big .po files at #1164564: Increase .po file download timeout
I think the main problem is that the Drupal core .po file has become too big, and we are trying to hide that problem with seek-based batch import and increased PHP (or HTTP) timeouts.

Gábor Hojtsy’s picture

@Chi: what tells you that you have a problem *downloading* the file rather than importing it? You probably downloaded Drupal itself as well, which is much bigger.

Chi’s picture

I mean the Drupal core .po file.
I can download this file manually, as well as Drupal itself, but there are some troubles when I want to download it with drupal_http_request.

Gábor Hojtsy’s picture

@Chi: you already have an issue at #1164564: Increase .po file download timeout for that, which has a patch you verified works. So looks like that should be fixed with the changes over there, right? It is not an import problem that we are discussing in this issue.

Chi’s picture

I think it's a related issue. That patch fixes only the download problem, not the import one. Both problems have a common cause.
I don't know how the translation packaging system on ldo works, but if we can reduce the size of the core .po file we will not need any patches.

Gábor Hojtsy’s picture

Just dropping a note that I've been actively working on this tonight and will hopefully be able to publish a working version over at soon (and rework l10n_update's to work with that module instead). I have a working prototype at the moment reading in the 16 thousand line Hungarian Drupal 7.2 translation in 16 proper chunks. Still need to integrate with the Batch API and clean up, obviously. It is a major rework of how reading and storing of .po file data is handled.

Gábor Hojtsy’s picture

Assigned: Unassigned » Gábor Hojtsy
Gábor Hojtsy’s picture

Ok, I've committed my initial work for this at, which

A. Abstracts out the parser to work with a reader, a writer and an error callback.
B. Implements the reader and the parsing state driver to support batched (chunked) parsing
C. Includes fixes for #655048: Plural formula information blanked when importing a poorly-formed .po file, #522176: fix for unhelpful error message and #545652: Language files import error message need to show full path and provides more helpful errors (file name and line number) when disallowed HTML is detected
D. Overrides the import tab submit function (for now) with its batch based import, so that you can import .po files of almost any size

Here is how it looks when it imports:

And this is the result with the new error messages for disallowed HTML:

It is still not yet integrated with l10n_update, and I think I'll test the waters by trying to integrate it with the module/theme enable batch first.

klonos’s picture

Great work so far Gábor! Thank you.

Gábor Hojtsy’s picture

FYI, #1197498: Add support for core batches is continuing in that module's queue. Now I need to work on API-ifying the locale database interactions themselves (in the spirit of #912252: Build reusable API, add hooks for modules, etc. and #361597: CRUD API for locale source and locale target strings). This API could then be used by l10n_client, l10n_update and i18n. That would open the door to integrating the batch reading from gettextapi with l10n_update, because it would let l10n_update also handle its own data properties cleanly for the saved strings (and across all modules like l10n_client and core) without lots of overrides and special code.

wranvaud’s picture

If I understood correctly we should install the gettextapi module and translation update should work, but it has not yet been completely integrated to i18n?

I tried to run an update with gettext and it didn't work.

Gábor Hojtsy’s picture

Yes, the state of this work is that gettextapi already allows you to import .po files of any size for core import forms. It is not yet integrated with l10n_update.

clashar’s picture

I don't have the l10n_update module, but still can't import the Russian .po file for core.
I also got the 240 sec error and also tried to change max_execution_time in php.ini.

NPC’s picture

@clashar, check with your hosting provider; altering this parameter via php.ini is often not supported (as it loads their CPU). And it is not the same problem being discussed here, if I understand correctly.

pstein’s picture

I have the same 240 seconds problem when loading the into Drupal v7.8.
From what I have read so far, this could be solved if I convince my webhoster to increase max_execution_time in php.ini.

Is this correct?

Some of you mentioned the gettextapi patch above. Will this solve the 240 seconds problem as well?
Into which Drupal version will it be integrated?

Thank you

clashar’s picture

Actually I am on localhost, so no CPU problem should be involved; I watched performance and the processor wasn't loaded at all.
Also, the 240 sec limit should really not be only an l10n_update problem.

Do you have the l10n_update module enabled?

clashar’s picture

my error while po importing is:
Fatal error: Maximum execution time of 240 seconds exceeded in Z:\home\\www\emploi\includes\database\ on line 2137

lucabarbetti’s picture

I am a newbie and I had different experiences setting up a test multilingual website (5 lang.) on a brand new Win 7 laptop and a pretty old OS X powerbook. On both platforms the process of adding languages, importing .po files and updating is painfully slow, even though it is not taxing at all for the hardware of both machines.
Especially the Win 7 machine, with its Intel Core i7, fast HD and 4GB Ram, should zip through the whole process instead it seems to 'sleep' (...while my beard gets greyer and greyer!), moreover, I had to modify the php.ini file setting max_execution_time = 1200, otherwise it wouldn't load and import anything.
Even with the much less powerful and old G4 Powerbook it took several hours to add the 5 languages and relative translations, but the process went on smoothly, in fact I didn't need to change the file php.ini (max_execution_time = 30) and I didn't get any error.
On both platforms, I used XAMPP and Drupal 7 with the same modules.

Gábor Hojtsy’s picture

@lucabarbetti: well, it definitely does depend a lot on your internet connection too. Can you try making it keep all the locally downloaded files (via the l10n_update config), then dropping the l10n_update timestamp data and setting it to use local files only, to see how long just the local file imports take? That would be a great cross-check to rule out some of the potential failures.

lucabarbetti’s picture

147.44 KB
269.6 KB

@Gábor Hojtsy: first of all I'd like to thank you for all the time and efforts that you put in this open source project, your work and dedication is very important to everyone interested in multilingual websites.
Let's clear up the connection issue: downloading all the .po files on my Win 7 laptop takes less than one blink of an eye, while importing from the hard drive only half of one of those files, for only one text group (let's say Blocks), takes a ten minute nap. I say half a .po file because the importing process stops with a blank screen at roughly two thousand 'terms', which is about 98% conversion; importing the same file again I get roughly two thousand more terms and another ten minute nap. By the way, it should be possible to select multiple text groups to import the same .po file for all the groups selected.
I tried the language update on my Win 7 laptop (will try the Mac OS X as soon as I can), with only local file selected as you requested, and the result was as follows:
The download bar indicated 135 'things' to 'download'; after a while, with the bar not moving and indicating 1/135, the updating process stopped with this error: An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /drupal/batch?id=13&op=do StatusText: Internal Server Error ResponseText:
Here are Status reports and Details screens:

lucabarbetti’s picture

Hi Gabor,
while I'm updating translations with l10n_update enabled (via admin-->configuration-->translate interface-->update), whether I have the Update module enabled or not, I see on the screen the following line:
"Importing downloaded translation:"
and the update process stops after a while (An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /drupal/batch?render=overlay&id=30&op=do StatusText: Internal Server Error ResponseText:)
I wonder if the last dot in is just the period of the message or the last dot of the URL; the latter would obviously be an error, because the file doesn't exist on the server (while does). It might be a stupid or even a desperate observation (I've been fighting this issue for days), but the update process keeps stopping at the first drupal-7.8.xx.po file, which in my case is the Catalan language, and prevents me from updating everything that follows it. I tried to check this out by looking in the Update and l10n_update module files but, honestly, I'm not a programmer and couldn't find an answer.
Also, I've noticed that the updating process goes online to the server, and gets stuck, even if I check the 'Only local files' option.
Please help!
Thanks a lot,
Luca Barbetti

lucabarbetti’s picture

Well, it seems that I've solved my problem with translation updates by setting drupal_set_time_limit() to 0, instead of 240, in the file in drupal/sites/all/modules/l10n_update (at least in my Drupal setup).
Honestly, I've got to say that I don't know yet what the side effects of this change could be (after all, there must be a reason it was set to 240!).
Anyway... so far so good!
Cheers and happy 4th october to everyone!
Luca Barbetti

wusel’s picture

I got the solution for local XAMPP on win7 at #1240622: PHP Timeout importing translations using xampp on Windows:

Change the setting to innodb_flush_log_at_trx_commit = 2 in \xampp\mysql\bin\my.ini.

clashar’s picture

wusel, I tried your solution but it didn't help me, still got
Fatal error: Maximum execution time of 240 seconds exceeded in ...\includes\database\ on line 2137

clashar’s picture

I have just tried to upload the *.po file twice. The first time I got the error, but the second time I checked "Existing strings and the plural format are kept, only new strings are added." and successfully loaded the translation file.
I kept the parameter "innodb_flush_log_at_trx_commit" at the original "=1".
I hope there will be no problems; the translations seem to be ok.

Sutharsan’s picture

carvalhar’s picture

#55 worked perfectly with my XAMPP on win 7 ;)

wiherek’s picture

Worked like a charm on XAMPP on Win XP as well :) Isn't it more of a server issue than an operating system issue?

wiherek’s picture

well.. not really. I have another problem now. After changing the language, I am getting a 310 error: (net::ERR_TOO_MANY_REDIRECTS)

Basically I can't access the page.
Clearing cookies and caches via drush, and disabling l10n_update and locale, doesn't help. I found the drush_language module, but don't know how to install it via drush (the only way I can access my site now), or how to change the language via drush.

I had to restore a database backup.

Sutharsan’s picture

wahn’s picture

Same timeout issue while importing french translation and #55 fixed it
Thx !

JSCSJSCS’s picture

I got the "240" error when trying to upload a 390kb (yes, kb) .csv file using Feeds import. #55 worked, but I would rather Drupal fixed this so I don't have to adjust MySQL settings.

Jens Peter’s picture

I am not sure if I have the same issue.
I try to make a batch update of language to Danish.
It says it is importing for Drupal core 7.15, but it returns the following error and nothing is updated at all.

An AJAX HTTP request terminated abnormally. Debugging information follows. Path: /batch?id=109&op=do StatusText: ResponseText: ReadyState: 4

In my error log I don't get much more information, except it says something like this (but in Danish, so I translated it here):
Not valid HTML found. Dash not imported: <Hidden>

Is this the right place to add this, or is this a new error?
Any help will be very welcome.

hass’s picture

<Hidden> needs to be changed to - Hidden -. But it's only one string that failed to import, no big harm. However, this is not the reason for the AJAX HTTP failure.

Jens Peter’s picture

Thanks Hass
I searched for the text (in correct language) with the but it is not there.
However I can find one text in English that is
I cannot change the original text. Found here: /admin/structure/types/manage/page/display?render=overlay

In my error log I see the following error when I try to import the Drupal core translation manually:

PDOException: SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'temporary://drupal-7.15.da_.po' for key 'uri': INSERT INTO {file_managed} (uid, filename, uri, filemime, filesize, status, timestamp) VALUES (:db_insert_placeholder_0, :db_insert_placeholder_1, :db_insert_placeholder_2, :db_insert_placeholder_3, :db_insert_placeholder_4, :db_insert_placeholder_5, :db_insert_placeholder_6); Array ( [:db_insert_placeholder_0] => 1 [:db_insert_placeholder_1] => drupal-7.15.da_.po [:db_insert_placeholder_2] => temporary://drupal-7.15.da_.po [:db_insert_placeholder_3] => application/octet-stream [:db_insert_placeholder_4] => 641899 [:db_insert_placeholder_5] => 0 [:db_insert_placeholder_6] => 1347893476 ) in drupal_write_record() (line 7036 of /var/www/

I have access to the server through FTP, but if I replace the Drupal core translation there, it does not seem to be picked up in the administration area, as it keeps saying that Drupal 7.15 needs to be updated.

Thanks for any help I get.

Sutharsan’s picture

Status: Needs work » Needs review
5.41 KB

Patch in #12 re-rolled against latest dev. Code from #31 added.

astutonet’s picture

44.27 KB

Using only the patch from #31 worked on localhost.

The patch in #68 seems to work, but as the image shows, there is an error at the end of each operation.

At the moment I'm using only #31.

Any ideas?


angybab’s picture

Title: Add support for seek based batch import of .po files » PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away: drupal 7 beta3
Assigned: Gábor Hojtsy » angybab
Category: task » support

I am very new to Drupal and to programming. After installing some e-commerce modules I got this error message. I tried to delete the modules one by one; this did work, but I need them to create a mini shop. I increased the memory size to 256MB and the maximum execution time to 120s, and changed max_allowed_packet in the MySQL my.ini file from 1M to 32M. Now I get a blackout. Please, anyone with a solution? I will be very grateful.

Uncaught exception thrown in session handler.
PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away in _drupal_session_write() (line 209 of C:\wamp\www\Dacsmarketing1\includes\

Uncaught exception thrown in shutdown function.
PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away: DELETE FROM {semaphore} WHERE (value = :db_condition_placeholder_0) ; Array ( [:db_condition_placeholder_0] => 14328302745124b47c8d5b06.37258940 ) in lock_release_all() (line 269 of C:\wamp\www\Dacsmarketing1\includes\


( ! ) SCREAM: Error suppression ignored for
( ! ) Fatal error: Uncaught exception 'PDOException' with message 'SQLSTATE[HY000]: General error: 2006 MySQL server has gone away' in C:\wamp\www\Dacsmarketing1\includes\database\ on line 2139
( ! ) PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away in C:\wamp\www\Dacsmarketing1\includes\database\ on line 2139
Call Stack
# Time Memory Function Location
1 15.1829 26795424 DrupalCacheArray->__destruct( ) ..\
2 15.1829 26795984 DrupalCacheArray->set( ) ..\
3 15.1829 26796160 lock_acquire( ) ..\
4 15.1835 26809600 lock_may_be_available( ) ..\
5 15.1835 26809944 db_query( ) ..\
6 15.1835 26810168 DatabaseConnection->query( ) ..\

Sutharsan’s picture

Title: PDOException: SQLSTATE[HY000]: General error: 2006 MySQL server has gone away: drupal 7 beta3 » Add support for seek based batch import of .po files
Assigned: angybab » Sutharsan
Category: support » task

@angybab, assigning an issue to yourself means you are going to work on it yourself (I will). Only change the title if the content of the issue changes to match; changing it back.

MrHaroldA’s picture

We were running into timeouts too, until we changed this in the Mysql settings:


This reduces imports from minutes to seconds!

More info:

plach’s picture

The attached patch implements the quick fix suggested by Gábor in #31, both in core and in l10n_update. Please disregard this unless you need a quick and dirty fix, and test/review #68 instead.

trainingcity’s picture

Tried patch in #68. No joy :-(

Also tried changing my max_execution_time to 1800 in my php.ini file (private server, this is the correct setting for me). Still no luck.

Any other suggestions welcome.

MrHaroldA’s picture

@trainingcity: did you try my suggestion in #72?

trainingcity’s picture

Hi MrHaroldA. I am running MySQL with MyISAM tables. There's no good reason for this; I need to convert to InnoDB before the site launches. I'm not much of a MySQL expert; is there an equivalent command for my case?

dordic’s picture

Version: 7.x-1.x-dev » 7.x-1.0-beta3

For those, who just need to get things working:

1) Split up large (> 600kb) .po files into two parts, for instance the .po files of Drupal Commerce Kickstart or Drupal core, and keep the headers
2) Import those files manually (option checked: "Existing strings and the plural format are kept, only new strings are added.")
3) Now you can run Localization Update for the rest of the modules
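Step 1 above can be scripted; here is a rough sketch (a hypothetical helper, not part of any module) that splits a file at an entry boundary and copies the header entry (the initial msgid "" block) into both halves:

```php
<?php
// Hypothetical helper: split a .po file into two parts at an entry
// boundary. Entries in a .po file are separated by blank lines, and
// the first entry (msgid "") is the header, which both halves need.
function po_split_file($filepath) {
  $entries = preg_split("/\n\n+/", trim(file_get_contents($filepath)));
  $header = array_shift($entries);
  $half = (int) ceil(count($entries) / 2);
  $parts = array();
  foreach (array(array_slice($entries, 0, $half), array_slice($entries, $half)) as $i => $chunk) {
    $name = $filepath . '.part' . ($i + 1);
    file_put_contents($name, $header . "\n\n" . implode("\n\n", $chunk) . "\n");
    $parts[] = $name;
  }
  return $parts;
}
```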

steinmb’s picture

Version: 7.x-1.0-beta3 » 7.x-1.x-dev
Sutharsan’s picture

Assigned: Sutharsan » Unassigned
Drunoober’s picture

Hi guys... can I have an update please if possible.

I feel stuck since, in my ignorance, I believe that until this patch is fully integrated into the module I will still have issues when trying to convert my English Kickstart 2 site into Italian :( Until then I feel obliged to use other CMS solutions, or Drupal 6 with UC, which I'd rather not do.

thanks for the help.

Sutharsan’s picture

@Drunoober, the update is that there is no update :(
Speaking for myself, all my resources are taken by the migration to Drupal 8 (core) and by improving this functionality in Drupal 8, and I expect the same goes for the other two maintainers of this module. But there is no reason to panic; there is an alternative. You can manually import translations into your site (Translate interface; admin/config/regional/translate/import). Translations can be found at the Drupal translation server:

Sutharsan’s picture

Issue summary: View changes

Initial stab to add Issue summary

JvE’s picture

Issue summary: View changes
Status: Needs review » Needs work

I have also been bitten by this issue.

A couple of points:

  1. if (ini_get('max_execution_time') < 240) {
    This does not take into account that max_execution_time '0' means unlimited, so the check would actually decrease the limit.
  2. AFAIK, when running a site install through drush the seek support won't help, since everything happens in one request.

My workaround is to add this to my drush.ini:
disable_functions = set_time_limit
This prevents any code from messing with the max_execution_time that I set (0 for unlimited).
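Point 1 can be fixed with an explicit guard. A sketch (using set_time_limit() directly so the snippet is self-contained; in the module it would go through drupal_set_time_limit()):

```php
<?php
// Sketch of a corrected guard for point 1 above: treat
// max_execution_time 0 as unlimited so we never lower it.
function po_import_ensure_time_limit($seconds = 240) {
  $limit = (int) ini_get('max_execution_time');
  if ($limit !== 0 && $limit < $seconds) {
    set_time_limit($seconds);
    return TRUE;
  }
  return FALSE;
}
```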

Sutharsan’s picture

Instead of improving this patch, I decided to create a backport of the Drupal 8 code. The Drupal 8 translation import code was written with the use case of this issue in mind. If you experience any problems with large files, please try the 7.x-2.x-dev release. Do not use it in production environments yet, but any feedback on this branch (in a new issue) is very welcome!

Once the 7.x-2.x branch lifts off, this issue will be closed as "Won't fix".

Sutharsan’s picture

Status: Needs work » Closed (won't fix)

Closing this issue as 7.x-2.x is functioning.

If you experience problems with translation updates for a large number of modules and/or languages, try the 7.x-2.x-dev release of this module.

klonos’s picture

@Sutharsan we seem to be having the same issue in Backdrop CMS. Can you please point to the code from the D8 backport? That would be of great help. Thank you in advance.

klonos’s picture