Problem/Motivation
Opening this as a child of #2493035: Discover any memory limit issues we have.
When submitting the modules page (no xdebug or xhprof), I see 70M of memory usage. We've just set core's minimum memory limit to 64M after lots of discussion (see #2289201-66: [Meta] Make drupal install and run within reasonable php memory limits so we can reset the memory requirements to lower levels), so opening this as a critical bug.
Proposed resolution
The immediate worst offender I found was field_help(). This takes 800ms of wall time on my machine and 8 MB of memory.
In ModuleListForm::buildRow() we check whether the module has a hook_help() for the main help path - by actually executing hook_help() with that path. For field_help() this means listing every field type, widget, and formatter, just to render a link. Needs an issue - we can probably move the actual listing to a dedicated route and then link to it?
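A hypothetical, language-agnostic sketch of that fix (the names below are illustrative, not the real Drupal API): decide whether to render a "Help" link by checking that a hook_help() implementation *exists*, rather than invoking it with the help path.

```python
# Hypothetical sketch (not actual Drupal API) of the fix suggested above:
# check for the existence of a hook_help() implementation instead of
# executing it just to decide whether a "Help" link should be rendered.

def build_help_link(module, hook_registry):
    """Return help-link data for `module` without executing its help hook."""
    # Cheap existence check: field_help() is never called, so the expensive
    # listing of every field type, widget and formatter is skipped.
    if module + "_help" in hook_registry:
        return {"title": "Help", "route": "help.page." + module}
    return None

# The registry maps implementation names to callables; the callable body
# stands in for field_help()'s expensive work and is never run here.
registry = {"field_help": lambda path: "... expensive listing ..."}
assert build_help_link("field", registry) == {"title": "Help",
                                              "route": "help.page.field"}
assert build_help_link("node", registry) is None
```

The actual listing would then live on a dedicated route that only does the expensive work when a user visits the help page itself.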
Other bad offenders:
views_theme() - loads lots of plugins. Not a new problem particularly - theme registry rebuilds have always been bad, but grrr. Worth an issue in case we can come up with something.
field_system_info_alter() - 3.8 MB - should already have an issue.
filter_system_info_alter() - 3 MB - should already have an issue.
Remaining tasks
Get the child issues RTBC and committed, then check the memory usage again.
User interface changes
Not here.
API changes
Not here.
Comments
Comment #1
catch
Comment #2
catch
Comment #3
Wim Leers
Comment #4
larowlan
Is it worth making simpletest emit a peak memory usage header in the child site and then have the runner collect it and fail the test if it exceeds 64 MB, or even a lower value to add a buffer? We do something similar on client projects as part of our CI against a fixed list of pages.
Comment #5
Wim Leers
#4: that sounds splendid and very clever.
Comment #6
catch
That sounds good to me too. We'll need an opt-out for things like the simpletest self-test which are known to take loads of memory and which we're not interested in keeping under very much.
Comment #7
larowlan
#2495411: Make simpletest fail a test when it detects pages that need more than 64MB
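The runner-side check proposed in #4 (with the opt-out from #6) could look something like the following sketch. This is illustrative only, not the actual Simpletest implementation; the header name `X-Peak-Memory-Usage` is an assumption.

```python
# Illustrative sketch of the idea in #4/#6 (not the real Simpletest code):
# the child site reports its peak memory usage in a response header, and
# the test runner fails the test when the value exceeds a threshold.

MEMORY_LIMIT_BYTES = 64 * 1024 * 1024  # 64 MB, per the new core minimum

def check_peak_memory(headers, limit=MEMORY_LIMIT_BYTES, opt_out=False):
    """Raise AssertionError if the child site reported excessive memory use.

    `headers` is a dict of response headers; 'X-Peak-Memory-Usage' is an
    assumed header name for illustration.
    """
    if opt_out:
        # e.g. the simpletest self-test, which is known to be memory-hungry
        # and which we're not trying to keep under the limit.
        return
    peak = int(headers.get("X-Peak-Memory-Usage", 0))
    if peak > limit:
        raise AssertionError(
            "Page used %.1f MB, over the %.0f MB limit"
            % (peak / 1024 / 1024, limit / 1024 / 1024)
        )

check_peak_memory({"X-Peak-Memory-Usage": "52000000"})  # under the limit
```

A lower threshold than 64 MB could be passed as `limit` to build in the buffer larowlan mentions.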
Comment #8
YesCT
Comment #9
YesCT
I missed the issue where "We've just set core's minimum memory limit to 64M after lots of discussion, so opening this as a critical bug."
Where did we do that?
System requirements for Drupal can be found at http://drupal.org/requirements, which says "Drupal 8 core requirements are in flux, but setting to 128 MB should be enough. See issue."
Comment #10
catch
@YesCT see #2289201: [Meta] Make drupal install and run within reasonable php memory limits so we can reset the memory requirements to lower levels.
Comment #11
YesCT
Ah, I see. Updating the issue summary.
Comment #12
catch
Comment #13
xjm
I updated https://www.drupal.org/requirements/php#memory based on #2289201: [Meta] Make drupal install and run within reasonable php memory limits so we can reset the memory requirements to lower levels; that got missed before.
Comment #14
catch
Adding #1387438: Timeout on enabling modules: make it a batch operation as a child issue.
Comment #15
catch
Easy 500 KB-ish: #2502373: Don't rebuild the schema on module install.
Comment #16
catch
Re-tested after #2497017: Views::getApplicableViews() initializes displays during route rebuilding etc. (opcache disabled).
(to reproduce, install standard profile, enable actions module).
Tantalizingly close:
HEAD:
Submit: 65.73 MB
Form render: 52.96 MB
With #2392293: Refactor hook_system_info_alter implementations to use ModuleUninstallValidatorInterface applied:
Submit: 61.65 MB
Form render: 49.9 MB
That only just gets us under the memory limit, and #2495411: Make simpletest fail a test when it detects pages that need more than 64MB has flagged up that we may have other pages with problems, but it's nice to see progress.
Comment #17
catch
Just lost my form submit to a d.o 504 timeout.
#2392293: Refactor hook_system_info_alter implementations to use ModuleUninstallValidatorInterface and #2494989: Don't render main help pages on modules page just to generate help links - can lead to high memory usage on form submit both committed today.
After those, enabling the actions module with no opcode cache is down to 59.4 MB for the form submit and 48 MB for the subsequent page render.
Enabling all modules is still around 80 MB, but:
1. #1387438: Timeout on enabling modules: make it a batch operation is the issue to fix that; there's no reason not to do this in a batch.
2. For some reason, re-submitting the form succeeds - this might be the YAML file cache being primed by the first attempt, but I didn't look into it in depth. Either way, it's a very different problem from a single module enable failing.
Given that, I'm going to downgrade this to a major task. There are still some sub-issues here that are worth doing, but I'm happy with the 11 MB saving since opening the issue a couple of weeks ago.
Comment #18
catch
#1003788: PostgreSQL: PDOException:Invalid text representation when attempting to load an entity with a string or non-scalar ID introduced quite a large performance regression on cache misses as far as I can tell.
Loading an entity requires getting all field definitions for the entity type - this can end up taking 3.5 MB when installing a module, for example.
That change was mainly for postgres, so we might want to limit the casting to when postgres is in use.
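A hypothetical sketch of that mitigation (all names are illustrative, not the real SqlContentEntityStorage::cleanIds() API): only pay for the ID-type lookup, which is what pulls in the field definitions, when the active database driver is PostgreSQL and actually needs strict typing.

```python
# Hypothetical sketch of the mitigation floated above; names are
# illustrative, not the real Drupal storage API.

def clean_ids(ids, load_id_type, driver):
    """Cast entity IDs to their declared type, but only for PostgreSQL."""
    if driver != "pgsql":
        # MySQL/SQLite tolerate loosely-typed IDs, so skip the expensive
        # field-definition lookup entirely on the common cache-miss path.
        return list(ids)
    id_type = load_id_type()  # expensive: loads all field definitions
    if id_type == "integer":
        # Drop invalid text representations instead of letting the
        # database raise a PDOException.
        return [int(i) for i in ids if str(i).lstrip("-").isdigit()]
    return [str(i) for i in ids]

# The MySQL path never calls the loader at all:
assert clean_ids(["1", "2"], load_id_type=None, driver="mysql") == ["1", "2"]
# The PostgreSQL path casts and filters:
assert clean_ids(["1", "abc"], lambda: "integer", "pgsql") == [1]
```

The trade-off is driver-specific behaviour in entity storage, which is why it was only floated as an option here rather than done outright.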
Comment #19
catch
#19 #18 isn't quite right. Getting the field definitions is expensive, but cleanIds() is just one of several places that gets them.
Comment #20
dawehner
A self-referencing comment, aaah!
Comment #33
catch
#2495087: comment_entity_storage_load() is too expensive on cold caches is still open, but we don't need this meta to track that.
#3257725: [PP-1] Add a cache prewarm API and use it to distribute cache rebuilds after cache clears / during stampedes is new and attacks a lot of the same ground in a different way.