I noticed database backups for a D8 site in production started to get incredibly large (before launch, the database was a few MB in size; now it's more than 10 GB—with no substantial change in the amount of content, users, or blocks on the site in that time).

Glancing through the database, I noticed that after launch, the cache_render table has grown enormous—currently 9.6 GB and growing—and the only way to trim its size is to run a full cache clear (e.g. drush cr).

It seems others have run into this issue as well:

When I was digging through the table, I noticed a couple of blocks with Views exposed forms seemed to take up 100,000+ rows each; I'm guessing there's a row for every exposed-filter permutation on every page/URL where the block appears...

Two questions as a result:

  1. Should this cache table ever be cleaned up / garbage collected? (Or do I need to run a cache clear on a regular interval...?)
  2. Is there any way to easily exclude a given block or other renderable object from being cached in the cache_render table?
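Regarding question 2: one approach that I believe works with the D8 render API is setting cacheability metadata directly on the render array. A sketch (the variable name and markup are just illustrative; the `#cache` key and `max-age` property are standard render API):

```php
<?php

// Sketch only: a render array whose max-age is 0 is treated as
// uncacheable, so it should not be persisted to the cache_render
// table (though it may still be render-cached within the request).
$build = [
  '#markup' => 'Volatile output that should not be cached',
  '#cache' => [
    // 0 = never cache; a positive value is a TTL in seconds.
    'max-age' => 0,
  ],
];
```

Note this is a blunt instrument—it disables caching for that element entirely rather than just keeping it out of the database backend.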

Comments

geerlingguy created an issue. See original summary.

geerlingguy’s picture

Adding related issue: #1947852: Database cache backend garbage collection does not work with permanent entries — it looks like if we had that feature, we could set items to expire in hours or a day or something rather than never expire.
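In the meantime, a crude interim workaround (my own stopgap, not an official recommendation) would be clearing caches on a schedule via cron. A sketch of a crontab entry, assuming a hypothetical site root of /var/www/example:

```
# Clear all Drupal caches nightly at 03:00 to keep cache_render in check.
0 3 * * * cd /var/www/example && drush cr
```

Obviously this trades away warm caches once a day, so it only papers over the lack of real garbage collection.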

wim leers’s picture

Status: Active » Closed (duplicate)

Doing what #3 says.