Problem/Motivation

Every page view fires a POST to the emit endpoint, even when the same visitor views the same content repeatedly. On high-traffic sites this generates significant unnecessary load. A user refreshing a page or browsing back and forth will emit incidents for the same entity every time.

The D7 version of this module had flood protection via a radioactivity_flood_map database table, but that still required a server round-trip for every page view to check the map.

Proposed resolution

Client-side flood protection using localStorage. Before firing the emit POST, the JavaScript checks whether the same entity was already emitted within a configurable time window. If so, the emit is skipped entirely — no request reaches the server.

This would be a per-field formatter setting (flood_interval, in seconds, default 0 = disabled) alongside the existing energy setting, following the same pattern as the incident sampling ratio in #3032540: Configure sample percentage to reduce server impact.

  1. In triggers.js: Before emitting, check localStorage for a key like ra_flood_{entityType}_{entityId}. If it exists and hasn't expired, skip. Otherwise set the key with the current timestamp.
  2. Add flood_interval to formatter settings, pass to drupalSettings.
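The check in step 1 could look roughly like the sketch below. The key format ra_flood_{entityType}_{entityId} is from the proposal above; the function name and the injected storage parameter are hypothetical (in triggers.js it would just use window.localStorage, and flood_interval would come from drupalSettings).

```javascript
// Sketch of the client-side flood check. `storage` is injected so the
// logic can run outside a browser; in triggers.js this would be
// window.localStorage. `floodInterval` is the formatter setting in
// seconds; `now` is a millisecond timestamp (Date.now() in practice).
function shouldEmit(storage, entityType, entityId, floodInterval, now) {
  if (!floodInterval) {
    return true; // flood_interval = 0 disables flood protection.
  }
  const key = 'ra_flood_' + entityType + '_' + entityId;
  const last = parseInt(storage.getItem(key), 10);
  if (!isNaN(last) && now - last < floodInterval * 1000) {
    return false; // Same entity emitted within the window: skip the POST.
  }
  // Outside the window (or first view): record the timestamp and emit.
  storage.setItem(key, String(now));
  return true;
}
```

Only when shouldEmit() returns true would the existing emit POST be fired, so a repeat view inside the window never reaches the server.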

No server-side changes needed. No changes to Incident.php, the emit endpoint, or the storage layer.

Trade-offs

localStorage is per-browser: clearing storage or switching browsers resets the flood window. I feel this is acceptable for popularity measurement, where the goal is reducing noise, not enforcing strict per-user limits.

I can work on this if the maintainers think it's a good idea.

Comments

loze created an issue. See original summary.
