I am trying to understand the code of a contributed module. A controller runs a query and passes the result to loadMultiple(); calling print_r() on the loaded entities produces the following output.
I needed to sync Commerce product data to an external service and tried the API-client approach first: query Drupal's entity system from outside. It broke from store to store, because every store has different field configurations, entity references, and content types; an API client that works on one store fails on the next.
I want to run PHPUnit tests across all the sites of a multisite installation. But when I run the tests for site "B", everything still uses the default site ("A") as the database.
Does anyone have an example of how to configure this? Or is it not possible?
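One approach, in case it helps: Drupal's kernel and functional test base classes pick the target site up from environment variables (the ones core's phpunit.xml.dist exposes), not from sites.php, so you can run the same suite once per site with different values. The URL, database credentials, and module path below are placeholder assumptions, not values from any real setup:

```shell
# Point the test run at site B by overriding the environment variables
# that Drupal's test base classes read. (All values are placeholders.)
export SIMPLETEST_BASE_URL="https://site-b.example.com"
export SIMPLETEST_DB="mysql://user:pass@localhost/site_b"
export BROWSERTEST_OUTPUT_DIRECTORY="$PWD/sites/simpletest/browser_output"
# Then run the tests as usual, e.g.:
# ./vendor/bin/phpunit -c core modules/custom/my_module/tests
```

Repeating this with site A's values runs the identical suite against the other database, which is usually simpler than trying to make one test run span multiple sites.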
I have created an API endpoint in Drupal by configuring a View to fetch audit logs (Drupal 11).
By default, these Drupal APIs have no built-in rate limiting. In that case, is rate limiting expected to be configured entirely by the customer, and how can we configure such limits?
Additionally, is there any documented threshold at which Drupal performance may begin to degrade under high API request volume?
Could you please share any official guidance or best practices regarding rate limiting and performance considerations for Drupal APIs?
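For what it's worth, rate limiting for Drupal endpoints is usually applied in front of PHP rather than inside Drupal itself (core's flood service throttles things like failed logins, but nothing is applied to a Views-based endpoint out of the box). One common pattern is a reverse-proxy rule; the endpoint path, zone size, and rate below are illustrative assumptions, not recommendations:

```
# Hypothetical nginx fragment: throttle the audit-log endpoint to
# 10 requests/second per client IP, with a small burst allowance.
limit_req_zone $binary_remote_addr zone=auditlog:10m rate=10r/s;

server {
    location /api/audit-logs {       # assumed path of the View's endpoint
        limit_req zone=auditlog burst=20 nodelay;
        try_files $uri /index.php?$query_string;
    }
}
```

Excess requests get a 503 (configurable) before they ever reach PHP, which also helps with the performance-degradation concern, since Drupal never bootstraps for throttled traffic.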
Sorry for the subject line, but I didn't know how to sum the problem up.
First, my mistake: I developed a module called "es_filter_analyser" (with classes, a namespace, dependency injection, etc.), but when I uploaded it to Drupal.org I made a typo and it ended up with the machine name "es_filter_analyse".
It took me two months to realize my mistake.
So this morning, I tried to create a new project on drupal.org with the correct name "es_filter_analyser", and I pushed the identical code to it (by changing the repository URL in .git/config).
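As an aside, instead of editing .git/config by hand, `git remote set-url` does the same thing and is less error-prone. A disposable demo (the drupal.org URLs follow the usual `git.drupalcode.org/project/NAME.git` pattern, but the project names here are of course this specific case):

```shell
set -e
# Build a throwaway repo so these commands are safe to run anywhere.
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
# The old, misnamed project remote:
git remote add origin https://git.drupalcode.org/project/es_filter_analyse.git
# Repoint the clone at the correctly named project, then verify:
git remote set-url origin https://git.drupalcode.org/project/es_filter_analyser.git
git remote get-url origin
```

After repointing, a normal `git push origin <branch>` publishes the existing history to the new project, with no config file surgery needed.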