Problem/Motivation
Angular's resource() API (introduced in Angular 19) and the signals-based reactivity model make AI-powered components a natural fit for the Angular ecosystem — streaming responses, structured output, and reactive UI state without manual subscription management. However, there is currently no established pattern for wiring this into a Drupal contrib context: where does the LLM call live, how are API keys protected, how does the component receive its prompt context from Drupal, and how does this compose with PDB's existing component discovery model?
pdb_angular_entity already solves the Drupal-to-Angular data bridge (entity fields → drupalSettings → Angular @Input()). The same bridge can carry prompt context, configuration, and per-instance AI settings to Angular components that use resource() for LLM calls. A sub-module is the right scope — it keeps the AI layer optional, separately versioned, and out of the core entity display feature set.
Proposed resolution
Introduce pdb_angular_entity_ai as a sub-module of pdb_angular_entity. The sub-module would provide:
- A Drupal PHP controller acting as an LLM proxy — receives requests from Angular components, forwards them to a configured LLM provider (e.g. Gemini, OpenAI), and streams the response back via Server-Sent Events. Keeps API keys server-side. Drupal's access system controls who can call the endpoint.
- Angular components using resource() for streaming — consume the Drupal proxy endpoint. Reactive state managed via signals: loading, streaming chunks, final output, error. No manual RxJS subscription management.
- Per-instance configuration via drupalSettings — the existing pdb_angular_entity bridge carries prompt context, model configuration, and feature flags into each component instance. Components stay stateless and reusable.
- At least one reference component — a minimal working example (e.g. an entity-aware content summariser or a site-scoped chat widget) that demonstrates the full stack end-to-end.
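The streaming bullet above can be sketched in framework-free TypeScript: a small reducer that folds raw Server-Sent Events frames into the loading/streaming/done state a signal would expose. The applySseFrame name, the StreamState shape, and the `data: [DONE]` sentinel are assumptions for illustration (a common SSE convention), not an existing pdb_angular_entity contract.

```typescript
// Hypothetical state an AI component would expose via signals.
export interface StreamState {
  status: "loading" | "streaming" | "done" | "error";
  text: string;
}

// Fold one raw SSE frame into the state. Assumes the Drupal proxy
// emits `data: <chunk>` lines and a final `data: [DONE]` sentinel —
// an illustrative convention, not a documented contract.
export function applySseFrame(state: StreamState, frame: string): StreamState {
  const lines = frame.split("\n").filter((l) => l.startsWith("data: "));
  let next = { ...state };
  for (const line of lines) {
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") {
      next = { ...next, status: "done" };
    } else {
      // Append the chunk; a component would write this into a signal.
      next = { status: "streaming", text: next.text + payload };
    }
  }
  return next;
}
```

Keeping the reducer pure like this means the same logic works whether the component ends up on resource(), httpResource(), or a hand-rolled stream reader.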
The sub-module depends on pdb_angular_entity and reuses the existing presentation: angular component discovery model — no new discovery mechanism is needed.
Remaining tasks
- Resolve the LLM provider strategy — direct Gemini/OpenAI integration vs. delegating to the Drupal AI module for provider abstraction. The Drupal AI module approach is preferred if it covers streaming; needs investigation.
- Confirm that Angular's resource() API handles Server-Sent Events streaming cleanly, or whether a custom httpResource() pattern is needed
- Decide on Genkit — useful for agentic tool calling and structured output, but adds a Node.js build-time dependency that may not fit a Drupal contrib context. May be out of scope for v1.
- Define the sub-module's config schema — LLM provider settings, API endpoint path, per-component model overrides
- Scaffold the sub-module directory structure and Angular workspace integration alongside the existing ng_component/workspace/
- Community input on the PHP proxy approach vs. alternative patterns
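On the resource() question in the task list: resource() resolves a single value per request, so consuming an SSE body incrementally likely needs a manual ReadableStream reader underneath, roughly as below. readStream is a hypothetical helper name, not an Angular API; the onChunk callback stands in for pushing into a signal.

```typescript
// Read a streaming response body chunk by chunk, invoking onChunk
// for each decoded piece and resolving with the full text.
export async function readStream(
  body: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let full = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text); // e.g. update a signal for incremental UI rendering
  }
  return full;
}
```

If plain resource() turns out to buffer the whole response, a wrapper like this around fetch(...).body would be the fallback the "custom httpResource() pattern" task refers to.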
User interface changes
New admin settings form for pdb_angular_entity_ai — LLM provider selection, API endpoint configuration, global model defaults. Per-component AI settings are passed via the configuration: key in the existing pdb_angular_entity component .info.yml files.
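A per-component override might look like this in a component's .info.yml — the ai: key and its children are hypothetical names for illustration; only presentation: angular is an existing PDB convention:

```yaml
# summariser/summariser.info.yml (illustrative component)
name: Entity Summariser
presentation: angular
# Hypothetical per-instance AI settings, carried to the component
# through the existing drupalSettings bridge:
configuration:
  ai:
    model: gemini-1.5-flash
    system_prompt: 'Summarise the attached entity content.'
    streaming: true
```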
API changes
New Drupal route: /pdb-angular-entity/ai/stream (or similar) — POST endpoint accepting prompt + context, returning an SSE stream. New Angular service wrapping resource() for consumption by AI components. No changes to existing pdb_angular_entity APIs.
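The POST contract could be typed on the Angular side roughly as follows. The field names (prompt, context, model) and the CSRF header are assumptions, since the issue leaves the exact payload undefined:

```typescript
// Hypothetical shape of the POST body for the streaming endpoint.
export interface AiStreamRequest {
  prompt: string;
  // Entity-derived context injected through the drupalSettings bridge.
  context?: Record<string, string>;
  // Optional per-instance override of the configured default model.
  model?: string;
}

// Build the fetch Request for the proxy endpoint. The X-CSRF-Token
// header is a placeholder for whatever Drupal's access layer requires.
export function buildStreamRequest(
  endpoint: string,
  body: AiStreamRequest,
  csrfToken?: string,
): Request {
  return new Request(endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(csrfToken ? { "X-CSRF-Token": csrfToken } : {}),
    },
    body: JSON.stringify(body),
  });
}
```

A thin Angular service could pass such a Request to fetch() and hand the response body to the streaming layer, keeping the existing pdb_angular_entity APIs untouched.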
Data model changes
New config object pdb_angular_entity_ai.settings — LLM provider, API key (stored via Drupal's key management or environment variable), default model, streaming enabled flag. No database schema changes.
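The settings object might serialize as follows — the key names are illustrative, not a committed schema:

```yaml
# config/install/pdb_angular_entity_ai.settings.yml (hypothetical schema)
provider: gemini              # or 'openai', or a Drupal AI module provider id
api_key_id: pdb_ai_key        # Key module entity id; never the raw key value
default_model: gemini-1.5-flash
streaming: true
endpoint_path: /pdb-angular-entity/ai/stream
```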