Problem/Motivation

For background on progressively decoupling Drupal during 8.x, read Dries' blog post and #2645250: [META] Supersede Backbone in core admin UIs with a new client-side framework. In this issue, however, I'd like to explore what Drupal 9 might look like and how to get there.

In my opinion, Nicholas C. Zakas' post from 2013, Node.js and the new web front-end, is exactly correct in identifying that much of the web's (including Drupal's) current architecture of JS for client-side UI and PHP for server-side UI plus business logic (diagram) is not conducive to building great UIs, because it splits the UI code of a single website into two languages and results in barriers to front-end developers "owning" the server-side portion.

Proposed resolution

That blog post proposes that a better architecture is for client-side JS UI code to communicate over HTTP to server-side JS UI code and for that in turn to communicate over HTTP to PHP business logic code (diagram). I agree with that proposal, and furthermore, propose that it's time for Drupal to migrate to that architecture, which means:

  1. Make Node.js a hosting requirement for Drupal 9. Cheap (e.g., $5/mo) Node.js hosting plans already exist (https://www.openshift.com/ even has some free ones), and I think the options will increase significantly in the roughly three years between now and Drupal 9's release, especially with WordPress's recent release of Calypso, which could create a market of millions of small website owners that mass hosting providers will want to attract.
  2. Pick a JS framework to start rewriting some core UI code with. See #2645250: [META] Supersede Backbone in core admin UIs with a new client-side framework. Note that it's entirely possible that whatever is picked in 2016 won't end up being the one we want to use for Drupal 9's release. That's OK: during Drupal 8's development, we started WYSIWYG work with the Aloha editor, then switched to CKEditor once that became better suited to Drupal's needs. But we need to pick something to start the process with.
  3. #2608062: [policy, no patch] Pre-requisites for opening Drupal 9 proposes to minimize the delta between 9.0 and 8.LAST, perhaps going as far as making 8.LAST contain everything that 9.0 contains such that 9.0 is solely the removal of BC layers from 8.LAST, much like Symfony 3.0/2.8. Therefore, if we want to apply that approach to this issue, we'd need to figure out how to get optional, incremental pieces into minor versions of 8.x. That would also allow people with Node.js on their server to start using the Node.js-driven UIs within their 8.x sites. And then in 9.0.0, we can remove the old PHP-based UIs.
  4. #1804488: [meta] Introduce a Theme Component Library, #1843798: [meta] Refactor Render API to be OO, and #1447712: Evaluate Symfony form component as a replacement for Drupal FAPI are issues about modernizing things in the theme, render, and form systems that we didn't get to for 8.0.0. What I'm basically suggesting in this proposal is that we do that modernizing in Node.js rather than in PHP.

Remaining tasks

  • Discuss this proposal to see if it has sufficient community buy-in.
  • If it does, create an initiative to organize this huge undertaking, similar to Drupal 8's Configuration Management and other initiatives.

User interface changes

TBD

API changes

TBD

Data model changes

TBD

Files:
#77: wordpress_calypso.png (187.51 KB) by corbacho

Comments

effulgentsia created an issue. See original summary.

Wim Leers’s picture

I'll ask the question that I think will be on many people's minds: Will this effectively mean rewriting Drupal in JS?


The issue title specifically says UI layer. So, at first sight, the answer is "no".

But rewriting UIs in JS implies that some subsystems/components will have to be rewritten in JS also. For example UIs showing formatted text (formatted/filtered by the filter system) will want to allow for live previews (when writing a blog post or a comment). Which makes rewriting the filter system in JS a hard requirement:

  1. remaining in PHP requires round trips to the server (slow)
  2. requiring everything to be written in both PHP and JS in exactly the same way is both brittle and painful.

This then means PHP code using the filter system would have to call JS code (once the filter system is rewritten in JS). And likely there are more examples like this.
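To make the duplication concern concrete, here is a hypothetical client-side reimplementation of a simple text filter of the kind a live preview would need. The function name and regex are illustrative stand-ins; any behavioral divergence between this and the PHP original is exactly the brittleness described above:

```javascript
// Hypothetical JS port of a simple URL-to-link text filter for live
// preview. The real PHP filter handles many more edge cases, and keeping
// the two implementations in sync is the maintenance burden at issue.
function filterUrl(text) {
  // Naive conversion: wrap every bare http(s) URL in an anchor tag.
  return text.replace(
    /\bhttps?:\/\/[^\s<]+/g,
    (url) => `<a href="${url}">${url}</a>`
  );
}
```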

In fact, the issue summary already explicitly says this (but doesn't talk about the consequences):

[…] are issues about modernizing things in the theme, render, and form systems that we didn't get to for 8.0.0. What I'm basically suggesting in this proposal is that we do that modernizing in Node.js rather than in PHP.

This is already saying that the render system and form system would have to be rewritten in JS. Form validation and submission must happen on the server side. In other words: not just a decoupled UI layer would need to be written in JS, but also at least some of the back end.

Consequences:

  1. the code outside of the UI layer (the back end) will contain both PHP and JS
  2. it will be both logical and tempting to move ever more components to JS

Where/how will we draw the line?

My conclusion: without clear boundaries, the answer to the question will become "yes".


The answer we eventually arrive at for "Where/how will we draw the line?" will define the mechanism by which, and the boundary at which, we decouple. Perhaps it will be GraphQL. Perhaps it will be something else.

catch’s picture

Wim's point was also one of my first thoughts when effulgentsia showed me an early version of this idea.

For example, forms: there are two parts to Form API, the form callback that creates the form array, and the form rendering/submission subsystem that takes that definition and handles everything else. At what point do you move to JS?

- From a form display object via entities? That just sends the form configuration over, but widgets etc. then have to be 100% JS + REST.
- From a static config object or entity? Same problem.
- From a JSON representation of a form array?

If it's either of the first two, then there's a lot of information to communicate and logic to rewrite (form element descriptions, multistep etc. all have to be implemented in js, widgets would need to be in js then call back to REST for default/allowed values etc).

If it's the latter, then there's potentially less to implement in js, but it's not going to replace 'form API' as such so there's more chance for duplication.

If we want to allow 100% decoupled Drupal, then we'll need to provide a REST API for the first categories though - i.e. to enable someone to build a proper UI based on entities and fields, form display modes need to be exposed via REST, otherwise you lose the ability to configure the form at both ends. For me, that seems like something we'd want to enable anyway, regardless of where core eventually settles.
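The third option above (a JSON representation of a form array) can be sketched roughly like this. The wire format shown is invented for illustration, not an actual Drupal serialization:

```javascript
// Hypothetical JSON serialization of a form definition, plus a minimal JS
// renderer. Multistep, validation, #states etc. are deliberately omitted;
// they are the "lot of logic to rewrite" discussed above.
const formDefinition = {
  id: 'node_article_form',
  elements: [
    { type: 'textfield', name: 'title', label: 'Title', required: true },
    { type: 'textarea', name: 'body', label: 'Body', required: false },
  ],
};

// Render one element definition to an HTML string.
function renderElement(el) {
  const required = el.required ? ' required' : '';
  if (el.type === 'textarea') {
    return `<label>${el.label}<textarea name="${el.name}"${required}></textarea></label>`;
  }
  return `<label>${el.label}<input name="${el.name}"${required}></label>`;
}

// Render the whole form definition to an HTML string.
function renderForm(def) {
  return `<form id="${def.id}">${def.elements.map(renderElement).join('')}</form>`;
}
```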

Wim Leers’s picture

If we want to allow 100% decoupled Drupal, then we'll need to provide a REST API for […]

Just REST won't be sufficient. A decoupled front end (UI) communicating with the back end (content & config repository) requires far too many requests (and will hence be slow). Unless you create specialized REST endpoints for every bit of the UI. At which point GraphQL (or something like it) becomes not only appealing, but crucial.
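The request-count argument can be illustrated with a hypothetical admin screen: generic REST endpoints need one round trip per resource, while a single GraphQL-style query fetches the same data at once. All paths and the query shape here are assumptions, not real Drupal or GraphQL-module endpoints:

```javascript
// Rendering one screen (a node, its author, its comments) from generic
// REST endpoints: one HTTP request per resource.
const restRequests = [
  '/api/node/1?_format=json',         // the node itself
  '/api/user/3?_format=json',         // its author
  '/api/comment?node=1&_format=json', // its comments
];

// The same data expressed as a single GraphQL-style query: one round trip.
const graphqlQuery = `{
  node(id: 1) {
    title
    author { name }
    comments { subject }
  }
}`;
```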

Crell’s picture

There's two distinct pieces here, which I think we need to separate as I feel very strongly differently about them.

Part 1, I think, is what's being touched on in #2645250: [META] Supersede Backbone in core admin UIs with a new client-side framework. That is, we want a more interactive, responsive, performant UI with more of a Single-Page-App feel for admins, because that's what users now expect from Facebook et al (rightly or wrongly). On the whole, I agree in concept; aggressively *progressively enhancing* our admin experience to require fewer full page loads and give faster feedback I fully support, especially if we can do it in ways that enhance smoothly. Specifically, I disagree with the implication in Dries' comment in #2645250-14: [META] Supersede Backbone in core admin UIs with a new client-side framework that an SPA-only admin is acceptable, if for no other reason than it would torpedo a lot of the accessibility work we've done. Good accessibility is not impossible with an all-JS approach, but it's certainly much harder. It also makes proper testing more difficult, and when it breaks it is likely to break much more catastrophically.

Still, there's good reason to put a lot more effort into the smoothness and polish of the admin experience, and even potentially things like comments. (Side note: Does anyone use comments anymore, other than d.o and personal blogs? I don't think any of my clients have used it in about 7 years.)

Part 2, though, suggests that we need Node.js for that. On that point, I cannot disagree more strongly. I'm sympathetic toward the "only one language to learn" argument, and have toyed with tools like Meteor.js that fully couple the front and back ends in JS, but I find it overblown. For one thing, unless all of Drupal gets rewritten in JS (pleasegodno) you'll still need to know two languages to work with Drupal. Right now a client developer can get by with knowing PHP and a sprinkling of JS. Most of our JS is controlled from the server side via the Ajax API, so we can do quite a bit without needing to write JS ourselves. Requiring Node.js not only increases the barrier to entry in terms of knowledge (you now need to get really good at a language very different from PHP in order to work on the admin) but also in terms of tooling (just how many package managers does JS have now?). And even then, the proposal would add an extra HTTP hop (from Node.js to PHP), which would harm the very performance we're talking about.

Moreover, Node.js is entirely unnecessary in this day and age to get that kind of responsiveness! What's needed isn't Javascript; what's needed is websockets and/or HTTP 2 instead of ancient CGI. There's no need to step out of PHP to do that, and browser support at this point is surprisingly good (and will be universal by the time D9 is actually a thing).

In fact, I gave an entire presentation at DrupalCon Barcelona last fall on this *exact* topic: https://events.drupal.org/barcelona2015/sessions/drupal-2020

PHP already has multiple tools working on persistent-connection, persistent-daemon, really-really-fast async services, exactly of the kind we're talking about here. ReactPHP is already the de facto standard way to do websockets in PHP, and is architecturally similar to Node.js (and is just as fast). Icicle.io uses generators and coroutines to provide a much nicer user experience. Forking is still a viable thing. HHVM has async primitives in the language, and there's talk of PHP 7.x getting them at some point. PHP FIG has been putting together working groups to standardize promises and event loops, two of the key elements of any such system, led by the very people building React, Icicle, etc.

PHP can do this. And we then do not need to reimplement any parts of Drupal's server-side in JS. *Anything* we have that is thread-safe can be used by the websocket side just as easily, without any reworking. We just need a parallel harness for the same underlying code. And we can continue to work with the rest of the PHP community and help drive the platform itself forward. It also leaves open the potential for a future Drupal to be running mostly as an async daemon if that's what makes sense longer term; because it's all clean PHP, we can move back and forth between the REST side (core today) and an async/websocket side (React, Icicle, or similar) quite freely depending on what we decide works out best, and if we do it really right individual sites could even adjust that balance. (Likely high-end sites only, but still on the table.)

The one caveat to all of that is the phrase "thread-safe". Not all PHP code that exists will run safely in a shared environment. The code that can do so is... stateless, decoupled, cleanly injected services and value objects that follow functional principles. Exactly what we've just spent 5 years pushing toward! We're not entirely there yet, but we're much much closer to it. And going the rest of the way is something that has benefit even without talking about async, that's just an added benefit. (Or async is the point and vastly improved testability and grokability are the added benefit. Take your pick.)

tl;dr: Big +1 to giving Drupal async/websocket support and using that to beef up the admin experience, among other things. Massive -1 to using Node.js to do it. Using PHP for that is not only possible, but a far superior approach in almost every way.

phenaproxima’s picture

Crell, you have perfectly articulated my own feelings about this.

I would rather not see Drupal depend on Node.js. I imagine that such a seismic change would cause a lot of wheel-reinvention and alienate many members of the community (from what I've heard, there are a lot of folks who really hate JavaScript).

The main benefit of Node, as far as I can tell, is its asynchronous event loop. If this is already doable in PHP (and support will only improve with time), that's a pretty strong argument to stick with PHP.

As Crell points out, the main limitation of Drupal's UI is that it's built on the old CGI request-response paradigm, rather than continuous low-latency bidirectional communication between client and server. That's what we need in order for Drupal to feel fast and responsive. If we take that approach, I believe we could introduce it piecemeal, possibly even in the 8.x cycle (by adding support for async and Websockets to Drupal's existing JS layer), which might allow some best practices to emerge.

+1 for async and Websockets. -1 for Node.

mdrummond’s picture

+1 zillion to crell's proposal in #6.

If our goal is a snappier interface, I would much rather get there by taking advantage of emerging technologies that can be used within our existing stack rather than by rewriting our theme system for questionable benefit.

I think it's great that Drupal 8 makes it easier to create decoupled sites. I think that's a far cry from deciding that every single Drupal site should be decoupled, which is the essence of the Node.js proposal.

Stepping back, another huge point to keep in mind is that we just went through a huge round of decision paralysis around Drupal while we waited for Drupal 8. Clients understandably were uncertain about moving forward with Drupal projects when such significant changes were on the horizon. Moving towards a complete rewrite of the theme system in JS would reintroduce exactly that sort of quandary that could paralyze clients moving forward with Drupal projects.

While the suggestions crell mentions would also involve a lot of new technologies, they would be in keeping with the current direction of Drupal.

podarok’s picture

+1 websockets, -1 NodeJs
Sorry I missed the session in Barcelona, but the comment makes a lot of sense and captures my thoughts exactly.

pwolanin’s picture

It sounds like either approach is going to be a major initiative and need a team that's willing to experiment and work on this for an extended time.

At first read, I would certainly also favor using a long-running PHP process over rewriting all the things in JS.

rlmumford’s picture

Again, +1 to crell's proposal.

What I would like to see is a persistent, bootstrapped Drupal with websockets.

yched’s picture

2 cents:

I've been playing with Icicle after seeing @Crell's Barcelona session linked above, and I totally dug it. The idea of having a Drupal being able to serve HTTP requests and websockets in a long-running, event-loop-based process is sure exciting (and also quite daunting - we currently have static cache patterns all over the place, in a "whatever, this will get cleared at the end of the current request anyway" way of thinking; also, as @Crell points out, the Entity / TypedData system is quite stateful at the moment)

But moving that way still leaves the issue of split templating languages/syntax between decoupled front-end parts and non-decoupled back end parts, with the "duplicate templates" major drawback it entails.
I can't say I'm a huge believer that a Twig.js approach would actually buy us much, given the amount of drupal-specific lingo we put in our Twig. Also, the real logic of a theme hook often lies in the preprocess (PHP) code as much as in the template, so being able to use a common template file in both server PHP and the browser JS doesn't really solve much IMO.

I'm not sure both approaches are exclusive though. Working towards being able to run the PHP parts of drupal in a long-running process that serves HTTP requests and websockets is one thing. Working towards moving the HTML generation to a Node process is another thing.

moshe weitzman’s picture

I'm not sure both approaches are exclusive though. Working towards being able to run the PHP parts of drupal in a long-running process that serves HTTP requests and websockets is one thing. Working towards moving the HTML generation to a Node process is another thing.

I think the issue is that the Drupal ecosystem only has a limited amount of sponsorship money. We need to spend wisely. It would be ideal if these efforts could proceed unfunded, but D8 experience suggests that money is needed to fund major projects like this.

I think that we need a wiki which better details Daemon Drupal and especially lists the challenges involved. A prototype, even a hackish one, would help illustrate the challenges. There's great example code in Crell's presentation but I'm thinking of something more Drupalish.

Crell’s picture

yched: While client-side templating is all well and good, re-rendering a block, or a view mode of a node, and then pushing just that string back through a websocket keeps the render logic server-side but is still going to be far faster than we can do things now. In fact, that's what our Ajax API *already does*, over HTTP 1.1. Simply improving that (using a web socket and persistent daemon so that we don't have a full bootstrap, adding some more robust API commands...) could still get us a big performance improvement for SPA-ish behavior with far less invasive changes.
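The pattern described here (server-side rendering pushed over a websocket, with the browser only swapping strings in) might look roughly like this on the client side. The {selector, html} message shape is an assumption, not the actual Ajax API command format:

```javascript
// Hypothetical client-side handler: the server re-renders a block and
// pushes the HTML string; the browser just swaps it into the page.
// Returns true if a matching target element was found and updated.
function applyServerRender(message, doc) {
  const target = doc.querySelector(message.selector);
  if (target) {
    target.innerHTML = message.html;
  }
  return Boolean(target);
}

// In the browser this would be wired up as, e.g.:
// new WebSocket(url).onmessage =
//   (e) => applyServerRender(JSON.parse(e.data), document);
```

The render logic stays entirely server-side; the client needs no templating at all, only a place to put the string.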

Edit: Fix embarrassing typo.

dawehner’s picture

One of the most time-demanding parts of the D8 cycle was not the technology, but rather teaching the community, for example about proper OOP. If we went with experimenting with Node.js, this education step would most probably be far larger than for rewriting parts of Drupal in non-blocking PHP.

More important, though, a lot of concepts like statelessness are shared between both solutions, so going with a PHP-based solution doesn't make server-side JS impossible; it even makes it more feasible in the long run, provided the community learns these concepts and applies them to Drupal. Once parts of Drupal are usable outside our old world, going to JS would actually be doable.

Jaesin’s picture

+1 websockets, -1 NodeJs

Crell++
dawehner++

To dawehner's point, I've seen a lot of module ports that are still mostly procedural. It's taking some contributors a bit of time to come up to speed on D8 design patterns. That's understandable, since they are dealing with a full-time job supporting D7 sites.

Even though a lot of folks seem to think server-side JS is a new thing, it's been around for a long time (see "Evolution of Server-Side JavaScript"). IIS + JScript, remember that? There have always been performance issues.

Dealing with tuning PHP, D8, vendor libraries, Node, and Node libraries seems like a lot to ask of a non-profit client.

Netflix Example

Sounds like a nightmare.

Jose Reyero’s picture

I suggest we also rename Drupal to Jrupal. But really, if we are going to rewrite Drupal, I'd rather go for Python.

Now seriously (though I am serious about Python): I think there are very interesting ideas in this thread, and while I'm no fan of server-side JS (nor of PHP, anyway), I'd like to see something like this working.

But... the words that ring alarm bells are always "replace" and "rewrite". My question: why can't this be a contrib module that works side by side with Drupal core?

I know it is always easier just to "get it into core", but really, these could be separate components that live in a contrib module (even if the module replaces core functionality).

So, if it can be contrib first, it should be. Then it may be an easier sell for Drupal 9. But really, unless I've seen this thing working and being used for something useful, I would strongly oppose this "interesting idea" replacing any of Drupal's already-working functionality.

-1 to "core initiative" or core-anything, +1 to contrib-anything

bojanz’s picture

There's no doubt that there's a lot of value in using JS on the server side. The node people realized it years ago.
But throwing away a large (majority?) percentage of Drupal in favor of JS code means that it's not Drupal anymore.
Thus, such a thing needs to be done in a separate project (with the added benefit of not having Drupal's slowness in decision making and execution).

mdrummond’s picture

#13:

I think the issue is that we only have a limited amount of sponsorship money. We need to spend wisely. It would be ideal if these efforts could proceed unfunded but D8 experience suggests that money is needed to fund major projects like this.

I am not aware of any post-D8 launch fund for this sort of initiative, so can I assume this means Acquia funding? So far the proposals I've seen ranging from "let's start experimenting with having a JS framework in core" to "let's replace the theming system and maybe more with JS" have all come from people associated with Acquia. Not that others might not also be interested in that, but the actual proposals seem to be starting with Acquia. So is what's being said implicitly that Acquia is going to fund developers to work on all or many of the issues regarding transforming some or all aspects of the theme system? (Obviously with the hope that others would be interested in joining in, but there seems to be a lot of community pushback so far, so who knows how many would be interested in helping with that.) I assume it would not be difficult to commit those JS issues given the numbers of core committers at Acquia.

I want to be very clear: I am extremely appreciative of all that Acquia gives to the Drupal community. We could not be doing nearly so much without Acquia's support.

I ask because I think others are also wondering if these proposals are going to move forward at the direction of the office of the CTO regardless of community consensus, or if this is genuinely along the lines of "Well this is something we could do, but only if the community agrees this is the direction we want to head." I hope it is the latter.

dawehner’s picture

Thus, such a thing needs to be done in a separate project (with the added benefit of not having Drupal's slowness in decision making and execution).

An alternative approach would be to realize that Drupal is not the CMS, but rather the community. Given that, the Drupal community could produce several PHP libraries, a CMS, as well as some server-side rendering in JS.

bojanz’s picture

@dawehner
True, that's a good point.

catch’s picture

Just REST won't be sufficient. A decoupled front end (UI) communicating with the back end (content & config repository) requires far too many requests (and will hence be slow). Unless you create specialized REST endpoints for every bit of the UI. At which point GraphQL (or something like it) becomes not only appealing, but crucial.

@Wim, so this bit I think is worth working on - proper REST endpoints that are viable for a fully functioning admin, and GraphQL to reduce the need for HTTP requests. Anything we do for those is immediately useful regardless of what a specific core implementation might end up looking like.

Crell’s picture

In concept at least, adding CRUD support for config objects serialized to JSON is not hard at all. That may or may not be sufficient for what we want to do here, but at the infrastructure level it's totally possible today if someone wants to put in a few days worth of work.

(I'm talking about basic GET/POST/PUT/DELETE here. I don't know how we'd manage hypermedia links for config objects, but since we don't do hypermedia links that well on content entities either yet that's not a major blocker.)
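A rough sketch of what such JSON-serialized config CRUD requests could look like; the /api/config path and the request shape are assumptions for illustration, not a real route or client:

```javascript
// Hypothetical builder for basic GET/POST/PUT/DELETE requests against
// config objects serialized to JSON, as described above.
function configRequest(method, name, data) {
  const req = {
    method,
    path: `/api/config/${encodeURIComponent(name)}`,
    headers: { 'Content-Type': 'application/json' },
  };
  // Only write operations carry a JSON body.
  if (data !== undefined) {
    req.body = JSON.stringify(data);
  }
  return req;
}
```

For example, updating the site name would be a PUT of the serialized config object to its own resource path.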

Performance would be better for some things if done over a websocket, but that comes down to a case-by-case question of whether that's even needed.

Core question: What is the problem being solved? Certainly we can do more things with ReactPHP or Icicle on the server side, and I am fully in favor of getting Drupal websocket-ified, but what features do we want to use it for that are not sufficiently served by traditional REST? (I know they exist, but I want to get effulgentsia's input on what he's proposing to solve.)

beejeebus’s picture

Let's not be so scared of HTTP requests. As we're talking futures, we're talking HTTP/2. They're not as scary as they used to be.

Evented PHP etc.: I see little of value here without wholesale changes to PHP itself. *No IO* can be used without either blocking everything or being rewritten to use the event loop. That rules out most existing PHP libraries. Unless you use threads, which rules out most PHP extensions and, well, most PHP code, because who here can say how the PHP code they've written will behave with threading? This is system-level threading, so we have to know about stack vs. heap memory, critical sections, etc. "Clean, 'stateless' OO will make it easier" is just architecture-astronaut hand-waving.

Websockets: they are not request-response, which is much of what we want. (Much of the rest is BigPipe, and doesn't require websockets at all.) You can't assume that any message a websocket client receives is a reply to any message it sent without building a custom, hand-built protocol on top. Also, websockets don't reuse TCP connections (one per tab) as efficiently as HTTP/2 (one per browser). Please stop with the hand-waving.
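The "custom, hand-built protocol on top" point can be made concrete: correlating websocket replies to requests requires something like an id field, since the transport itself has no request/response pairing. The message shape here is invented for illustration:

```javascript
// Minimal hypothetical correlation layer over a websocket-like socket:
// every outgoing request gets an id, and replies are matched back to the
// waiting callback by that id.
let nextId = 0;
const pending = new Map();

// Send a request, remembering the reply callback under a fresh id.
function sendRequest(socket, payload, onReply) {
  const id = ++nextId;
  pending.set(id, onReply);
  socket.send(JSON.stringify({ id, payload }));
  return id;
}

// On any incoming message, route it to the callback waiting on its id.
function handleMessage(raw) {
  const msg = JSON.parse(raw);
  const onReply = pending.get(msg.id);
  if (onReply) {
    pending.delete(msg.id);
    onReply(msg.payload);
  }
}
```

This is exactly the bookkeeping that plain HTTP (and HTTP/2) gives you for free.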

I'm interested in Wim's questions in #3 and catch's reply in #22.

xjm’s picture

Stepping back, another huge point to keep in mind is that we just went through a huge round of decision paralysis around Drupal while we waited for Drupal 8. Clients understandably were uncertain about moving forward with Drupal projects when such significant changes were on the horizon. Moving towards a complete rewrite of the theme system in JS would reintroduce exactly that sort of quandary that could paralyze clients moving forward with Drupal projects.

Thanks @mdrummond for bringing this up. An important difference between D7-D8 and D8-D9 (in theory and hopefully in practice) is that D8 has minor versions and D9 is not going to even open for development for quite a while. The whole point of the new release cycle is that the main thrust of our development efforts go into predictable, scheduled, BC-safe D8 minors, with Drupal continuously improving rather than waiting on all the magic improvements unknown years in the future.

I'm not saying I necessarily endorse this proposal in any way, shape, or form (TBH it fills me with terror), nor that I necessarily think such a change would be a good fit for the next major version. I do think Drupal needs to continue to evolve with the web in general. And while I don't speak for @effulgentsia, I am positive it is just a proposal -- a provocative, possibly crazy one that might be answered with a resounding "uh nope, that is not a good idea". Finally, from my understanding of the governance policy, theme system maintainers would need to be given the opportunity to sign off on significant changes to said theme system, and I imagine themer experience would be a very important consideration.

effulgentsia’s picture

Status: Needs review » Active

And while I don't speak for @effulgentsia, I am positive it is just a proposal

Yep, and to make that clearer, setting this to "Active" instead of "Needs review". In the case of core policy issues, I'm not really all that clear on the difference (for issues with patches, the difference is usually whether there's a patch or not), but if nothing else, there's already a lot of feedback on this issue I need to respond to.

benjy’s picture

An alternative approach would be to realize that Drupal is not the CMS, but rather the community. Given that, the Drupal community could produce several PHP libraries, a CMS, as well as some server-side rendering in JS.

Although, much like many people hope bringing in new frontend frameworks will attract more frontend developers, changing most of the backend to JS may push away much of the current developer community. I for one would not be interested in contributing to a Drupal that was primarily JS.

chx’s picture

What's missing from this discussion, for me at least, is the clear-cut massive win counterbalancing the massive hit this would make on the Drupal community.

webchick’s picture

IMO at least part of it would be the same massive win we got (or hope to get) from "getting off the island" with PHP. There's a massive developer community around JavaScript that's only growing, and there's a massive trend in site building towards "the rise of the front-end developer." I can try to dig up some stats later, but compare the number of Node modules to the number of CPAN modules in one-fifth the time, for example, or the growth of GitHub projects in JS compared to any other language. Just like the hotbed of innovation around PHP at the time it was selected, JavaScript is in a very similar place today.

The main question would be whether we can do enough work on the front-end in D8 to grow a community of JS enthusiast contributors who could pull something like this off, because I agree with you that the current PHP-centric community would most likely not do any of that work and focus on other endeavours on the API side. OTOH, they might find that professionally within the next 2-3 years they have no choice but to embrace JS a lot more, given where the web/apps/etc. are going.

I do also think it's a bit of a crazy proposal, and I'm not necessarily advocating for it since we just re-wrote our whole damn CMS already ;), but I can certainly see the potential benefits if we could muster the collective will.

Crell’s picture

There's a key difference though, webchick. When PHP began reinventing itself into a more cohesive, collaborative, component-based community there was pushback, yes, but generally there was a consistent direction, and the big and influential names in the community were at the forefront of that transition. Generally speaking, projects that developed a decent following didn't die out soon after. (Symfony and Zend are both going strong, even as they've been joined by others like Laravel that build on them.)

In contrast, the JS community publishes articles daily on what a total mess they're in with incomprehensible toolchains. (I read about one a week lately.) The idea of a "JS-based front-end" is met with pushback and outright condemnation from most of the established names in the front-end world because it directly flies in the face of web standards, accessibility, progressive enhancement, performance, crappy-network-friendliness, etc. I do not even remotely see "the web" going to the point that a 200 KB JS library is a requirement for most pages. Before that it was Silverlight. Before that Flex. Before that Flash. All of these front-end-coding-so-you-don't-need-HTML movements come and go and get proven wrong in the long term. So:

OTOH, [backend developers] might find that professionally within the next 2-3 years they have no choice but to embrace JS a lot more, given where the web/apps/etc. are going.

I couldn't disagree more. You're basically saying that Node.js is going to kill off PHP, which the Node.js folks have been predicting for years and it still hasn't happened. :-) Async killing off CGI? That I could see, and would rather like to see, but that has nothing to do with Node.js or Javascript per se.

Wim Leers’s picture

# of Node modules

>95% of them are "me too!" projects.

(I didn't even realize this until I read it somewhere.)

If I were being snarky, I'd say this is basically the Not Invented Here syndrome on an unprecedented scale: instead of communities/projects reinventing things, it's every other JS developer reinventing things. Sustainable success comes from collaboration. # of Node modules is a symptom of a lack of collaboration.

benjy’s picture

Sustainable success comes from collaboration. # of Node modules is a symptom of a lack of collaboration.

100% agree with this. I've done some React over the holidays, and the JS ecosystem is one of the most fragmented I've ever worked with. I installed 4 Node modules and ended up with 400+ dependent modules. I think it has plenty of maturing to do.

nod_’s picture

Same here; we need to wait for the JS community to get its act together. It'd just be headache after headache if we start before things are less insane. There is no real benefit to having a brand new messy layer in our stack.

If we're talking about a spinoff project, why not. But core shouldn't be the place to manage it.

mdrummond’s picture

I believe there's a fundamental difference between moving to Symfony and the notion of replacing our front-end entirely with JS.

Yes, bringing in Symfony and other commonly used PHP tools was a big change, in large part due to a shift in PHP programming styles. Using more OO approaches enabled by more recent versions of PHP and change within the PHP community is a big deal, but in keeping with what Drupal is: a PHP-based content management framework/system.

Yes, current Drupal developers need to learn new PHP conventions, but those are arguably skills that would be useful for working with other PHP-based projects. This does open the door for other non-Drupal PHP developers to become acquainted with Drupal by being more familiar with the conventions used in Drupal 8. The same is true to a lesser extent for those familiar with OO from other languages but less familiar with PHP.

I'm not at all convinced the same synergy is true for front-end developers if Drupal started using JS for the entire theming system.

The term front-end developer spans a huge range of skillsets. The essence of front-end development is making a design and its associated UX work within browsers. Some front-end developers focus more on crafting semantic, accessible markup, styling with Sass and implementing behaviors with jQuery or pure JS. Some like working with a particular JS framework or experimenting with a range of frameworks. Some are really interested in web components or pure JS. Some people who interact with Drupal's front end are comfortable working with a particular contrib base theme and using the default classes output by Drupal as hooks to implement styling changes. There really is a very wide range.

Rewriting all of Drupal's front end in one particular framework **might** end up pleasing those who like that particular framework. However, a lot of these frameworks like Angular and React tend to be used for single page apps. (Not exclusively of course, they have been put to good use on more content-based sites as well.) Are those using a framework for a single page app going to become interested in using that framework in conjunction with a PHP-based content management system? I'm not sure that's true. Conversely are people who like React going to be more excited about Drupal if we pick Angular? Will Angular and React fans be thrilled if we pick Ember because amongst other things we like its licensing requirements the best?

Let's say in an ideal world there is little bikeshedding and magically we all agree that JS Framework X will be the best thing ever, so we will start implementing components in it. If the goal is to make that a universal tool that contrib can make use of, it's not going to be integrated right away. It took years to complete the Twig conversion. Particularly since we don't want to break BC, it's not unrealistic to say that's the sort of timeframe we're looking at for something like this.

Are the people who are currently fans of JS Framework X still going to be fans by the time we complete the implementation? Backbone and Underscore looked like good choices for more robust JS interactions before, and that hasn't panned out in terms of community excitement.

In fact, I've heard almost no excitement amongst front-end developers about the fact that Drupal 8 uses Backbone and Underscore. The thing they are excited about is Twig, because finally it's going to be a lot easier for a wide range of people to work with Drupal's markup. That's useful for the wide variety of people who work with Drupal's front end.

However, trying to start a conversion to one particular JS framework and changing the entire theme system of Drupal puts that easy access to markup at risk. Yes, because we aren't breaking BC, maybe that doesn't happen for a few years, but it's still a risk. Sure, maybe we use Twig.js and leverage Twig for our JS too, but that's unlikely to be an option with the top-rated JS frameworks we are presented with as the options we can provide feedback on.

So, Drupal front-end people who are not big on working in JS may well be turned off by working with something that may require more in-depth JS programming chops. And the people I've talked to who really are into JS aren't that excited about this either, because it locks us into one particular JS framework, cutting off the ability to work with other options that might be better suited to a particular task.

Some comments suggest that we'd just use a JS framework for back-end admin tasks, but this issue seems a prime example of how it could expand beyond that.

There have been some really good comments about how there may be other alternatives to a JS framework in order to provide snappier UI interactions. I'm very interested in seeing what we can do with PHP to improve things. I'm not at all convinced that using a particular JS framework provides benefits that are several orders of magnitude better that it would be worthwhile causing such a sea change in how Drupal's front end works. These JS frameworks tend to be heavy handed and increase page load times significantly, particularly on mobile: degrading our mobile experience undermines another successful D8 initiative.

Anyhow, others have talked about the alternatives for a snappier UX. I just really wanted to address the notion that this will help attract more front-end developers to Drupal, which seems to me to be untrue, or at least certainly not true for many front-end developers.

webchick’s picture

Apparently this is not readily obvious, so: these are my opinions, not Acquia's opinions, OCTO's opinions, etc.

Was that comment intended for the other thread? This one is talking about Drupal 9 (aka 2-3+ years out) and to what extent it should be JavaScript-ified. The other one is talking about adopting a front-end framework, but since it's targeted for 8.1.x+ it's not expected to throw out Twig, but work alongside it, like Backbone does today. (Hence, "progressive" decoupling.)

For me this is less about front-end developer adoption though and more about developer experience/meeting peoples' expectations/right tool for the job. If you showed Facebook's commenting system to an average web developer and said "build me that, please" chances are they would be reaching for tools like Angular, Ember, React, and Node. They would not be reaching for Icicle and ReactPHP (they wouldn't even know those exist, and their hosts certainly wouldn't support them without a lot of manual fiddling). And if you asked the average user of a website what they expected web UIs to work like, they would cite Facebook, Google, etc. because that's what they've become accustomed to. So, there you go.

Also, (controversy alert) to me at least, one of the big threats to Drupal actually is that there's a dwindling, and aging, group of people who thinks of PHP as anything other than something along the lines of COBOL; something old farts had to use back in the day to prop up technology of the day. ;) As one of those old farts, I love that some of the other old farts are injecting new life into the language, and that'll likely sustain it another 10+ years which is awesome for job security. But, generally speaking, PHP is not being taught in CompSci courses. It's not being taught in STEM training in high schools. These days, it's something you basically learn because you have to, and thanks to Twig, even fewer Drupal people will have to now. So we should certainly explore those ideas and not dismiss them, but I don't think they solve the basic problem.

If we want to be really forward-thinking, we should at least be willing to entertain the idea of going back to Drupal's roots and what made its growth surge its first 15 years: "skating to where developer momentum is." There were 100 poll scripts in PHP back in the day too (and by now probably 5000), so the fact that the JS community is fragmented doesn't really seem like a reason not to at least recognize that the language/ecosystem is quickly growing in importance.

prestonso’s picture

#6, #24: Broadly, I agree that the constant upheaval in server-side JavaScript solutions complicates this discussion and the implementation timeframe. However, I also want to emphasize as @pwolanin did in #10 that a PHP-based approach with WebSockets would be a similarly colossal undertaking.

Leveraging Ratchet (asynchronous WebSocket server) and its dependency ReactPHP (Node.js-like asynchronous, non-blocking I/O for PHP) would require some rethinking of the server architecture backing Drupal. Given Apache was not designed to handle persistent connections and relies on the request/response paradigm, its resources are depleted rapidly when holding open many connections, thus making realtime with PHP applications difficult and, in particular, meaning techniques like long-polling and streaming are prohibitively expensive.

Ratchet runs as a standalone process extrinsic to Apache. Though this information may be outdated, Phil Leggetter (realtime push specialist) wrote a few years ago that options for realtime in PHP are very limited and recommended outsourcing the realtime functionality either to a hosted service offering, a parallel self-hosted solution using message queues in Apache with a service such as Ratchet, or a reverse proxy using some unifying protocol. Either way, the impact on server architecture could be immense.

I agree with @Crell that the Drupal community should at the very least consider ways to reach parity with Node.js without relying on solutions outside of PHP. But it seems to me that, as @Crell suggested several years ago, considerable re-architecting of Drupal would be required to transform Drupal into better-encapsulated, dependency-injected objects.

Crell’s picture

prestonso: Wow, good to know I've been consistent. :-)

As I argued in Barcelona, I think we've done enough at this point that going the rest of the way is doable. Not easy, not trivial, but doable. And ideally we could do it in bits and pieces. Let's not under-estimate how much work went into D8 that will pay off dividends long-run.

At the same time, though, the most important system to make available via a websocket is Entity API, which is still highly container-coupled and non-reentrant. Which is... unfortunate. Let's not under-estimate how much work there is left to do yet. To know how much, someone needs to try it.

To be clear, I am *not* advocating moving all of Drupal to ReactPHP. I am suggesting that if we want websocket support for Drupal, ReactPHP, Icicle, Aeris, or other PHP-based solutions are a far superior starting point than Node.js, with less work and less disruption in the long run.

rlmumford’s picture

#37: There are projects on GitHub that are already working towards running Symfony applications on Ratchet/ReactPHP: https://github.com/Blackshawk/SymfonyReactorBundle But there's still a long way to go on making Drupal ready.

#35: I graduated in Computer Science from the University of Manchester 4 years ago, and the two web technologies we studied were PHP and Java (JavaServer Faces); a friend of mine on the same course is still being taught PHP in his first year. I don't think we need to worry too much about universities not teaching PHP - nearly all of them seem to use C for procedural programming and Java to teach OO. There was no JS at all in my degree course.

heddn’s picture

I was shocked that 8 years ago when I was finishing my CIS degree I had to take a *required* 8086 assembler class. Because the school thought it was good for me!?!? Or maybe they didn't want to revamp their 10 year old classroom material? Or were lazy? Not sure. I don't think that colleges/universities can keep up with the speed of technology. Hm... me wonders if they still require that 8086 class...

chx’s picture

I am surely missing something. https://github.com/jakubkulhan/hit-server-bench/blob/master/README.md Where's the overwhelming win for ReactPHP vs PHP-FPM? If we are talking front-end support, look at the latency numbers. And yes, Node.js does have significantly lower latency. Until someone can show a benchmark with significant wins for ReactPHP, let's shelve that discussion and go back to Node.js vs PHP.

benjy’s picture

If you showed Facebook's commenting system to an average web developer and said "build me that, please" chances are they would be reaching for tools like Angular, Ember, React, and Node.

If it was just the comment form though, they'd be introducing a framework for something they could have solved with jQuery/Core tools. If the entire site/app worked in the same way, then sure.

one of the big threats to Drupal actually is that there's a dwindling, and aging, group of people who thinks of PHP as anything other than something along the lines of COBOL; something old farts had to use back in the day to prop up technology of the day. ;) As one of those old farts, I love that some of the other old farts are injecting new life into the language, and that'll likely sustain it another 10+ years which is awesome for job security. But, generally speaking, PHP is not being taught in CompSci courses.

Do you have evidence for these stats? In the last 5 years I've been to both college and university, and they both taught PHP; the university I went to still does today, because I know a few people on the same course. Also, in my experience the trend at universities was less towards JavaScript and more towards Microsoft products like C#.

I also find it very hard to see PHP having any kind of decline within 10 years; there are a lot of developers learning WordPress at university, like I just mentioned, and they're all in their early 20s.

cweagans’s picture

Also, (controversy alert) to me at least, one of the big threats to Drupal actually is that there's a dwindling, and aging, group of people who thinks of PHP as anything other than something along the lines of COBOL; something old farts had to use back in the day to prop up technology of the day.

I don't think we need to worry too much about Universities not teaching PHP - Nearly all of them seem to use C for procedural programming and Java to teach OO. There was no JS at all in my degree course.

In the last 5 years i've been to college and uni and they both taught PHP, the university I went to still does today because I know a few people on the same course.

Since this is all highly anecdotal, I'd like to mention that not only does my current university teach PHP, they also teach Drupal. I was also able to get a PHP/Drupal class into the catalog for a couple of years at a different local university. In the former case, it's taught because it's easy to deploy and there's still a huge market for PHP developers. In the latter case, they taught it because my employer at the time was averaging 1-2 technical hires per month, and a CS grad with PHP and Drupal experience was an easy win. I'm not sure if they're still teaching it, mainly because the person that was teaching it got tired of explaining the weird shit in Drupal.

In any case, while being very rant-y, https://medium.com/@wob/the-sad-state-of-web-development-1603a861d29f#.d... is relevant to this conversation, particularly this quote:

You see the Node.js philosophy is to take the worst fucking language ever designed and put it on the server.

That doesn't seem like an improvement to me ;)

mrf’s picture

Seems like most universities are following MIT and adopting Python as their introductory language.
- http://cacm.acm.org/blogs/blog-cacm/176450-python-is-now-the-most-popula...
- https://www.quora.com/Why-do-most-universities-teach-Python-as-a-primary...

Since we are mainly talking about Javascript in this issue and not rewriting Drupal in Python, I don't think we can really use the fact universities aren't teaching PHP frequently as a bellwether for much.

Another anecdote to add to @cweagans' comment: the Academy of Art here in San Francisco offers a Drupal course in their web development curriculum.

Trying to get our finger on the pulse of "the future of the web" is a losing game, and one that Drupal has lost in the past. Gambling our entire front end on one vision of that future is a very risky path for the Drupal project to follow.

Wim Leers’s picture

#24 Regarding HTTP/2 and GraphQL vs REST:

let's not be so scared of http requests. as we're talking futures, we're talking http2. they're not as scary as they used to be.

The cost (latency) surely decreases. But it doesn't become zero or close to zero. There are still at least three reasons for not using REST but using GraphQL (or custom REST endpoints, but those require a lot of extra development, testing & maintenance, and that is precisely why GraphQL is appealing: it removes the need for custom REST endpoints):

  1. Network latency: even with HTTP/2, the request still has to travel from the client to the server and back again. EU <-> US latency will still be ~100ms.
  2. Server latency: Drupal will still need to bootstrap.
  3. Client latency: the biggest problem with using REST (again: when using only canonical REST endpoints and not custom ones that are optimized for a certain template/view on the client) is that the requests need to happen serially, not concurrently. For example: you first need to know which vocabularies exist before you can request which terms exist in each vocabulary.

Finally: note that the HTTP/2 implementations today don't even fulfill that long-admired promise of making HTTP requests so cheap that you don't need to care about CSS aggregation, image spriting etc. Khan Academy wanted to do that, but found aggregating was still necessary, unfortunately: http://calendar.perfplanet.com/2015/forgo-js-packaging-not-so-fast/. We can only hope that this is not due to fundamental flaws in the design of HTTP/2, and is simply a bug in current browsers and web servers.
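To make the serial-request point above concrete, here is a toy sketch. Everything in it is illustrative: the endpoint paths, the hypothetical GraphQL schema, and the 100ms figure are assumptions, not a real Drupal or GraphQL API.

```javascript
// Illustrative only: latency figure and endpoint names are made up.
const ROUND_TRIP_MS = 100; // e.g. the ~100ms EU <-> US latency above

// Canonical REST: the terms request depends on the vocabularies response,
// so the two round trips are forced to happen serially.
function restLatency() {
  let trips = 0;
  trips += 1; // GET /vocabularies          -> learn which vocabularies exist
  trips += 1; // GET /vocabulary/{id}/terms -> only possible after the above
  return trips * ROUND_TRIP_MS;
}

// GraphQL: one query expresses the whole dependent shape up front, so the
// server can resolve vocabularies *and* their terms in a single round trip.
function graphqlLatency() {
  const query = '{ vocabularies { id terms { id name } } }'; // hypothetical schema
  return 1 * ROUND_TRIP_MS;
}
```

Under these assumptions restLatency() comes out at 200ms to graphqlLatency()'s 100ms, and with deeper dependency chains the gap widens linearly.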

Wim Leers’s picture

The more I learn (by reading articles, watching talks, reading the comments here, etc.), the more I think that this discussion is extremely premature.

I think the first step has to be gaining more experience with building progressively & fully decoupled Drupal sites. In particular: once people start using GraphQL, we'll gain more insight — likely it comes with its own set of problems, hopefully it'll be a better & narrower set of problems (I currently believe so). Once popular best practices emerge (for Angular, React, Ember, Elm, whatever, etc. combined with REST, GraphQL, etc.), we'll start to see patterns that we don't yet see today.

Therefore, the things that I think make sense to work on now are:

  1. #1804488: [meta] Introduce a Theme Component Library: make it easy to define new components. Allow modules to declaratively define components, consisting of 1 Twig template and 1 matching asset library (think: /hi.twig, /hi.base.css, hi.theme.css, /hi.js, which automatically results in a modulename/hi asset library). Make it easy to use them in the Render API. Let themes easily and declaratively override its markup and extend (or override) its asset libraries. This can then eventually evolve in the direction of Web Components.
  2. Once we have a working GraphQL contrib module for Drupal 8, investigate what it would take to let Twig templates use GraphQL rather than them getting variables injected. Twig templates using GraphQL instead of preprocessed variables would be an enormous step towards decoupling Drupal's front end and back end. It's a step that would provide tangible benefits much sooner, with far less work, than migrating portions of Drupal to Node.js.
  3. JS testing support on Drupal.org's testbots.

After all of the above is done, I think this discussion will be much more productive and meaningful. Right now, it feels this discussion will keep going in circles, because we're all talking in vague, abstract, overly simplistic terms.

chx’s picture

Removing preprocess has long been a goal (at least since 2012, but perhaps as early as BADCamp 2011? My memory is fuzzy, but if it's important we can ask quicksketch). Also, #45 makes much more sense. An evolutionary approach is always more workable than jumping off the cliff and assembling the airplane as we fall.

almaudoh’s picture

#45, I agree this is a very sensible approach, and a way to actually get things going in incremental steps.

The more I learn (by reading articles, watching talks, reading the comments here, etc.), the more I think that this discussion is extremely premature.

+1

jumping off the cliff and assembling the airplane as we fall.

Lol

Also, to add my 1 cent, some more usage of Drupal 8 Ajax in contrib space would help us identify areas where D8 Ajax is adequate and where improvements are needed especially in DX. I've recently been trying to replicate Views UI-type modals in a new contrib module and have had an unnecessarily difficult time doing it.

davidhernandez’s picture

+1 to #45. (And thank you for bringing up point 3!) The component library has been discussed often during front-end meetings, and it's the direction people want to go in. It is something that should give us greater flexibility and, I think, help us build a system that can be agnostic to any framework, which would ultimately give everyone what they want.

We definitely need to evolve a solution through real-world practice and discovery. How can anyone possibly claim to know what Drupal 8's pain points are right now? After a year of actually using it, we may want to do something completely different. The general consensus from many of the Twig calls has been "Can we #$@&%*! use the thing before talking about how to reinvent it?" That is what I've heard from many people: no one, particularly frontenders, is interested in reinventing the wheel before getting to know the one we have, especially with the wins we've been getting from Twig. And because of that, I wouldn't be surprised if force-feeding a solution (especially a framework) led to a hit in contributors and adopters.

That aside, conversations like this are often fruitful. Sometimes you have to propose a controversial idea to get people thinking, and coming up with better ideas. Hopefully, that is the larger motivation here, and not a force-feeding.

dawehner’s picture

+2 for #45
Yes, it's good to talk about the future of Drupal, no question, but we should not just jump blindly into some potential future. Let's first learn what we actually need.
There is also the risk of burnout: if we make changes that are too radical, now or in the far future, we won't all be able to keep up with the pace.

phenaproxima’s picture

I fully agree with #45 as well. By all means let's make it easier to build better UIs, but moving on to a JavaScript platform seems like too much, too soon.

Crell’s picture

Wim, I have to disagree/correct you on one point: There is no such thing as a "REST endpoint". REST doesn't have endpoints. REST has resources, the URIs of which are not hard coded into clients aside from the root entry point. What you're talking about is adding RPC calls. Just being JSON doesn't mean it's not an RPC call.

Having RPC calls over HTTP is not inherently a bad thing. There are numerous cases where that's totally the right thing to do. But those are simply web service RPC calls. They're not REST. Please don't call them that. Nails on chalkboard. :-)

Otherwise, s/REST/web RPC call/ I agree with #45 as well. There's plenty we can do with the new tools we just got before we need to think about replacing large swaths of it. Let's assume Drupal 8's going to have a bit of a shelf life, m'kay? :-)

(That said, continuing the internal refactoring toward more decoupled stateless services is a good and useful thing to do on its own merits in addition to helping out here in the long run.)

Wim Leers’s picture

Sorry for the broken terminology.

I meant:

  • canonical REST endpoints: the URL/resource that returns e.g. vocabulary X, or term Y, or …
  • custom REST endpoint: the URL/resource that returns a bunch of JSON optimized to render a specific view or template on the client, for example a vocabulary X and all of its terms

i.e. normalized vs denormalized data. Denormalizing the data by creating custom endpoints to reduce the number of HTTP requests is the typical solution today, and was also the solution Facebook used until they came up with GraphQL.

(I'm sure that's still not 100% correct terminology-wise, but hopefully that makes it crystal clear nevertheless.)
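A minimal sketch of the normalized-vs-denormalized distinction, with made-up response shapes (nothing here is Drupal's actual serialization format):

```javascript
// Canonical ("normalized") responses: one resource per request.
const vocabularyResponse = { id: 'tags', label: 'Tags' };
const termsResponse = [{ id: 1, name: 'Drupal' }, { id: 2, name: 'PHP' }];

// A custom ("denormalized") endpoint would return the merged shape below
// directly, trading resource reusability for a single HTTP request.
function denormalize(vocabulary, terms) {
  return { ...vocabulary, terms };
}

const viewData = denormalize(vocabularyResponse, termsResponse);
// viewData is exactly what the "vocabulary X and all of its terms"
// template needs, delivered in one response body.
```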

markabur’s picture

Sorry, stupid question time: which parts of Drupal comprise the "UI layer"? Thanks.

effulgentsia’s picture

Title: [policy, no patch] Require Node.js for Drupal 9 core and rewrite Drupal's UI layer from PHP to JS » [policy, no patch] Require Node.js for Drupal 9 core and rewrite some of Drupal's UIs from PHP to JS

Thanks, everyone, for all the feedback so far. There's so much meat in there, it'll take me a while to digest it and respond to it. Here's a start:

What's needed isn't Javascript; what's needed is websockets and/or HTTP 2 instead of ancient CGI

As Wim points out in #44, HTTP/2 and a persistent, non-blocking PHP daemon still doesn't remove all latency. He points out there's still currently ~100ms network latency between EU and US, and even if we reach some hypothetical future where the only limitation is the speed of light in a vacuum, that still amounts to ~110ms required by physics for information to travel between Sydney and London and back. Reducing that latency any further requires placing servers closer to devices, or allowing the device itself to re-render in response to user interaction. For the latter, what's needed is JavaScript. If client-side rendering were impossible, we could potentially rely on globally distributed grid computing to eventually make server-side rendering nearly latency-free at a price point that's affordable by the masses, but given that client-side rendering is possible, and that many web applications and native apps are already being built with it, I'm skeptical of server-side rendering becoming free of perceptible latency any time soon.
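As a sanity check on that ~110ms figure, a back-of-the-envelope calculation (distances are approximate great-circle values; real fibre is slower than vacuum light, so the actual floor is higher):

```javascript
const SPEED_OF_LIGHT_KM_S = 299792; // km/s in a vacuum
const SYDNEY_LONDON_KM = 16994;     // approximate great-circle distance

// Round-trip time in milliseconds at the speed of light.
function roundTripMs(distanceKm) {
  return (2 * distanceKm / SPEED_OF_LIGHT_KM_S) * 1000;
}
// roundTripMs(SYDNEY_LONDON_KM) lands near 113ms: physics alone puts the
// floor around the ~110ms cited above, before any server time is added.
```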

What's missing from this discussion, for me at least is the clear cut massive win

So just to take a very simple example, suppose we wanted to improve Field UI to remove the latency between selecting a formatter and seeing its corresponding settings form. Currently that transition is done using Drupal's AJAX system, so there's latency while we wait for the server to render. Is there any way to move that settings form to client-side rendering? Well currently, that form is tied up in FormatterInterface::settingsForm(), so if we wanted to explore a client-side rendered UI for this during 8.x, we could potentially allow formatters to optionally provide an additional JS file that duplicates that settings form in JS code. Then our field_ui_ng module could provide a latency-free transition between formatter selection and settings configuration, if the selected formatter implements that additional JS file, but if not, we could fall back to the AJAX call.

Except that wouldn't quite work, because we also have hook_field_formatter_third_party_settings_form(). So field_ui_ng would also need to fall back to the with-latency AJAX approach if there's any module enabled that implements that. Plus once you add all the hook_preprocess_*() functions that might be out there, and the #pre_render functions that might be out there, then you end up having a really hard time getting your client-rendered form to remain accurate as various contrib/custom modules get enabled.
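The opt-in/fallback logic described above might look roughly like this. All names here (settingsFormJs, serverAltersForm, the ajaxFallback callback) are hypothetical, not existing Drupal APIs:

```javascript
// Hypothetical sketch: render client-side only when the formatter ships a
// JS duplicate of its settings form AND nothing server-side alters the form.
function renderSettingsForm(formatter, serverAltersForm, ajaxFallback) {
  if (formatter.settingsFormJs && !serverAltersForm) {
    // Latency-free path: the formatter duplicated settingsForm() in JS.
    return formatter.settingsFormJs(formatter.defaultSettings);
  }
  // A hook_field_formatter_third_party_settings_form() implementation (or a
  // JS-less formatter) forces the existing with-latency AJAX round trip.
  return ajaxFallback(formatter.id);
}
```

The awkward part, as noted above, is that any preprocess or #pre_render involvement would also have to flip the serverAltersForm flag, which quickly erodes the latency-free path.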

That's just one transition step in one UI. There's probably dozens or hundreds of other examples we could come up with that would have similar difficulty in creating accurate client-side renderings given that all the server-side rendering is locked up in PHP code.

During the 8.x cycle, I think we can explore these kinds of UIs and corresponding issues, and keep everything backwards-compatible and opt-in, but only at the cost of much duplication: duplication of the overall UI itself (e.g., field_ui_ng.module with field_ui.module) and duplication of the various plugins/components that go into that UI (e.g., implementations of FormatterInterface::settingsForm() and their JS equivalents). At some point though, I think this duplication will become very prohibitive: how many UIs will we actually improve in this way when we know that this duplication will need to be maintained?

So it seems to me that it's quite logical for us to eventually move to where these kinds of UIs can be fully expressed in JS, and then run on either client or server as needed.

counterbalancing the massive hit it'd make on the Drupal community.

To be clear, in this issue, I'm "only" talking about UI code. I'm not suggesting that we move any of Drupal's "business logic" away from PHP. I'm certainly hearing the concerns raised in this issue about moving away from Twig, so what if we don't have to? What if we can keep Twig via Twig.js, but "just" refactor PHP UI code like FormatterInterface::settingsForm(), preprocess functions, etc. to isomorphic JS?

Meanwhile, retitling the issue to add in "some of", since we don't necessarily need to do all of the UI code in a single major version jump. Maybe there's a way to just do, for example, the theme component library in JS+Twig, while leaving most other things in PHP? But even a partial step like that would require some kind of server-side JS execution, if we want to preserve a page's initial state being rendered server-side, which in my opinion, we definitely do, both for performance and accessibility.

effulgentsia’s picture

Title: [policy, no patch] Require Node.js for Drupal 9 core and rewrite some of Drupal's UIs from PHP to JS » [policy, no patch] Require Node.js for Drupal 9 core and rewrite some of Drupal's UI code from PHP to JS

Retitling again, because it's not just about complete UIs (like Field UI), it's also about the components that go into those UIs (like formatter settings).

catch’s picture

So just to take a very simple example, suppose we wanted to improve Field UI to remove the latency between selecting a formatter and seeing its corresponding settings form. Currently that transition is done using Drupal's AJAX system, so there's latency while we wait for the server to render. Is there any way to move that settings form to client-side rendering? Well currently, that form is tied up in FormatterInterface::settingsForm(), so if we wanted to explore a client-side rendered UI for this during 8.x, we could potentially allow formatters to optionally provide an additional JS file that duplicates that settings form in JS code. Then our field_ui_ng module could provide a latency-free transition between formatter selection and settings configuration, if the selected formatter implements that additional JS file, but if not, we could fall back to the AJAX call.

This isn't going to be completely latency free though, because at some point all of that template code has to reach the client in the first place. So for field UI we'd need to send every settings form for every formatter. And not only that - if we keep extensibility of forms then all the code to support that as well.

Then for formatter settings forms (and views plugins, inline editing etc. etc.), we'd need to communicate what the default values are - both the initial ones, and if the formatter has been configured, what the configured values are. Otherwise regardless of any restrictions that might be introduced in form altering, the form cannot be rendered correctly without the right default values in it. If it's been configured already, then that's a call back to the server again. Or possibly the entire display configuration could be loaded into the client as well so that the configuration is available up front, but that still has to be got in the first place.

On the other hand, in terms of user experience, the issue is not so much the call back to the server (since that has to happen at some point), it's whether that is blocking or not. For example, when on the field formatter selection, we could make a request to pre-load the HTML for all the rendered formatter configuration forms, then once one is selected, that can be presented immediately to the user with no latency. There's theoretical latency if someone is quick enough clicking on buttons to circumvent the pre-loading, but that feels like a much smaller problem to solve than having to load the full client-side application including all user paths before any interaction happens at all.
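That pre-loading idea could look roughly like this (the endpoint URL is invented for illustration): requests start in the background as soon as the formatter select is shown, and selection reuses the in-flight request.

```javascript
// Map of formatter ID -> pending/resolved promise of rendered form HTML.
const preloaded = new Map();

// Kick off background requests for every formatter's settings form as soon
// as the formatter <select> is shown; nothing blocks on these.
function preloadSettingsForms(formatterIds) {
  for (const id of formatterIds) {
    preloaded.set(id, fetch(`/field-ui/settings-form/${id}`).then(r => r.text()));
  }
}

// On selection, reuse the in-flight request. Only a click faster than the
// pre-load pays any visible latency.
function getSettingsForm(id) {
  if (!preloaded.has(id)) {
    preloaded.set(id, fetch(`/field-ui/settings-form/${id}`).then(r => r.text()));
  }
  return preloaded.get(id);
}
```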

effulgentsia’s picture

which parts of Drupal comprise the "UI layer"?

Funny coincidence: I wrote and posted #54 and #55 prior to reading #53. I hope those comments begin to answer that question.

both the initial ones, and if the formatter has been configured, what the configured values are.

The initial state of the UI already carries with it information about what the currently configured formatter is. If you then select a different formatter, then there are no currently configured values that need retrieval from the server in order to show its settings form. You only need the formatter's default settings, which can be part of the JS file (they are currently part of the formatter class's PHP file (defaultSettings() method)).

For example, when on the field formatter selection, we could make a request to pre-load the HTML for all the rendered formatter configuration forms

That doesn't seem very scalable. Maybe for formatters in Field UI, we don't need it to scale, but probably for other UIs, it would be pretty inefficient to render dozens or hundreds of things when only one might be needed.

all of that template code has to reach the client in the first place

Yes, but code can be cached, so you only pay the latency once. Or rather, once'ish, because there are definitely some cache invalidation problems to figure out here: maybe websockets, etc. can help with that though.

having to load the full client-side application including all user paths before any interaction happens at all

Yeah, this would be bad. I don't know enough about how each of the JS frameworks work to know if/how this can be optimized. But if we think in terms of components rather than a full SPA, I would hope there's some way to optimize this.

catch’s picture

Maybe for formatters in Field UI, we don't need it to scale, but probably for other UIs, it would be pretty inefficient to render dozens or hundreds of things when only one might be needed.

How different is this from having to load dozens or hundreds of different client-side templates?

effulgentsia’s picture

How different is this from having to load dozens or hundreds of different client-side templates?

Could the client-side templates be client-side cached such that "loading" them on-demand is virtually latency-free (on the order of 1ms rather than 10ms or 100ms)?
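One way to get that near-zero "load" while addressing the invalidation concern raised earlier: cache template source under a versioned key, so a deploy changes the key and stale entries are simply never read again. A sketch assuming `localStorage` (real code would also need quota handling):

```javascript
// Return a template's source, hitting the network only on a cold cache.
// `version` would be a deployment hash; changing it invalidates everything.
function getTemplate(name, version, fetchFn) {
  const key = `tpl:${name}:${version}`;
  const cached = localStorage.getItem(key);
  if (cached !== null) {
    // Warm cache: on the order of 1 ms, no round trip.
    return Promise.resolve(cached);
  }
  // Cold cache: fetch once, store for next time.
  return fetchFn(name).then((source) => {
    localStorage.setItem(key, source);
    return source;
  });
}
```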

catch’s picture

You only need the formatter's default settings, which can be part of the JS file (they are currently part of the formatter class's PHP file (defaultSettings() method)).

That's not quite right though. As you pointed out above, there is https://api.drupal.org/api/drupal/core!modules!field_ui!field_ui.api.php...

So not only do we need the form itself and the defaults, but we also need to apply that hook.

Then that hook signature currently contains not only the field and instance configuration, but also $form and $form_state. So even the field formatter settings form is dynamic, and will always depend on what's available on the server. Possibly in this case we could drop that flexibility somehow - i.e. make the JS version of the hook essentially static, or force people to get information from the server themselves (although that would quickly get slow if lots of implementations did it). But for many forms it won't be an option to make things static.
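A sketch of what such an "essentially static" JS alter hook might look like: implementations are pure functions, and any server-derived data (field config, etc.) is handed to them as context rather than fetched ad hoc. All names here are invented for illustration.

```javascript
// Registered alter callbacks, loosely analogous to a
// hook_field_formatter_settings_form_alter() implementation in PHP.
const alterHooks = [];

function onSettingsFormAlter(fn) {
  alterHooks.push(fn);
}

// Build the final form definition by threading it through every hook.
// `context` carries whatever server data was shipped with the page, so
// hooks never need their own round trips.
function buildSettingsForm(baseForm, context) {
  return alterHooks.reduce((form, fn) => fn(form, context) || form, baseForm);
}
```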

Another example in the Field UI would be field configuration forms, for example: EntityReferenceItem::fieldSettingsForm

Or for views something like an entity bundle filter configuration form, which would need the available entity bundles to provide as options.

catch’s picture

The important thing with the latter two examples is that in both cases, the $options and similar do not come from the object configuration itself, but are based on other data on the site.

This is the case for lots of things, entity types, bundles, image styles, text formats, available field formatters, available views filters - all of this will require calls back to the server. They aren't part of the object being configured, but instead potential values for that object which may come from anywhere. Then given we're talking about all of Drupal's front end, there's also things like field access on entity forms - we can't start rendering the form until that information is available - in addition to the widget configuration etc.

Could the client-side templates be client-side cached such that "loading" them on-demand is virtually latency-free (on the order of 1ms rather than 10ms or 100ms)?

I don't think we should rely on browser caches - they're very inconsistent between browsers (quick googling says that Firefox's is 50 MB and Edge's is unlimited and grows forever). Also admin interfaces such as Views or Field UI are often visited quite infrequently, so could be cold cache more often than not. Relying on the browser cache within a single session or maybe a day should be fine, but beyond that feels optimistic.

Crell’s picture

I wasn't suggesting that websockets or HTTP2 would allow us to violate the laws of physics. :-) However, the 100 ms latency number being bandied about is, I think, quite misleading. We're talking primarily about admin forms here. How often is someone accessing an administrative form from 11 timezones away? Pretty rarely, I'd wager. In fact, in my own experience upwards of 98% of the time I'm accessing the admin interface Drupal is running on my local laptop. While the latency there is still non-zero, it's about as close to it as it's going to get. Even if someone is building a site directly on Pantheon or Platform.sh or whatnot, I'd venture a guess that they're probably within 2000 miles of the server at most, not 12,000 miles away.

Remember, for any given request the total time is impacted by: network latency, payload size/bandwidth, and server processing time. I'd submit that in the typical case, network latency is not the biggest issue. So if the goal is improving performance, we want to reduce the payload size and server processing time. Ie, send less data or make PHP do less. Those are where a persistent daemon, websockets, etc. are potentially useful (although certainly no silver bullet, either).

For user-facing pages all bets are off in terms of latency, but most of the discussion here and elsewhere is around admin pages.

To catch's point, given how dynamic forms are now I'm not convinced we could truly unify the front and back ends to eliminate the need to talk to the server. The best we could do would be to have a JavaScript FAPI equivalent, and then have the server send a JSON version of a form array, in essence. Then the client-side code could render those instructions into a form, using Twig.js or whatever. However, that's still a lot of JS to have to send to the client (which would be true even if the backend were Node.js), and it still runs into the #process callback problem. In fact, I'd go as far as saying that as long as forms can contain runtime callbacks rather than being a pure data definition, all bets are off, as those callbacks would have to be double-written in both PHP and JS. And that's before even mentioning the security implications of losing the PHP-side validation guarantees we have now.
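To illustrate the "JSON version of a form array" idea and its limits, here is a toy renderer for a purely declarative form definition. Anything with a #process-style runtime callback simply cannot be expressed this way, which is exactly the limitation described above; escaping is also omitted for brevity.

```javascript
// Render a declarative, JSON-serializable form definition to HTML.
// Only data-only element types are supported; runtime callbacks are not.
function renderForm(definition) {
  return Object.entries(definition)
    .map(([name, element]) => {
      switch (element.type) {
        case 'select': {
          const options = Object.entries(element.options)
            .map(([value, label]) => `<option value="${value}">${label}</option>`)
            .join('');
          return `<label>${element.title}</label><select name="${name}">${options}</select>`;
        }
        case 'textfield':
          return `<label>${element.title}</label>` +
            `<input type="text" name="${name}" value="${element.default_value || ''}">`;
        default:
          // Unknown or callback-driven elements would still require a
          // server round trip.
          return '';
      }
    })
    .join('\n');
}
```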

Unless, of course, as is being suggested we move wholesale to Javascript for form handling so we would only need a Javascript-based FAPI. Which then runs into the issue I mentioned before of needing to be fully bilingual in order to even touch Drupal development, which is not a prospect I would relish.

catch’s picture

How often is someone accessing an administrative form from 11 timezones away?

I've done that several times this week - mainly node forms but sometimes admin pages checking things on a staging environment (didn't count the exact number of timezones, but it's enough). Ping says 144ms. Over 4g it's 167ms.

I'm almost never in the same country or even continent as staging servers. Even small sites can have multi-national admin teams working on them.

Then for comment and node posting, user login, registration, account settings it could be any number of timezone combinations for some sites.

Most site-building I do is on a local machine though, but I don't think that's a safe assumption to make, and not at all for content editing, translation etc. which are all back-end tasks.

So we should neither over- nor under-play network latency here.

Crell’s picture

Fair, although I was viewing node-edit et al as user-facing pages, not admin-facing pages. By admin pages, I mean /admin/*.

We now have two anecdata on how much of a practical (rather than theoretical) issue intercontinental latency is for admin pages. I have no idea how we'd get actual data. I do agree entirely with catch here, though:

So we should neither over- nor under-play network latency here.

Which also means looking at other parts of the process to optimize beyond just that.

Wim Leers’s picture

TL;DR: network latency is a problem we cannot ignore.

However, the 100 ms latency number being bandied about is, I think, quite misleading. We're talking primarily about admin forms here. How often is someone accessing an administrative form from 11 timezones away? Pretty rarely, I'd wager.

I'm sorry, but this argument makes so little sense that it borders on the offensive. And I actually suspect this is because you're American, and are therefore spoiled, internet-latency-wise. Let me explain.

The EU has a larger population than the US, yet many sites are hosted in the US. Including European-owned sites, and including sites with a mostly European audience. Also, it's a difference of only 6 timezones from CET (majority of EU) to the US east coast.

Try using Reddit, or Twitter, or … drupal.org from the EU. It's frustratingly slow. Yes, particularly Drupal is frustratingly slow ;) To get some illustrative numbers, I loaded the drupal.org frontpage as the authenticated user (hence no caching by the CDN d.o uses) 10 times. The fastest TTFB: 550 ms. With huge variance: often it takes 1 full second. That's before assets can even start to be fetched. And that's for a page that could easily be cached.
Twitter.com's fastest TTFB: 150 ms. Pinging twitter.com: 115 ms. So with some rough calculations: 150-115=35 ms spent on their server before they can start sending a response. So I estimate that your TTFB is on the order of 50 ms (35 ms + some intra-US network latency). Every Twitter interaction begins with an additional 100 ms latency. No wonder things often feel slow. And this is for one of the most optimized sites in the world.
And remember that these sites are all using CDNs for their static assets. So the total impact is actually fairly minimal. Non-big budget sites out there don't use CDNs. Which means that we incur that latency for every single asset also. And there you have it: a very slow site. Like, say, your employer's site: https://www.palantir.net/ — the fastest resource on that site is a CSS aggregate that has a TTFB of 102 ms.

So, please don't dismiss this 100 ms as being theoretical.

Furthermore, it's intellectually dishonest to dismiss this 100 ms on the grounds of it not being a significant duration. Especially because this 100 ms is the best case. Add in crappy networks (consumer internet connections around the world often suffer from unexplained, intermittent slowness) and mobile devices (4G is fast! Except when on a train, where round trip times are often >5 seconds, sometimes even >20 seconds. Many people work on the web from a train every day.)

Network latency is significantly larger than the Drupal 8 bootstrap in many situations. 100 ms = many D8 bootstraps.

As much as I would like to say "Let's improve AJAX API and use web sockets", they do not solve these increasingly common real-world scenarios. People use fragile connections to do real work. Fragile connections have significant network latency.

So we should neither over- nor under-play network latency here.

Exactly.

And this is exactly a traditional weakness in Drupal's JavaScript: most of it is written with the assumption that timeouts never happen. Who among you hasn't seen the autocomplete here in the forms for these very issues cause an AJAX error? It's caused by this beauty in our AJAX system:

  try {
    // Network I/O here!
  }
  catch (e) {
    // Unset the ajax.ajaxing flag here because it won't be unset during
    // the complete response.
    ajax.ajaxing = false;
    alert("An error occurred while attempting to process " + ajax.options.url + ": " + e.message);
  }

Error handling? Hah! Let's let the end user handle the error! (Quick Edit is, to my knowledge, the only code in Drupal core to have even rough network error handling that helps the user rather than scares the user.)
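For contrast, a sketch of friendlier handling, assuming the `fetch` API: retry transient failures with a per-attempt timeout, and surface a human-readable message only when retries are exhausted, rather than alert()-ing a raw exception at the user. The parameters and error copy are illustrative, not a proposed core API.

```javascript
// Fetch with a per-attempt timeout and retries for transient failures.
async function requestWithRetry(url, { retries = 2, timeoutMs = 5000 } = {}) {
  for (let attempt = 0; attempt <= retries; attempt += 1) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      const response = await fetch(url, { signal: controller.signal });
      if (response.ok) return response;
      // 4xx responses won't get better on retry; 5xx might be transient.
      if (response.status < 500) break;
    } catch (e) {
      // Timeouts and network errors fall through to the next attempt.
    } finally {
      clearTimeout(timer);
    }
  }
  throw new Error('The request failed after several attempts. ' +
    'Please check your connection and try again.');
}
```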

Which also means looking at other parts of the process to optimize beyond just that.

Of course!

But network latency is something that software cannot control. It can vary tremendously. The only way to not be impacted by it is to use the network less. Which means doing more on the client side, either by moving logic to JS (hard), or by caching more things on the client side (hard), or something else (surely also hard). Of course, more JS means also more to download and invalidate, so potentially more things in the critical path. So it's a very tough thing to balance.

I wish we didn't have to think about network latency. Then we could close this issue right now.

We developers need to stop testing sites on our fast computers with locally hosted instances and the latest browser versions. That's not how real users experience what we build.


I realize that was a bit rant-y, but I think that it was perhaps necessary: it seems few people consider all ramifications of network latency and network errors. Hopefully this helps.


I still stand by #45. This is just explaining that network latency matters greatly, and similarly so do network errors. I'm not approving a particular direction. Merely explaining a problem that Drupal will need to address at some point.

almaudoh’s picture

Try using Reddit, or Twitter, or … drupal.org from the EU.

...try from Africa, where most internet is served by 3G GSM networks... :(

kid_icarus’s picture

+1 Drupal 9 rewritten entirely in Node.js

giorgio79’s picture

+1 for Node JS and +1 for #29 webchick

JS has become the biggest programming language. Just check out the GitHub stats at http://githut.info, with almost 3x as many active repos as PHP. So maybe it's time to get off the smaller PHP island and step onto the bigger JS one :P

Also, I've been working on a Google Docs plugin, and pretty much the server side is gone! Google APIs are all JavaScript-based.
Caching something is as simple as calling
cache.put('foo', 'bar');
https://developers.google.com/apps-script/reference/cache/cache#put(String,String).

If not Drupal, certainly another JS based CMS will emerge.

corbacho’s picture

I think that for a huge change like this (Drupal + Node.js) it would be easier if it started like the Acquia Drupal Spark initiative: developed in-house, but in the open. Otherwise a design-by-committee approach, staffed by scarce Drupal JS dev volunteers, would take too much time and drain the energy of both the JS developers trying to avoid all the bikeshedding and the PHP devs/themers who think: why learn D8 when it's going to be radically different in D9? (This happened with the Angular 1.x/2.x transition; people felt their skills were suddenly useless.)

For example, Automattic's Calypso project is the result of a private company's 20-month "experiment" with 127 contributors, all or mostly employees. They didn't force it on normal wordpress.org software users as a requirement, only on wordpress.com "online" users, so there was no friction. And also notice that Calypso is open source, yes, but totally controlled and hosted by Automattic. You can't write, let's say, a plugin for Calypso.

Node.js deployment, maintenance and hosting is a pain. It's a constantly changing ecosystem that's hard to keep up with. I wonder if this would change.

droplet’s picture

@corbacho,

Well-said!

markabur’s picture

Node.js deployment, maintenance and hosting is a pain. It's a constantly changing ecosystem that's hard to keep up with. I wonder if this would change

I wondered about this over in the Backbone.js thread, and that part of my comment probably should have been posted here instead. I wrote:

I'm more interested in the business side of this discussion. Currently I spend 2-3 hours per D7 site per year installing security updates. It's super-easy and rarely troublesome. How would that change if Drupal gets rewritten in Node.js or some other Javascript framework? Could it possibly be lower? I doubt it. How could I justify higher maintenance costs to clients if most of the benefits are for JS developers and site builders? Saving a few milliseconds here and there (possibly, sometimes, depending on the connection) on administrative forms would not lower my development costs. When I'm building sites in Drupal I spend far more time thinking about what to do next than actually clicking buttons or waiting for the Ajax spinner.

[edit: snipped a bunch of unnecessary personal background info]

Finally, I have what is hopefully a constructive idea. If people want Drupal to be a Node.js-based CMS, maybe it would be good to review the other Node.js CMSes out there and see how they work? What can we learn from them?

Crell’s picture

@corbacho: Automattic's Calypso is, as I understand it, a desktop app written using Electron, the same tool that Slack's desktop app uses, to use HTML/CSS/JS for desktop development. It's more akin to Adobe Air. As a desktop app it talks back to the server using only Web Service APIs. (I don't know if it's doing any websockets under the hood or not; I think Slack does, though.)

That's entirely different than what is being discussed here... That said, I would be very interested to see someone try to build a similar desktop client for Drupal 8. It would be a major test and validation (or perhaps condemnation?) of our API support, and help drive fixing the remaining issues with it. If anything, I think that's more valuable, at this point in time, than either this issue or #2645250: [META] Supersede Backbone in core admin UIs with a new client-side framework.

bojanz’s picture

@Crell
Calypso is a React app, the desktop part is irrelevant (and comes as a perk of the stack).

larowlan’s picture

That said, I would be very interested to see someone try to build a similar desktop client for Drupal 8. It would be a major test and validation (or perhaps condemnation?) of our API support, and help drive fixing the remaining issues with it. If anything, I think that's more valuable, at this point in time, than either this issue or #2645250: [META] Supersede Backbone in core admin UIs with a new client-side framework.

I agree with this.

One thing that hasn't been mentioned in any of these issues is Ext JS (formerly YUI-Ext; there is a GPL (v3) version https://www.sencha.com/legal/gpl/). So yeah, licensing might be tricky given GPL v3 isn't compatible with GPL v2, but if we're building 'desktop-like apps' then it is by far the most mature JavaScript framework. It is now up to version 6 and has already addressed many of the shortcomings being raised about other SPAs, e.g. a11y. How many JavaScript frameworks from 2006 are still around?

Before I came to Drupal, I was building SPA-like full-blown custom ERPs using Ext JS. I even built some for Drupal, using custom page callbacks that sent/received JSON (we're talking D5 and D6 here).

Docs are here http://docs.sencha.com/extjs/6.0/

Example here http://examples.sencha.com/extjs/6.0.0/examples/admin-dashboard/

FWIW Ext JS is what Xero uses.

catch’s picture

@bojanz it's react, but it's 100% decoupled and only deals with Wordpress as a REST API.

corbacho’s picture


Calypso is a React app for WordPress that you can use in 3 ways:
* Web version, to manage your wordpress.com blog. See the attached screenshot of the admin UI of my test blog on wordpress.com.
* Web version, to manage your self-hosted WordPress blog, through the JetPack plugin (XML-RPC).
* Desktop version, as Crell said, encapsulated in Electron. I don't think it's as relevant as the web version.

Good admin UX, without the burden of hosting anything. Do you know any blogger who wants to deploy/maintain Node.js?

We are doing the opposite here: if we require Node.js as a dependency, we are adding more friction and trouble for site builders who want to use Drupal 9. It would be the final proof to the world that Drupal is no longer for site builders, just the enterprise.

Here they explain the trend quite well; some food for thought:
http://chrislema.com/success-of-wordpress/

I'm not sure how Drupal could take a similar approach without a big company stepping in. I can't imagine something like Calypso being built on a voluntary basis.

mrf’s picture

@markabur In terms of CMS "competitors" written purely in Node, I've been working my way through getting these installed locally in order to evaluate them:
http://blog.budacode.com/2015/05/08/node-js-cms-framework-comparison/
and
http://y-designs.com/blog/node-cms-comparison-2015/

The setup for a lot of these proved more difficult than my limited time for this type of exploration allowed, but if I ever finish I promise to write the blog post that triggered this exploration.

Even in my limited exploration I learned a lot about what a Node-based CMS feels like compared to what we are used to seeing. It was also interesting to see what solutions others came up with when approaching the CMS problem with fresh eyes and fresh tools. I'd encourage others to experiment with these as well.

philipz’s picture

I wanted to shed some light on a potentially amazing project that is being built right now by the creators of Meteor. The new project is named Apollo, and I think it could have the potential to be part of the transformation discussed here.

So basically it's a GraphQL-based platform that has two main components.
The first one is the Apollo server, capable of connecting to multiple backends like MongoDB, REST endpoints or a SQL database.
On the client side there's a second component - the Apollo client - which integrates with front-end frameworks/libraries like React+Redux or Angular 2, and of course Meteor.

Besides that unified data access, it's built with all the amazing features that Meteor has, like reactivity and optimistic UI updates, to name just a few.

The development of the project is moving at an incredible pace, and there's a technical preview available already:
https://medium.com/apollo-stack/introducing-the-apollo-graphql-data-stac...

I just wanted to let you know about it, because when I discovered it I immediately saw a possibility for this technology to be used in building headless Drupal websites; maybe it could also be a foundation for progressively making the bigger transition.

They also have been publishing a lot on Medium:
https://medium.com/apollo-stack

fgm’s picture

@philipz, fubhy has already done a lot with GraphQL, see at http://zensations.at/blog/graphql-coming-drupal

Also, FWIW, there are now a number of Drupal/Meteor integration solutions, even for D8 (including one we built), so I confirm this can be an interesting trend for rich UIs in front of a Drupal store, with or without GraphQL.

adam.weingarten’s picture

There are a lot of good ideas here. It seems that we are bickering about implementation before talking about the technical and business drivers. If we can talk about those drivers and agree on some common high-level goals, then there can be a meaningful discussion about implementation/technology.

1: As Drupal matures we add more layers of abstraction. Layers of abstraction slow things down. Nothing wrong with that. There is a reason we no longer code in assembly. There is tremendous overhead to the Drupal bootstrap. D8's caching helps hide it. Uncached Drupal will inevitably get slower though.

2: Drupal is a great CMS. It's a crappy transactional processor. If you look at the benchmarks below, Symfony2's transactional speeds are mediocre at best. This means that Drupal will not scale well for high-volume transactions. Imagine a webform that gets tens of thousands of posts a minute. Drupal might not be the best platform to handle that load.
https://www.techempower.com/benchmarks/

3: The lack of high transactional throughput means that while Drupal could theoretically do WebSockets, it could not do so at scale.

Overall, it seems like the central issue is how we get data in and out quickly and at scale. If we can make Drupal a better transactional processor, it doesn't really matter what technology we use.

catch’s picture

Project: Drupal core » Drupal core ideas
Version: 9.x-dev »
Component: javascript » Idea

Moving to the ideas queue.

Fabianx’s picture

Chiming in late, but +1 for async PHP (also see my "Future of Drupal Performance" session from New Orleans for some action points on how to achieve that), and -1 for Node.js in general.

If someone wants to experiment with a Node.js or Apollo server that uses the Drupal REST / GraphQL server (from the API-first initiative) as a backend, for sure.

But rewriting core is most likely not the best idea for that.

The idea of making the data processors (FAPI / REST) and data preparation layers more independent from the presentation layer has its merits though.