Composer BoF at DrupalCon Baltimore

Posted by Jeff Geerling's Blog - 25 Apr 2017 at 20:07 UTC

Tomorrow (Wednesday, April 26), I'm leading a Birds of a Feather (BoF) at DrupalCon Baltimore titled Managing Drupal sites with Composer (3:45 - 4:45 p.m. in room 305).

I've built four Drupal 8 websites now, and for each site, I have battle scars from working with Composer (read my Tips for Managing Drupal 8 projects with Composer). Even some of the tools that I use alongside Composer—for project scaffolding, managing dependencies, patching things, etc.—have changed quite a bit over the past year.
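
For readers new to this workflow, here's a minimal sketch of the kinds of commands involved (this assumes the community drupal-composer/drupal-project template and the cweagans/composer-patches plugin, which are common choices but not necessarily the exact setup described in this post):

# Scaffold a new Drupal 8 project from the community Composer template.
composer create-project drupal-composer/drupal-project:8.x-dev my-site --stability dev --no-interaction

# Add a contributed module as a dependency.
cd my-site
composer require drupal/pathauto

# Pull in the patching plugin; individual patches are then declared under
# "extra": { "patches": { ... } } in composer.json.
composer require cweagans/composer-patches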

As more and more Drupal developers adopt a Composer workflow for Drupal, we are solving some of the most painful problems.

For example:

Does Drupal have a minor upgrade problem?

Posted by aleksip.net - 25 Apr 2017 at 17:47 UTC
Drupal 8 has a new upgrade model, and the promise is to make upgrades easy forever. The idea behind the upgrade model is great, and has already been proven in other projects like Symfony. However, there might still be some issues that need to be solved, as demonstrated by the recent 8.3 release and the security release that followed it.

Drupal is API-first, not API-only

Posted by Dries Buytaert - 25 Apr 2017 at 16:59 UTC

More and more developers are choosing content-as-a-service solutions known as headless CMSes — content repositories which offer no-frills editorial interfaces and expose content APIs for consumption by an expanding array of applications. Headless CMSes share a few common traits: they lack end-user front ends, provide few to no editorial tools for display and layout, and as such leave presentational concerns almost entirely up to the front-end developer. Headless CMSes have gained popularity because:

  • Developers want to separate concerns of structure and presentation so that front-end teams and back-end teams can work independently from each other.
  • Editors and marketers are looking for solutions that can serve content to a growing list of channels, including websites, back-end systems, single-page applications, native applications, and even emerging devices such as wearables, conversational interfaces, and IoT devices.

Due to this trend among developers, many are rightfully asking whether headless CMSes are challenging the market for traditional CMSes. I'm not convinced that headless CMSes as they stand today are where the CMS world in general is headed. In fact, I believe a nuanced view is needed.

In this blog post, I'll explain why Drupal has one crucial advantage that propels it beyond the emerging headless competitors: it can be an exceptional CMS for editors who need control over the presentation of their content and a rich headless CMS for developers building out large content ecosystems in a single package.

As Drupal continues to power the websites that have long been its bread and butter, it is also used more and more to serve content to other back-end systems, single-page applications, native applications, and even conversational interfaces — all at the same time.

Headless CMSes are leaving editors behind

This diagram illustrates the differences between a traditional Drupal website and a headless CMS with various front ends receiving content.

Some claim that headless CMSes will replace traditional CMSes like Drupal and WordPress when it comes to content editors and marketers. I'm not so sure.

Where headless CMSes fall flat is in the areas of in-context administration and in-place editing of content. Our outside-in efforts, in contrast, aim to allow an editor to administer content and page structure in an interface alongside a live preview rather than in an interface that is completely separate from the end user experience. Some examples of this paradigm include dragging blocks directly into regions or reordering menu items and then seeing both of these changes apply live.

By their nature, headless CMSes lack a full-fledged editorial experience integrated into the front ends to which they serve content. Unless they expose a content editing interface tied to each front end, in-context administration and in-place editing are impossible. In other words, to provide an editorial experience on the front end, that front end must be aware of that content editing interface — hence the necessity of coupling.

Display and layout manipulation is another area that is key to making marketers successful. One of Drupal's key features is the ability to control where content appears in a layout structure. Headless CMSes are unopinionated about display and layout settings. But just like in-place editing and in-context administration, editorial tools that enable this need to be integrated into the front end that faces the end user in order to be useful.

In addition, editors and marketers are particularly concerned about how content will look once it's published. Access to an easy end-to-end preview system, especially for unpublished content, is essential to many editors' workflows. In the headless CMS paradigm, developers have to jump through fairly significant hoops to enable seamless preview, including setting up a new API endpoint or staging environment and deploying a separate version of their application that issues requests against new paths. As a result, I believe seamless preview — without having to tap on a developer's shoulder — is still necessary.

Features like in-place editing, in-context administration, layout manipulation, and seamless but faithful preview are essential building blocks for an optimal editorial experience for content creators and marketers. For some use cases, these drawbacks are totally manageable, especially where an application needs little editorial interaction and is more developer-focused. But for content editors, headless CMSes simply don't offer the toolkits they have come to expect; they fall short where Drupal shines.

Drupal empowers both editors and application developers

This diagram illustrates the differences between a coupled — but headless-enabled — Drupal website and a headless CMS with various front ends receiving content.

All of this isn't to say that headless isn't important. Headless is important, but supporting both headless and traditional approaches is one of the biggest advantages of Drupal. After all, content management systems need to serve content beyond editor-focused websites to single-page applications, native applications, and even emerging devices such as wearables, conversational interfaces, and IoT devices.

Fortunately, the ongoing API-first initiative is actively working to advance existing and new web services efforts that make using Drupal as a content service much easier and more optimal for developers. We're working on making developers of these applications more productive, whether through web services that provide a great developer experience like JSON API and GraphQL or through tooling that accelerates headless application development like the Waterwheel ecosystem.

For me, the key takeaway of this discussion is: Drupal is great for both editors and developers. But there are some caveats. For web experiences that need significant focus on the editor or assembler experience, you should use a coupled Drupal front end which gives you the ability to edit and manipulate the front end without involving a developer. For web experiences where you don't need editors to be involved, Drupal is still ideal. In an API-first approach, Drupal provides for other digital experiences that it can't explicitly support (those that aren't web-based). This keeps both options open to you.

Drupal for your site, headless Drupal for your apps

This diagram illustrates the ideal architecture for Drupal, which should be leveraged as both a front end in and of itself as well as a content service for other front ends.

In this day and age, having all channels served by a single source of truth for content is important. But what architecture is optimal for this approach? While reading this you might have also experienced some déjà-vu from a blog post I wrote last year about how you should decouple Drupal, which is still solid advice nearly a year after I first posted it.

Ultimately, I recommend an architecture where Drupal is simultaneously coupled and decoupled; in short, Drupal shines when it's positioned both for editors and for application developers, because Drupal is great at both roles. In other words, your content repository should also be your public-facing website — a contiguous site with full editorial capabilities. At the same time, it should be the centerpiece for your collection of applications, which don't necessitate editorial tools but do offer your developers the experience they want. Keeping Drupal as a coupled website, while concurrently adding decoupled applications, isn't a limitation; it's an enhancement.

Conclusion

Today's goal isn't to make Drupal API-only, but rather API-first. It doesn't limit you to a coupled approach like CMSes without APIs, and it doesn't limit you to an API-only approach like Contentful and other headless CMSes. To me, that is the most important conclusion to draw from this: Drupal supports an entire spectrum of possibilities. This allows you to make the proper trade-off between optimizing for your editors and marketers, or for your developers, and to shift elsewhere on that spectrum as your needs change.

It's a spectrum that encompasses both extremes of the scenarios that a coupled approach and headless approach represent. You can use Drupal to power a single website as we have for many years. At the same time, you can use Drupal to power a long list of applications beyond a traditional website. In doing so, Drupal can be adjusted up and down along this spectrum according to the requirements of your developers and editors.

In other words, Drupal is API-first, not API-only, and rather than leave editors and marketers behind in favor of developers, it gives everyone what they need in one single package.

Special thanks to Preston So for contributions to this blog post and to Wim Leers, Ted Bowman, Chris Hamper and Matt Grill for their feedback during the writing process.

Using BrowserSync with Drupal 8

Posted by Aten Design Group - 25 Apr 2017 at 15:15 UTC

TLDR: Just here for Browsersync setup? Skip to the steps.

I’m always looking for ways to reduce the time between saving a file and seeing changes on the screen.

When I first started working with Drupal, back in the day, I would ftp changes to a server and refresh the page. Once I found Transmit, I could mount a remote directory locally and automatically save files to the server. At the time, my mind was blown by the efficiency of it. In hindsight, it seems so archaic.

Then I started working with development teams. Servers were typically managed with version control. If I wanted to make changes to the server, I had to commit code and push it up. It was time to act like a grown up and develop with a local server.

The benefits of developing locally are too numerous to list here, but the most obvious is not having to push changes to a remote server to see the effects. I could just save a file, switch to the browser and refresh! What was taking minutes was reduced to seconds.

My progression of development practices went something like this.

“Oy, hitting cmd+r is so tedious. What’s this? LiveReload? OH MY GOD! It refreshes the page immediately upon saving! It even injects my CSS without refreshing the page! I’m so super fast right now! Whoa! Less! I can nest all my CSS so it matches the DOM exactly! Well that was a terrible idea. Not Less’s fault but ‘Hello Sass!’. Because Compass! And Vertical Rhythm! And Breakpoint! And Susy! Holy crap, it’s taking 8-12 seconds to compile CSS? Node Sass, you say? That’s so much faster! Back down to 3 seconds. Wait a minute, everything I’m using Compass for, PostCSS handles. BOOM! Milliseconds!”

This boom and bust cycle of response times of mine has been going on for years and will continue indefinitely. Being able to promptly see the effects of your changes is incredibly important.

In 1968, Robert Miller published a paper in which he defined 3 thresholds of human-computer response times.

  • 0.1 sec. To feel like you are directly manipulating the machine
  • 0.1 - 1 sec. To feel like you are uninhibited by the machine, but long enough to notice a delay
  • > 10 sec. At this point, the user has lost focus and is likely to move on to another task while waiting for the computer.

In short, less than a second between making a file change and seeing the result is ideal. More than that and you’re likely to lose focus and check your email, Slack or Twitter. If it takes longer than 10 sec. to compile your code and refresh your browser, your productivity is screwed.

One of my favorite tools for getting super fast response times is Browsersync. Browsersync is a Node.js application that watches your files for changes then refreshes your browser window or injects new CSS into the page. It’s similar to LiveReload but much more powerful. The big difference is that Browsersync can be run in multiple browsers and on multiple devices on the same LAN at the same time. Each connected browser will automatically refresh when code changes. It also syncs events across browsers. So if you scroll the page in one browser, all your connected browsers scroll. If you click in one browser, all browsers will load the new page. It’s incredibly useful when testing sites across devices.

For simple sites, Browsersync will boot up its own local http server. For more complex applications requiring their own servers, such as Drupal, Browsersync will work as a proxy server. Setup is pretty simple but I’ve run into a few gotchas when using Browsersync with Drupal.

Setting up Browsersync

Local dev environment

First things first, you’ll need a local dev environment. That’s beyond the scope of this post. There are a few great options out there including DrupalVM, Kalabox and Acquia Dev Desktop.

For this tutorial, we’ll assume your local site is being served at http://local.dev

Install Node

Again, out of scope. If you don’t already have node installed, download it and follow the instructions.

Install Browsersync

Browsersync needs to be installed globally.

npm install browser-sync -g

Change directories to your local site files.

cd wherever/you/keep/your/drupal/docroot

Now you can start Browsersync like so:

browser-sync start --proxy "local.dev" --files "**/*.twig, **/*.css, **/*.js"

The above will start a Browsersync server, available by default at http://localhost:3000 (Browsersync prints the exact URLs when it starts). It will watch for changes to any Twig templates, CSS or JS files. When a change is detected, it will refresh any browsers that have the site open. In the case of CSS, it will inject just the changed CSS file into the page rather than doing a full reload.

Making Drupal and Browsersync play nicely together

There are a few tweaks we'll need to make to our Drupal site to take full advantage of our Browsersync setup.

Disable Caching

First we want to make sure page caching is disabled so when we refresh the page, it’s not serving the old cached version.

Follow the steps outlined under “Enable local development settings” in the Disable Drupal 8 caching documentation.

Now when you change a Twig template file, the changes will be reflected on the page without having to rebuild the cache. Keep in mind, if you add a new file, you’ll still need to rebuild the cache.
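
As a rough sketch, the relevant lines in settings.local.php usually end up looking something like the following (this assumes the standard example.settings.local.php that ships with core; adjust the paths and cache bins to your own setup):

// sites/default/settings.local.php (excerpt)
// Enabled by uncommenting the settings.local.php include at the bottom of settings.php.

// Pull in the development services file, which provides the null cache backend
// and is where Twig debug/auto_reload settings live.
$settings['container_yamls'][] = DRUPAL_ROOT . '/sites/development.services.yml';

// Disable the render cache and the dynamic page cache so template changes
// show up on the next refresh.
$settings['cache']['bins']['render'] = 'cache.backend.null';
$settings['cache']['bins']['dynamic_page_cache'] = 'cache.backend.null';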

Avoid loading CSS with @import statements

This gotcha was much less obvious to me. Browsersync will only recognize and refresh CSS files that are loaded with a link tag like so:

<link rel="stylesheet" href="/sites/.../style.css" media="all">

It does not know what to do with files loaded via @import tags like this:

<style>
  @import url('/sites/.../style.css');
</style>

This is all well and good, as Drupal uses link tags to load stylesheets, unless you have a whole lot of stylesheets: more than 31, to be exact. You see, IE 6 & 7 had a hard limit of 31 individual stylesheets and would ignore any CSS files beyond that. It's fairly easy to exceed that maximum when CSS aggregation is turned off, since any module or base theme installed on your site can potentially add stylesheets. Drupal has a nice workaround for this: it switches to the aforementioned @import statements if it detects more than 31 stylesheets.

We have two ways around this.

1. Turn preprocessing off on specific files

The first involves turning CSS aggregation (a.k.a. CSS preprocessing) on for your local site and manually disabling it for the files you are actually working with.

In settings.local.php, set the following to TRUE:

$config['system.performance']['css']['preprocess'] = TRUE;

Then in your [theme_name].libraries.yml file, turn preprocessing off for any files you are currently working on, like in the following example.

global:
  version: 1.0.x
  css:
    theme:
      build/libraries/global/global.css: { preprocess: false }

This will exclude the file from the aggregated CSS. This approach ultimately leads to a faster site, as your browser loads considerably fewer CSS files. However, it does require more diligence on your part to manage which files are preprocessed. Keep in mind, if you commit the above libraries.yml file as is, the global.css file will not be aggregated on your production environment either.

2. Use the Link CSS module

The easier alternative is to use the Link CSS module. When enabled, this module overrides Drupal's @import workaround and loads all files via link tags regardless of how many there are. The only downside to this approach is the potential performance hit of still loading all the unaggregated CSS files, which may not be a big deal in your environment.

Add a reload delay if needed

In some cases, you may want to add a delay between when Browsersync detects a change and when it attempts to reload the page. This is as simple as passing a flag to your browser-sync command, like so:

browser-sync start --proxy "d8.kbox.site" --files "**/*.twig, **/*.css, **/*.js" --reload-delay 1000

The above will wait 1 second after it detects a file change before reloading the page. The only time I’ve ever needed this is when using Kalabox as a local development environment. Your mileage may vary.

In addition to the reload delay flag, Browsersync has a number of other command line options you may find useful. It's also worth noting that, if the command line approach doesn't suit you, Browsersync has an API that you can tie into your Gulp or Grunt tasks. Here at Aten we use Browsersync tied into Gulp tasks so a server can be started up by passing a serve flag to our build task, similar to the following: gulp compile --watch --serve
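
If you go the Gulp route, the wiring is small. Here's a rough sketch using the Browsersync API (the proxy host and file globs below are placeholders carried over from the examples above; adapt them to your project):

// gulpfile.js (minimal sketch)
var gulp = require('gulp');
var browserSync = require('browser-sync').create();

gulp.task('serve', function (done) {
  browserSync.init({
    proxy: 'local.dev',                            // your existing local Drupal server
    files: ['**/*.twig', '**/*.css', '**/*.js'],   // refresh/inject on these changes
    reloadDelay: 1000                              // optional, same as --reload-delay
  }, done);
});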

The peculiarities of content modeling in Drupal 8

Posted by InternetDevels - 25 Apr 2017 at 13:22 UTC

We continue to describe the functional capabilities of “the big eight”, since Drupal 8 has many improvements over its previous versions. Take a look at the innovations in Drupal 8.3.0, its latest minor release, which came out this month.


Configuration Management dimensions

Posted by Nuvole - 25 Apr 2017 at 12:07 UTC
Key points from our configuration management sessions.

Unfortunately, none of us from Nuvole is attending DrupalCon Baltimore, so we can't join the “hallway track” and the discussions in person. We have given presentations about advanced configuration management in Drupal 8 at all the Drupal events we attended in the last year, including the last DrupalCon in Dublin. So I'll try to cover some concepts here that could help the discussion about how configuration management can be improved.

To me there are at least two important dimensions to configuration management in Drupal 8, and different contrib projects have sprouted to address them:

  • Vertical: transfer configuration between different environments of the same site.
  • Horizontal: transfer configuration between different sites.

Following are a few contrib solutions and core issues that address the different itches. This is not meant to be an exhaustive or definitive list, but rather to highlight the different problem spaces. Many more contrib solutions address issues in this space to accommodate various workflows.

Vertical

Drupal 8 core addresses only this use case; however, there are some important cases not covered (yet). The management of configuration between different environments is best taken care of by importing and exporting the whole site's configuration together.
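
In practice that vertical workflow boils down to a pair of Drush commands (a sketch; where the sync directory lives and how code gets deployed depends on your setup):

# On the source environment (e.g. your local site): export the complete active
# configuration to the sync directory, then commit the resulting YAML files.
drush config-export -y

# On the target environment, after deploying the code and the exported YAML:
drush config-import -y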

Installing from existing configuration:

Contrib solution: Config installer
Core issues: Allow a site to be installed from existing configuration, Allow a profile to be installed from existing config

Environment-specific configuration (i.e. the Devel module enabled only on development environments)

Contrib solution: Config Split
Core issues: Allow exported configuration to be environment-specific, Allow development modules to opt out of config-export

Configuration with Content

Contrib solution: Default Content, Deploy

Horizontal

Because the same tools can be used for both dimensions, and because in Drupal 7 Features was also abused for vertical configuration management, this concept may not be so clear. Ideally, configuration is shared between sites from the development environment of one site to the development environment of another, and the vertical tools in Drupal core are used for deployment. Moving configuration between different sites is thus done by moving a subset of configuration between environments.

Re-using a set of configuration in another site

Contrib Solution: Features
There are many more modules designed to deal with distributions and with the paradigm that sites now own their configuration.

Inheriting installation profiles

Core issue: Allow profiles to provide a base/parent profile

Multisite with large portion of shared configuration

Contrib Solution: Config Split
While this is not the original problem it tries to solve, it is reported to be used for it.

Conclusion

I may have an obvious bias towards Config Split since we maintain that module, but it is just one of many config-related modules. I hope there is a fruitful discussion about configuration management in Drupal 8 at the BoF in Baltimore.

Tags: Drupal 8, Drupal Planet, DrupalCon, Code Driven Development

AMP for Drupal 8

Posted by Code Positive - 25 Apr 2017 at 11:00 UTC

Interested in lightning-fast performance for your mobile site? Pages that load pretty much instantly? Here's how the AMP project has pushed boundaries to achieve exactly that.

Configuring AMP for Drupal 8

Posted by Code Positive - 25 Apr 2017 at 11:00 UTC

Tips and tricks to get AMP working for your Drupal 8 site.

How I learned to stop worrying and love Drupal contribution

Posted by Manifesto - 25 Apr 2017 at 10:34 UTC
Over the last five years I've worked in a variety of senior Drupal developer positions, but my dirty secret is that I've never really contributed back to Drupal. Until the last few weeks, that is. So, I thought I'd take the time to write about why that is, and what's changed.

AGILEDROP: Drupal Logos with Hats

Posted by Agiledrop.com Blog - 25 Apr 2017 at 08:43 UTC
It's back! You did not really think that our Druplicon marathon was finished, did you? Well, it's not. We still have some areas to explore. After Humans and Superhumans, Fruits and Vegetables, Animals, Outdoor activities, National Identities, Emotions and Human Professions, we have seen that not all Druplicons with hats were covered. So, here are Drupal Logos with hats: Druplicon with a Santa hat, Druplicon with snow hats, Druplicon with an ascot hat (Drupal day Bilbao 2014), Druplicon with a fedora (Drupal Camp Buenos Aires 2009), Druplicon with a swimming hat (DrupalCamp…

Drupal 8 Views Plugins (Part 2) : The display extender plugin

Posted by blog.studio.gd - 25 Apr 2017 at 08:04 UTC
Let's see how and why to use a views display extender plugin.

Views Plugins (Part 1) : Simple area handler plugin

Posted by blog.studio.gd - 25 Apr 2017 at 08:04 UTC
In this series I will show you how to make use of the new Drupal 8 plugin system. We begin with a simple example: the views area handler plugins.

Overview of CMI in Drupal 8

Posted by blog.studio.gd - 25 Apr 2017 at 08:04 UTC
Some notes about the new Configuration management system in Drupal 8

Migrate to Drupal 8 from a custom site

Posted by blog.studio.gd - 25 Apr 2017 at 08:04 UTC
Migrate is now included in Drupal core, providing the upgrade path from the 6.x and 7.x versions to Drupal 8.

In this article we will see how to use the Drupal migration framework to migrate custom sites to Drupal 8.

Inline Entity Display

Posted by blog.studio.gd - 25 Apr 2017 at 08:04 UTC
Handle referenced entity fields directly in the parent entity

Drupal 8 Field Layout Alternative to the Display Suite

Posted by OSTraining - 25 Apr 2017 at 01:01 UTC

In Drupal 7, to create custom displays, you would probably use Display Suite.

Drupal 8 just added a potential alternative to Display Suite in the core experimental modules. The Field Layout and Layout Discovery modules will allow you to assign a layout to specific content types.

8 + 1 Must-Follow Twitter accounts for drupalers

Posted by La Drupalera (en) - 24 Apr 2017 at 21:01 UTC

Do you know everything there is to know about Drupal, but don't know who to follow on social networks? Don't worry, we have the solution to your problems!

Here you have the 10 essential Twitter accounts that you have to follow so you don't miss any Drupal news.

10 Twitter accounts you should follow to be a Drupal expert

@drupal: the official Drupal Twitter account

In the official Drupal account you will find the latest news and highlights. They are the first to publish releases, important issues and everything you need to be aware of if you want to be an up-to-date drupaler.


Creating a static archive of a Drupal site

Posted by Verbosity - 24 Apr 2017 at 16:57 UTC

Each year another DrupalCamp comes to pass, and as event organizers we are left with one more site to maintain. After a while this builds up to a lot of sites that need continuous updates. What to do?

When a site is ready to become an archive it can be a good idea to convert it to a static site. Security updates are no longer necessary, but interactive features of the site disappear... which is usually a good thing in this scenario.

Creating a site mirror

Long before I used Drupal this was all possible with wget, and it continues to work today:

#!/bin/bash
# Mirror the site at $1, following only links on that domain, fetching page
# assets, converting links for local browsing, and logging to download.log.
wget -o download.log -N -S --random-wait -x -r -p -l inf -E --convert-links --domains="$1" "$1"

I call this script "getsite"; you use it by typing "getsite example.com".

This is a simple script that I place in the /usr/local/bin folder of the computer I will be using to create the site mirror.

This script will probably take a while to run. You can run tail -f download.log in another terminal to watch the progress.

What does it do?

This is a simple web crawler that will follow all links on the page that you provided, but ONLY the links that are on the same domain.

It will try to fetch ALL the assets that come from this exact domain name you provide.

While doing so, it converts all of the links so that they work in the local copy.

I also have it set to crawl slowly so as not to scare any firewalls we may be traversing.

You can look up all of the command line options by typing man wget on your system.

After running the command you will have a folder with the name of the domain and all of the files for the site, in addition to a download.log file that you can use to audit the download.

It can be very useful to use the utility tree to see all of the files.

Oh noes! All my paths have .html appended now!

Relax. Just as we can do clean URLs with index.php files, we can specify some rules on our webserver to mask that ugly file extension.

In Nginx you can do this as follows:

location / {
  root       /var/www/html;
  index      index.html index.htm;
  try_files  $uri $uri/index.html $uri/ =404;
}

The "try_files" patterns will match what used to be our Drupal clean URLs.

You may also want to add some kind of htpasswd-style restriction if your content is not intended to be available to the public.
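
In Nginx that restriction can be as small as the following sketch (it assumes you have already created the password file with the htpasswd utility; adjust the path to taste):

location / {
  # Require a username/password for the whole archive.
  auth_basic           "Restricted archive";
  auth_basic_user_file /etc/nginx/.htpasswd;

  root       /var/www/html;
  index      index.html index.htm;
  try_files  $uri $uri/index.html $uri/ =404;
}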

It is as simple as that! Wget is a great utility for making site mirrors or legal archives.

Cleaning up loose ends

Your Drupal site is going to have some interactive components that will no longer work.

In particular:

  • User login form
  • Webforms
  • Commenting
  • Anything else using a form and/or a captcha (maybe disable captcha too)

It may be simpler to disable these before taking the snapshot, or alternatively to open the resulting HTML in a text editor and remove the form components after the fact.

You may also want to enable or disable caching of different things depending on what results you get. By default you are probably going to see a lot of security tokens in the downloaded paths, so you may want to disable that... on the other hand, you may want to bundle your CSS to make fewer requests. Review your downloaded archive to see what will be best before you shut down your source site.

Other uses

My team has used variations of this script for a variety of other needs as well:

  • to estimate the size and scope of a migration project;
  • to get a complete list of paths we may want to alias or redirect after a migration;
  • to make an archive of a site for legal proceedings (ie, gathering evidence of copyright infringement);
  • to migrate data from a static archive when source databases do not contain fully rendered content;
  • and finally: to "pepper" the caches of large sites by hitting each URL after a migration when the caches are all cold.

In that last example we use the spider option to "not" download the files, but simply request them and then move on.
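
A sketch of that variation, assuming the same single-argument convention as the getsite script above:

#!/bin/bash
# Request every page on the domain to warm the caches, but keep nothing on disk.
wget -o warmup.log --spider --random-wait -r -l inf --domains="$1" "$1"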

Wget is an extremely powerful tool for mirroring entire sites and gives us an easy way to archive old dynamically-rendered sites with little hassle and zero ongoing maintenance.

To find out what other things you can do with wget just type man wget on your console and read all the options that are available.

5 Steps to Start Interacting with Drupal.org Issues: Practical Tips for Beginners

Posted by Acquia Developer Center Blog - 24 Apr 2017 at 15:53 UTC

The word "community" comes from the Latin communis, which means "what is common and shared by many individuals." From the Drupal perspective, the community is nothing more than a group of people looking for the mutual exchange of knowledge about the technology (the detailed definition can be found here).

Tags: acquia drupal planet

Third & Grove presents at Magento Imagine Conference 2017

Posted by Third & Grove - 24 Apr 2017 at 11:05 UTC
