At least initially, we're not planning to use the reporting from the load testing tool itself. That's fine if you're load testing application + hardware, but these are application-only tests, so we just need something that can generate load (which must include being able to hold a session, submit forms, etc.). If it has nice reporting on top, that may be useful later.

Stuff we need:
- You should be able to write and run load tests locally; a bit of setup is OK, but not too much.
- It either needs a test recorder or a very easy scripting language.

Possible choices:

- Can build test scripts without any code using their supported plugin.
- Allows real-browser testing, so we could add front-end / browsermob-style testing on top later.
- Has an API for writing tests in languages other than Selenium's own.
- There is already Selenium -> SimpleTest code written.

- Can't easily run headless.
- Very closely tied to browsermob for load testing; not keen on relying on third-party services.

- Has a pretty active community that is working on this stuff.
- Can both drive real browsers and do 'headless' testing.

- Tests are written in Ruby, meh.
- No centrally supported test recorders (although there are some projects that do it; no idea how well they work).

- Runs on Node.js, and there are people in the Drupal community using Node already.
- Tests are written in JavaScript; those of us who know JS should be able to adjust easily, and those of us who don't can get some practice in...

- Very new.
- No test recorder, but the API looks nice and simple; personally, I'd rather have a simple API than a nice load test recorder.
- Only does headless testing (but this is by design, so maybe that's OK).

Other things to consider:

JMeter - there are already some generic Drupal load tests written for it. Not really in favour of having our own PHP load test runner, but it is there.

Possibly others - please add them.

I'm far from decided yet, but am leaning towards zombie.js for simplicity.


beejeebus’s picture


I think zombie.js is a good choice.

We also need to decide where to put the code that builds the content for the different test plans.

For now, I'll have a go at some zombie tests that log in and create some content.

Owen Barton’s picture

I have been chatting with Patrick Meenan, who manages WebPagetest and is now working at Google on their web performance project. In addition to the free service, WebPagetest is also available as an open source (BSD) package. It integrates with real browsers (including IE), and has detailed instrumentation and waterfall charts for front-end performance, which IMO is a huge win relative to the other options suggested so far.

It also has a nice web API which allows you to use the site, submit forms etc., and Ryan has been working with various vendors in the web performance space to try and standardize on APIs (which might be nice for Drupal hosting companies, who could then use the same test plans on browsermob to load test their platform).

Patrick kindly set me up with an API key we can use, so we could go with that directly early on, and set up our own WebPagetest environment later on.

catch’s picture

I took a look at webpagetest, the scripting language looks simple enough and the feature set is nice, fully agree it'd be good to get instrumentation for real browsers as part of this (or very easily added).

The main issue is that the tests have to be run from 32-bit Windows.

In practice, that means we're only really going to run these via either the service or centrally managed machines. That's going to make it tricky for people to write and run tests locally while actually developing the test cases.

However, that's probably a trade-off we'll have to make with more or less anything that runs real browsers, so it really depends.

Watir apparently can run headless, but at the moment it looks like that means additional setup and configuration rather than less. Integration of Watir with front-end instrumentation looks to be in progress, but not necessarily mature yet.

We could also try to build the actual load test queuing (which I've been assuming we'd do in Jenkins) so it can trigger test runs from more than one load test runner - that would mean we could support completely separate sets of headless vs. real-browser tests using different frameworks. That commits us to two frameworks rather than one, though, which is not so great either.

catch’s picture

Forgot to mention, I tried to get started with zombie.js last night but got stuck at installing it via npm.

Looks like I wasn't the only one

If the install had actually gone smoothly it would've been < 5 minutes (including node and npm), but if it's going to be buggy like that for everyone, that's a mark against it I think.

Justin also mentioned he ran into problems running the load tests with cookies (i.e. login was completely failing).

beejeebus’s picture

The HTTP request flow looks like this:

1. zombie.js --> Drupal, hits 'user/login': the returned headers from Drupal have no session cookie.

2. zombie.js --> Drupal, POST to 'user/login': the returned headers from Drupal have a session cookie, but also a redirect.

2.a) This is where I need code to run in zombie.js to grab the session cookie headers, save them, and set up the zombie browser to send the cookies on subsequent requests.

3. zombie.js --> Drupal, follow the redirect to 'user/$uid': fails with a 403, because I wasn't able to do 2.a).

Cookie management is not automatic, so you need to grab the cookie headers, save them, then send them on subsequent requests.

dalin’s picture

I'm coming into this discussion a little bit late, but it sounds like we are trying to accomplish two very different things with the same tool and I'm not sure that it will work out. While things like Webpagetest and Selenium (I haven't tried the other tools) can give fairly good front-end performance stats, I'm skeptical about their ability to measure back-end performance. If an HTTP page takes 200ms to generate, and the commit that we are testing adds 20ms to page load time, are these tools going to be able to see that degree of granularity?

catch’s picture

There's some background missing to this issue, currently the only summary of the general plan for this is at

The short version of that is that we aren't planning to use the measuring aspect of any of these load testing tools for back-end performance (we might show it if it's there, but it won't be the main source of data). The 'load test runner' is only going to be used for actually simulating load on the application itself (and hopefully we can get front-end performance stats without too much work on top).

For back end performance, we've discussed the following:

- Put Apache, MySQL etc. into cgroups - these allow measuring resource usage for distinct sets of functionality, and should be consistent even across different hardware. Damien suggested this initially and Narayan has done some initial work on a Puppet config.
- Record + aggregate + provide a user interface for XHProf runs.
- NewRelic has offered the Drupal community a free account for this (I just have to e-mail them to get it set up).
- Probably other things, like the MySQL slow log (millisecond patch + logging queries not using indexes, etc.).
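For reference, the slow-log bullet above would translate into a my.cnf fragment along these lines. The option names assume MySQL 5.1+, where `long_query_time` accepts fractional seconds natively; on older versions that's what the millisecond patch provides. The threshold and file path here are placeholders, not a recommendation:

```ini
[mysqld]
slow_query_log                = 1
slow_query_log_file           = /var/log/mysql/slow.log
long_query_time               = 0.05   # 50ms threshold; needs 5.1+ (or the ms patch)
log_queries_not_using_indexes = 1
```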

We're hoping we can get enough detail from these, individually and combined, and present it well enough, that we don't need to rely on the load test tool's reporting at all.

catch’s picture

msonnabaum pointed me to capybara:

This is Ruby again, but it appears to be the only thing that can drive both headless runners (including zombie.js) and real browsers.

The tests look like this:

Scenario: Deleting an article
Given an article exists
When I go to the article's page
And I follow "Delete"
Then I should see "Article deleted"

Looks like we could get headless testing with a fairly simple to install driver like zombie.

Then real browser testing with instrumentation (not sure how the instrumentation bolts in but presumably it just needs a driver for something that does it)?

And it could also handle javascript acceptance testing, of which we have zero at the moment.

Owen Barton’s picture

If we used Capybara with Selenium, we could potentially use ShowSlow to collect some basic metrics from browser plugins (YSlow/PageSpeed/dynaTrace). It does not capture the same level of detail as WebPagetest, though. It is also not very straightforward, as we would need to build a method to correlate submissions to ShowSlow with individual test results after the fact, rather than having them reported back to the caller after requesting a test run (which is how WebPagetest would work). We would also need to build our own reports, as the ShowSlow ones won't work for us (they are mostly oriented to showing changes over time).

While I can see the attraction of allowing people to develop tests fully locally, I am not sure this should have a huge sway on our tool selection. As long as people have a publicly accessible sandbox, we could easily give them the ability to use an instance of WebPagetest to test their load scripts. Also, I presume we would have our own t.d.o-like infrastructure to enable easy testing of a specific patch (and actually record cgroup data etc.) - this same infrastructure could be used to test changes in load scripts too. It does mean users wouldn't be able to test their load scripts on (most) air flights or in jungles/deserts, and they would need access to some kind of publicly accessible site (could be the free-for-life thing from DC Chicago). I don't feel like either of these is a major challenge for most of the people who would be writing/maintaining load scripts.

ygerasimov’s picture

I have also seen a project that, by its description, runs a lightweight headless WebKit browser. I have no idea how to write tests for it yet, but as it is close to the Selenium WebDriver project, it might be compatible in the end. We should try it and check what state it is currently in.

attiks’s picture

Linking back to some related issues:
#1482982: Run QUnit SimpleTests
#1489382: Test Swarm: Alternatives - Complements

Also have a look at TestSwarm - it supports scenarios like those outlined in #8.