Reasons for using the batch API:

1) Running all tests takes a lot of time; the Batch API was designed for exactly this.
2) It would be more user-friendly to see progress than to wait idly for a page to load.
3) Doing this in multiple HTTP requests would be less resource-intensive.
4) Plus it would be really awesome.

I have attempted to alter SimpleTest to use the Batch API, but I could not seem to get it to work. Perhaps someone with more experience with the Batch API could write this/help me with this?
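
For anyone picking this up, a minimal sketch of how the test form's submit handler could hand the selected tests to the Batch API might look like the following. All names here ($form_state['values']['tests'], _simpletest_batch_operation(), etc.) are made up for illustration, not taken from any existing patch:

  function simpletest_test_form_submit($form, &$form_state) {
    $batch = array(
      'title' => t('Running tests'),
      // A single multipass operation that is called once per HTTP request
      // until it reports itself finished.
      'operations' => array(
        array('_simpletest_batch_operation', array($form_state['values']['tests'])),
      ),
      'finished' => '_simpletest_batch_finished',
    );
    batch_set($batch);
    // In a form submit handler, the Form API triggers batch_process() itself.
  }

  function _simpletest_batch_operation($test_classes, &$context) {
    if (!isset($context['sandbox']['remaining'])) {
      $context['sandbox']['remaining'] = $test_classes;
      $context['sandbox']['max'] = count($test_classes);
    }
    // Run one test class per request, then report progress.
    $class = array_shift($context['sandbox']['remaining']);
    // ... instantiate and run $class, record its results ...
    $context['message'] = t('Processed %class', array('%class' => $class));
    $context['finished'] = 1 - count($context['sandbox']['remaining']) / $context['sandbox']['max'];
  }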

Comment  File  Size  Author
#129  fix-run-tests-243773-129.patch  2.77 KB  dmitrig01
#127  fix-run-tests-243773-127.patch  5.5 KB  pwolanin
#123  simpletest-batchapi-243773-122.patch  82.83 KB  chx
#114  simpletest-batchapi-243773-114.patch  83.39 KB  yched
#110  simpletest-batchapi-243773-109.patch  84.59 KB  cwgordon7
#109  simpletest-batchapi-243773-108.patch  84.87 KB  dmitrig01
#105  simpletest-batchapi-243773-103.patch  77.18 KB  yched
#101  simpletest_batchapi_243773_100.patch  79.35 KB  cwgordon7
#102  simpletest_batchapi_243773_100.patch  79.35 KB  cwgordon7
#100  simpletest-batchapi-243773-100.patch  68.16 KB  yched
#97  simpletest-batchapi-243773-97.patch  68.28 KB  yched
#90  simpletest-batchapi-243773-90.patch  67.77 KB  boombatower
#86  simpletest-batchapi-243773-86.patch  71.28 KB  webchick
#85  simpletest_batchapi_85.patch  68.2 KB  chx
#84  simpletest_batchapi_84.patch  62.76 KB  chx
#82  simpletest_batchapi_882768_82.patch  44.5 KB  dmitrig01
#80  simpletest_batchapi_882768_80.patch  47.62 KB  dmitrig01
#79  simpletest_batchapi_243773-77.patch  33.08 KB  chx
#79  webchick_will_love_this.png  12.11 KB  chx
#77  simpletest_batchapi_243773-76.patch  32.53 KB  chx
#75  simpletest_batchapi_243773-75.patch  32.29 KB  chx
#71  simpletest_batchapi_243773-71.patch  10.13 KB  yched
#63  simpletest_batchapi_243773-63.patch  9.77 KB  yched
#62  simpletest_batchapi_243773-62.patch  9.77 KB  chx
#61  simpletest_batch_callbacks.txt  1.62 KB  yched
#60  simpletest_batch_callbacks.txt  1.43 KB  yched
#55  simpletest_batchapi_243773-54.patch  9.82 KB  chx
#54  Picture 3.png  35.68 KB  webchick
#53  simpletest_batchapi_243773-52.patch  9.66 KB  chx
simpletest_batchapi_243773-51.patch  10.54 KB  chx
simpletest_batchapi_243773-50.patch  10.55 KB  chx
simpletest_batchapi_243773-49.patch  10.56 KB  chx
#48  simpletest_batchapi_243773-48.patch  9.49 KB  chx
#32  simpletest_batchapi_05f.patch  10.54 KB  Damien Tournoud
#27  simpletest_batchapi_05e.patch  10.71 KB  Damien Tournoud
#23  simpletest_batchapi_05d.patch  9.92 KB  Damien Tournoud
#21  simpletest_batchapi_04.patch  6.87 KB  cwgordon7
#20  batchapi-simpletest-3.patch  7.15 KB  Damien Tournoud
#19  batchapi-simpletest-2.patch  7.22 KB  Damien Tournoud
#18  batchapi-simpletest.patch  4.78 KB  Damien Tournoud
#17  simpletest_batchapi_03.patch  6.27 KB  cwgordon7
#13  simpletest_batchapi_02.patch  4.35 KB  cwgordon7
#12  simpletest_batchapi_01.patch  4.31 KB  cwgordon7

Comments

boombatower’s picture

This was mentioned by moshe weitzman in http://drupal.org/node/239565#comment-785646.

If someone wants to write a patch I think it would be a very nice addition, but I will focus on getting all the tests written, passing, and API fixed.

moshe weitzman’s picture

It would be cool, but really the web UI is pretty lousy for real developers. I am expecting most developers to run tests from the command line. I am using drush. The drush simpletest runner still needs some features, but it works for now. There are no timeout issues with CLI PHP.

cwgordon7’s picture

But for the most part, those who will be testing Drupal patches will not be hard-core cli developers: they will be using the web interface. Why not make it as good as possible?

boombatower’s picture

Patches should eventually be automatically tested. Although I suppose that is, for now, only planned for core patches.

Either way, I use the web interface, and I consider myself hard-core. :)

Since Drupal is a web development platform I think a nice web interface makes sense. Obviously since it can be done, as moshe weitzman has pointed out, it isn't of the utmost urgency. If someone would like to make a working patch that would be great.

boombatower’s picture

Status: Active » Closed (won't fix)

This was decided against, in part due to the fact that no one was sure how to do it.

cwgordon7’s picture

"Was decided against" by whom?

And in any case, wouldn't a working patch change the situation completely? So shouldn't this be marked as a "feature request" instead of a "task", rather than "won't fix"?

boombatower’s picture

Project: Drupal core » SimpleTest
Version: 7.x-dev »
Component: simpletest.module » Code

Sure.

I would like to see some action within the next several weeks. SimpleTest has too many rotting issues.

Note: it was mentioned several times at the sprint.

boombatower’s picture

Project: SimpleTest » Drupal core
Version: » 7.x-dev
Component: Code » simpletest.module
Category: task » feature
Status: Closed (won't fix) » Active

Moving.

dropcube’s picture

Project: SimpleTest » Drupal core
Version: » 7.x-dev
Component: Code » simpletest.module

subscribing... and offering to review/write a patch later.

webchick’s picture

Title: Use the Batch API instead of the current all-in-one approach » SimpleTest: Use the Batch API instead of the current all-in-one approach

Clarifying issue title slightly.

cwgordon7’s picture

Assigned: Unassigned » cwgordon7

Assigning back to self

cwgordon7’s picture

"Progress"

cwgordon7’s picture

Status: Active » Needs work
FileSize
4.35 KB

Ok, several strange bugs occur:

Sometimes drupal_bootstrap() now fails randomly.

Sometimes upon completion of the batch API it inexplicably redirects to admin/build/modules and spews fun errors.

Also, SimpleTest tests now fail. I'm not sure exactly how to tackle it because, as has been previously reported, the SimpleTest browser is unable to handle the Batch API.

cwgordon7’s picture

Oh, and I neglected to mention that for some reason it thinks there are two more steps than there actually are, giving strange UI results.

webchick’s picture

subscribing

cwgordon7’s picture

Plus search ranking tests now give 2 fails... ?

cwgordon7’s picture

Updated patch, still a few bugs.

Damien Tournoud’s picture

FileSize
4.78 KB

Just to keep the ball rolling, here is a refactoring of cwgordon7's patch with cleaner use of the Batch API.

Regression: you can't run all tests with that version yet.

Damien Tournoud’s picture

FileSize
7.22 KB

New progress!

My previous idea of just invoking test cases test-by-test was not very natural. Here is a patch that introduces TestSuite::runOneByOne().
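
The rough idea (not the actual patch code) is a suite method that runs a single contained case per call and tells the caller whether anything is left, so a batch operation can invoke it once per request. The property holding the contained cases is named illustratively here:

  function runOneByOne($index, &$reporter) {
    // Run only the $index-th contained test case.
    if (isset($this->_test_cases[$index])) {
      $this->_test_cases[$index]->run($reporter);
    }
    // TRUE while there is still something left to run.
    return $index + 1 < count($this->_test_cases);
  }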

Damien Tournoud’s picture

FileSize
7.15 KB

I need to learn to save before making patches... Here is a cleaner version.

cwgordon7’s picture

Latest patch has several coding standard issues, new patch attached. Still has many issues though, and I won't have time to work on this over the next few weeks, so more eyes/hands welcome!

cyberswat’s picture

subscribing

Damien Tournoud’s picture

Ok, here is a new version of the patch that really works.

I ran the whole test suite yesterday with an earlier version. I'm going to run it once again, but because this is slow, feel free to do it too!

webchick’s picture

Status: Needs work » Needs review

So is this the right status now? :)

Damien Tournoud’s picture

The results are in. And this looks like a success:

4422 passes, 49 fails and 26 exceptions.

Failed modules: Blog API, Filter, Path, SimpleTest (but a fix is on its way), System ("You may not block your IP address"), Upload.

And well... Statistics succeed!

boombatower’s picture

Status: Needs review » Needs work

Slick!

So it looks like we need simpletest.test to be updated for sure, but we can't be 100% sure about the other tests, or can we?

Blog API: ready patch #260778: Add SimpleTest user agent information to drupal_http_request
Filter: ready patch(es) #266539: Filter tests fail
Path: still broken in core to my knowledge
SimpleTest: should be broken by this patch (so needs updating)
System: I believe is related to #256886: Internal browser and ip_address() inconsistent
Upload: Possibly #253506: contact.test broken (was upload.test)

So this patch should definitely wait to include an update to simpletest.test, but do we want to wait for the others?

Looks like they are all marked as RTBC so if we get those committed then we can confirm this doesn't break those tests. That would only leave path which is broken.

And we would have all core tests passing except one!?

Damien Tournoud’s picture

Status: Needs work » Needs review
FileSize
10.71 KB

Yeah! simpletest.test: Test cases run: 1/1, Passes: 30, Failures: 0, Exceptions: 0

Damien Tournoud’s picture

Ok, please review the latest version of the patch. It looks like we are not breaking anything here.

boombatower’s picture

Status: Needs review » Needs work

Hmm...

When I run simpletest.test I get

28 passes, 2 fails and 4 exceptions.

Found assertion {"SimpleTest pass.", "[Other]", "Pass"}. [Other] Fail
Found assertion {"SimpleTest fail.", "[Other]", "Fail"}. [Other] Fail
Unexpected PHP error [Undefined index: assertions] severity [E_NOTICE] in [/home/jimmy/software/php/drupal/modules/simpletest/simpletest.test line 97] [PHP] Exception
Unexpected PHP error [Invalid argument supplied for foreach()] severity [E_WARNING] in [/home/jimmy/software/php/drupal/modules/simpletest/simpletest.test line 97] [PHP] Exception
Unexpected PHP error [Undefined index: assertions] severity [E_NOTICE] in [/home/jimmy/software/php/drupal/modules/simpletest/simpletest.test line 97] [PHP] Exception
Unexpected PHP error [Invalid argument supplied for foreach()] severity [E_WARNING] in [/home/jimmy/software/php/drupal/modules/simpletest/simpletest.test line 97]

Anyone else confirm this?

boombatower’s picture

Status: Needs work » Reviewed & tested by the community

Sorry, looks good; I forgot to apply the new patch. :0

So do we wait for #26 or try and commit?

cwgordon7’s picture

Status: Reviewed & tested by the community » Needs work

Ok, "Simpletest is initializing" should definitely be "SimpleTest is initializing", so cnw. Plus a few odd bugs, see http://drupal.org/node/267333.

Damien Tournoud’s picture

Status: Needs work » Needs review
FileSize
10.54 KB

This new version solves cwgordon7's concerns in #37 and fixes a small bug I introduced in the previous patch.

Damien Tournoud’s picture

I ran a global test before and after the patch. Results are interesting.

- means "before the patch", + means "after the patch".

Only the differences are shown. Methodology: run all tests before and after the patch.

Global
-4404 passes, 67 fails and 5 exceptions.
+4423 passes, 48 fails and 5 exceptions.

Node revisions
-51 passes, 1 fails and 0 exceptions.
+52 passes, 0 fails and 0 exceptions.

Poll create
-27 passes, 7 fails and 0 exceptions.
+34 passes, 0 fails and 0 exceptions.

Search engine ranking
-23 passes, 3 fails and 0 exceptions.
+25 passes, 1 fails and 0 exceptions.

Vocabulary functions
-4 passes, 9 fails and 0 exceptions.
+13 passes, 0 fails and 0 exceptions.

boombatower’s picture

Very interesting indeed.

When I run all the tests it still dies after 43%.

cwgordon7’s picture

This improvement can be expected since this solves the stale statics problem. :)

@boombatower: can you isolate a test where it's failing?

yched’s picture

Great work!
A few nitpicking code remarks:
- + $batch = $form_state['values']['use_batch'];
Variables named $batch usually contain an actual batch definition. I think the name $batch_mode is used in other places in core for this kind of boolean flag. (The same goes for the last argument of simpletest_run_tests().)

-

+      $operations = array();
+      $operations[] = array('_simpletest_batch_operation', array(serialize($tests), serialize($reporter)));

could probably be merged into one direct array declaration (if not directly into the $batch declaration?).

- function _simpletest_batch_operation()
The $tests and $reporter params are only used as initial values; subsequent iterations take the values stored in $context['sandbox'].
It might be worth making this explicit in the param names ($tests_init, $reporter_init)?

+  if (($size = $tests->getSize()) != 0) {

$tests->getSize() is used a few lines above in the progress message, so the variable could probably be defined earlier (getSize() is a simple accessor, so there is no performance concern here, only code consistency).
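
Putting those remarks together, the relevant pieces might look roughly like this (an untested illustration based only on the fragments quoted above):

  // A boolean flag, not a batch definition.
  $batch_mode = $form_state['values']['use_batch'];

  // Operations declared directly inside the $batch definition.
  $batch = array(
    'operations' => array(
      array('_simpletest_batch_operation', array(serialize($tests), serialize($reporter))),
    ),
    'finished' => '_simpletest_batch_finished',
  );

  // Parameter names that make clear these are only initial values; later
  // iterations read from $context['sandbox'].
  function _simpletest_batch_operation($tests_init, $reporter_init, &$context) {
    // ...
  }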

cwgordon7’s picture

Status: Needs review » Needs work

Agreed with yched #36, so cnw.

catch’s picture

Status: Needs work » Needs review

Unpatched: 4485 passes, 55 fails and 5 exceptions.
Patched: 4503 passes, 37 fails and 5 exceptions.

yay!

Combined with 5 or so RTBC patches which make tests pass one way or another:
4531 passes, 11 fails and 4 exceptions. (!)
/offtopic

Works great, progress reports look good, strings also good. I'm not sure why there's a checkbox to suppress the batch output though, seems a bit redundant. Also, minor point, but do we need to use batch API when there's only one test case to run? I seem to remember elsewhere in core it kicking in only over a certain number of operations.

catch’s picture

Status: Needs review » Needs work

cwgordon7’s picture

The checkbox is there so the SimpleTest internal browser can use it... perhaps we should hide it with JavaScript?

cwgordon7’s picture

Also when testing this patch please note the critical batch API bug: #267333: Batch API values round down., which should be applied in order to avoid very strange and disconcerting errors.

catch’s picture

Couldn't we check for the user agent directly rather than use JavaScript? Is this there only so SimpleTest can test itself?

catch’s picture

Additionally, this causes one failure on the very last assertion in tracker.test.

boombatower’s picture

I think we narrowed it down to the page test in the node group.

I've tried re-installing Drupal HEAD and still get it.

catch’s picture

Status: Needs work » Needs review

One more thing: why does this patch touch test_case.php? AFAIK we don't fork SimpleTest, do we?

catch’s picture

Status: Needs review » Needs work

sorry, mistaken status change.

chx’s picture

catch, we already forked simpletest beyond recognition and the end of the road is getting rid of it completely.

chx’s picture

Status: Needs work » Needs review
FileSize
9.49 KB

Addressed concerns in #36 and #42. #42 was quite forcefully addressed as I have removed the checkbox altogether :)

chx’s picture

I unpublished all attempts not to hack simpletest because they broke stuff badly. We need to patch test_suite, sorry.

Edit: my attempts.

chx’s picture

Added more inline documentation for the new methods.

webchick’s picture

Priority: Normal » Critical
Status: Needs review » Reviewed & tested by the community
FileSize
35.68 KB

Wow, this represents an important usability improvement for SimpleTest module. Upgrading status accordingly.

Now, instead of waiting 30 minutes for your browser to quit spinning its wheels, you are entertained by a lovely status bar, with a notice of which test is being processed, a countdown of the number of items remaining, and the total completion percentage... while you wait 30 minutes for your browser to quit spinning its wheels. ;D See the attached screenshot for a demonstration, those who haven't reviewed it yet.

Of course, the icing on the cake would be updating the table of results after each batch, ala http://jquery.com/test/. chx has informed me that this patch is a necessary stepping stone to that type of functionality.

Running this on all tests I received fails/exceptions on the following:

Aggregator: 149 passes, 3 fails and 0 exceptions. #249154: Aggregator tests fail when existing content is present. - known issue
Blog API: 35 passes, 10 fails and 5 exceptions. #259988: BlogAPI test has an exception blocked by #260778: Add SimpleTest user agent information to drupal_http_request - known issue
Contact: 190 passes, 4 fails and 0 exceptions. #253506: contact.test broken (was upload.test) - known issue
Filter: 163 passes, 12 fails and 0 exceptions. #161217: URL filter breaks generated href tags - known issue
Node: 245 passes, 0 fails and 8 exceptions. #260497: Node tests have exceptions - known issue
Path: 76 passes, 4 fails and 0 exceptions. #260490: Path tests fail blocked by #237336: Url Aliases no longer work - known issue
Statistics: 32 passes, 0 fails and 4 exceptions. #260495: Statistics module tests yield exceptions blocked by #255613: SimpleTest: Update clickLink() to use drupalGet() and clean-up code - known issue
System: 151 passes, 1 fails and 0 exceptions. #267812: System test has failures - known issue
User: 164 passes, 5 fails and 1 exceptions. #267813: User tests have failures - known issue

I was also concerned that the amount of code moved around in simpletest/test_case.php warranted some documentation, especially given that it wasn't our code to begin with. chx covered that already as part of #53.

So, from my end, this looks good to go. RTBC.

chx’s picture

I added passes, fails and exceptions to the batch API message. I even diffed my patches (yikes) to ensure this is the only change to #52. It does not change any behaviour at all, so I am leaving it at RTBC. Adding the whole report is definitely the next patch, but I could not resist sneaking this one in here :)

catch’s picture

Even better, this is lovely.

Dries’s picture

Status: Reviewed & tested by the community » Needs work

Mmm, it doesn't work for me. After a while, the tests block: the bar stops making progress. Also, as you can see in the screenshot, the test results have been reset; they stopped accumulating.

Looking at the code, I noticed that you removed a fair amount of Reporter stuff -- did you make sure there is no other left-over code?

catch’s picture

Dries, did you only get failures on all tests, or running individual/small groups of tests as well?

alpritt’s picture

Tested with one, multiple and finally all tests. When running all tests it failed by hitting my max_allowed_packet setting of 1M. Upping it to 2MB solved the problem. The error is below.

Got the same output as webchick, along with 0 passes, 0 fails and 3 exceptions for XML-RPC.

Here's that error.

An error has occurred.
Please continue to the error page
An error occurred. /head/batch?id=7&op=do {"status":true,"percentage":50,"message":"Processing tests.\x3cbr\/\x3eProcessed test \x3cem\x3ePoll vote\x3c\/em\x3e (remaining: 29 of 58). 3006 passes, 33 fails, 13 exceptions."}
Warning: Got a packet bigger than 'max_allowed_packet' bytes query: INSERT INTO watchdog (uid, type, message, variables, severity, link, location, referer, hostname, timestamp) VALUES (1, 'php', '%message in %file on line %line.', 'a:4:{s:6:\"%error\";s:12:\"user warning\";s:8:\"%message\";s:1571789:\"Got a packet bigger than 'max_allowed_packet' bytes\nquery: UPDATE batch SET batch = 'a:10:{s:4:\\"sets\\";a:1:{i:0;a:11:{s:7:\\"sandbox\\";a:4:{s:8:\\"progress\\";i:29;s:3:\\"max\\";i:58;s:8:\\"reporter\\";s:841103:\\"O:14:\\"DrupalReporter\\":21:{s:13:\\"_output_error\\";s:0:\\"\\";s:14:\\"_character_set\\";s:10:\\"ISO-8859-1\\";s:12:\\"_fails_stack\\";a:2:{i:0;i:0;i:1;i:0;}s:17:\\"_exceptions_stack\ in /Users/alanpritt/Sites/sites/mine/drupal.head/www/includes/database.mysqli.inc on line 130

yched’s picture

Final minor nitpick:
Since the batch multipass op uses $tests->getSize() to decide whether it needs another iteration or not, $context['sandbox']['progress'] is actually not needed.

About the max_allowed_packet error:
After each batch request, the whole $batch array (including the context for the currently running op) gets serialized and stored in the {batch} table, and is retrieved/unserialized at the beginning of the next request.
The problem is that on each iteration, the current state of $reporter gets *appended* to $context['results'], which therefore keeps growing rapidly, ultimately hitting the max_allowed_packet limit.
In short,

+    $context['results'][] = $context['sandbox']['reporter'];

should be

+    $context['results'] = $context['sandbox']['reporter'];

(unlike most other batch API use cases, where we store a piece of 'result' only relevant to the current iteration).

This still leaves us with two copies of $reporter fetched / stored on each iteration: one in $context['sandbox']['reporter'], one in $context['results'] (this is an edge case, because $reporter is both a temp object for the multipass computation and the results themselves).
One copy should be enough, though. Since, in the current design of the batch API, the 'finished' callback (here _simpletest_batch_finished(), whose role is to make the results available out of the batch processing cycle) only receives the 'results' and not the 'sandbox' (rightly so, but I won't delve into this), I think we can ditch the copy in 'sandbox' and use only 'results' to store $reporter.

Not being currently somewhere I can roll a patch or actually test it :-/, attached is a text file containing the fixed (untested...) versions of _simpletest_batch_operation() / _simpletest_batch_finished(). Theoretically, it should also make running all tests less time consuming...
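
To make the 'single copy' idea concrete, the operation could keep the reporter only in $context['results'], roughly like this (a sketch of the idea above, not the attached file):

  function _simpletest_batch_operation($tests_init, $reporter_init, &$context) {
    // One copy only: the reporter is both the multipass working state and the
    // 'results' that _simpletest_batch_finished() will receive.
    if (!isset($context['results']['reporter'])) {
      $context['results']['reporter'] = $reporter_init;
    }
    $reporter = unserialize($context['results']['reporter']);
    // ... run the next test case against $reporter ...
    $context['results']['reporter'] = serialize($reporter);
  }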

yched’s picture

Slightly modified version of the above _simpletest_batch_operation() / _simpletest_batch_finished() file:
- added some code comments
- removed the now unneeded if ($size) { test (the math still holds if $size is 0)

chx’s picture

Status: Needs work » Needs review
FileSize
9.77 KB

I rolled in yched's changes.

yched’s picture

D'oh - fixed stupid code order error (I was trying to read $tests->getSize() before $tests got initialized).

(same patch as #62, hand-edited to swap two lines in _simpletest_batch_operation()...)

alpritt’s picture

I'm getting the same packet size error on patch #63.

catch’s picture

Still works fine for me. max_allowed_packet on my desktop is set to 16M though.

yched’s picture

@alpritt: hmm. The error you mentioned in #59 happened after "Processed test Poll vote (remaining: 29 of 58)." Did the new patch at least let you run the tests further before hitting the error?

alpritt’s picture

$context['results'] still grows every time a test completes, which I guess is eventually still pushing it over the 1MB limit. But doesn't that have to be the case? Otherwise how would we be able to report the full results at the end? It doesn't appear to be exponential now, just additive.

Since tests run in smaller batches successfully, perhaps it is fine to say that max_allowed_packet needs to be set a little larger. It runs successfully on 2MB which is hardly huge.

Unless we only store the failures and exceptions and not the passes. Do we always need to print out all the passes on the results page? I personally don't find them helpful; in fact they get in the way of reading the errors. Could this be a place where we slim things down?

webchick’s picture

maybe we need to solve #267894: webchick's dream as part of this patch after all. ;)

If we printed the output to the screen iteratively, we'd only be increasing PHP's ram usage, not the results going back/forth to MySQL.

I know that the output is a form now, but I don't think it has to be... the only form-like things here are fieldsets and those could be put in as straight HTML.

I might take a stab at this this afternoon.

alpritt’s picture

@yched: Actually it stopped one earlier – at 30 remaining. I've checked this multiple times and it is always 29 remaining with patch #55 and 30 with patch #63.

Dries’s picture

I have the exact same problem as alpritt. The tests lock up after "processed test PHP filter functionality" (47% completion).

I know we identified the source of the problem, and that we're exploring solutions. That said, isn't this also a problem with the batch API's error handling? The fact that the tests seemingly run forever is something that we could handle better (regardless of the fix)?

yched’s picture

@Dries: If I'm not mistaken, alpritt did not mention the batch looping forever, but breaking with the relevant error message after hitting "packet bigger than 'max_allowed_packet'". This I can reproduce if I set my max_allowed_packet value to 1MB. See below for a possible fix.

I *think* at some point during my tests I also saw the '# of passes / fails / exceptions go down instead of accumulating' behavior you mention, but processing always eventually reached 100%. I did not try to confirm or investigate yet (but if anything, it might be more related to the simpletest end of things, which I'm less familiar with). I'd be interested to have access to a setup where processing actually hangs without any error notice, as you described.

@alpritt: it's strange that, while the latest patch stores *much less*, it still breaks at the same point or earlier.
The attached patch fixes this by compressing the serialized $reporter using gzdeflate() / gzinflate() before it gets stored in the {batch} table. It is not something the batch API does by default for the whole $batch array - not sure all batch API use cases are worth the overhead, but I guess this could be discussed. Meanwhile, the patch uses bin2hex() / pack() to store the binary data in the LONGTEXT {batch} table column.
From my tests, the total size of the stored $batch data with a 'compressed' $reporter doesn't exceed 200 KB when running all current tests, which leaves us some headroom...

Another fix (probably better / requiring more refactoring in simpletest.module and classes) would be to somehow only store the 'raw results' of the tests while the batch is running, and generate the actual HTML at the very last moment, once all tests were run. Carrying this fairly large amount of html / FAPI arrays during the whole batch processing seems rather sub-optimal. I'm currently not familiar enough with simpletest entities to decide whether this is doable, though...
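
A minimal sketch of the compression approach described above, assuming the zlib extension is available (the attached patch may differ in details):

  // When storing: compress the serialized reporter and hex-encode it so the
  // binary result survives the LONGTEXT {batch} column.
  $context['sandbox']['reporter'] = bin2hex(gzdeflate(serialize($reporter)));

  // When retrieving on the next request: reverse the encoding.
  $reporter = unserialize(gzinflate(pack('H*', $context['sandbox']['reporter'])));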

Dries’s picture

Just like webchick has a dream of being able to stream the test results, I have a dream of being able to distribute the test tasks across multiple cores and, ideally, multiple machines. This will become a requirement for me at some point, especially if the number of tests doubles, we get more rigorous about security testing, and I want to run all tests before and after each CVS commit. Or, what if you have a site with 30 contributed modules and you want to run all their tests as well?

I want to make sure we're not painting ourselves into a corner with using a session-based storage mechanism. Are we just postponing the pain?

I'm wondering if we can't come up with an alternative storage mechanism that is known to be robust.

chx’s picture

Assigned: cwgordon7 » chx
Status: Needs review » Needs work

I am rewriting simpletest not to use the simpletest library at all but our own code, and to use the Batch API. The results are saved into a database table. This is based on dmitri's work in http://drupal.org/node/252517 but uses the DB. This opens up a lot of possibilities we have not had before.

The current state is that tests run, save their results in the DB, and there is a very, very raw report too -- the latter is like a proof of concept, not a working reporter. https://4si.devguard.com/svn/public/simpletest/a_new_hope is where you can find this code. Expect a patch within a day or two.

There is less Drupal-specific code to maintain after this change because drupal_reporter.php and drupal_test_suite.php are completely gone. We maintain the same interface so there is no work wasted :) Not to mention the horde of simpletest libraries which are going. The simpletest library was our booster rocket -- but now we are flying and they are not needed any more.

Adding whatever reporting should be trivial, like #267894: webchick's dream .

Now, let's fix the tests that are broken, webchick has listed them all.

chx’s picture

We are definitely getting there. The reporter now works. I would like to smooth out a few quirks before posting a patch but those who are interested are encouraged to check the code in svn. So far, we have about 7.7% less code in the Drupal specific parts (66876 vs 61756 bytes) and 100% less code in the simpletest part :) To fulfill Dries' dream, I will put the tests to be run into a database table... might be a follow up, though.

chx’s picture

Status: Needs work » Needs review
FileSize
32.29 KB

First patch. There is not enough reporting during the batch run, but otherwise it's working nicely. Please do not forget to credit Dmitri too; he helped a lot. Still less Drupal-specific code and still no breakage, though I only tested with Node; I will test all.

@Dries: this patch is about batch API. Writing an alternative runner that can run multiple processes is fairly trivial because these two lines run a test class:

      $test = new $test_class($test_id);
      $test->run();

where $test_id is an integer which identifies the test runs that we want to be reported together.
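
So a bare-bones external runner (say, a CLI worker handling a slice of the classes) could in principle be as small as the loop below; variable names are illustrative:

  // $test_classes: the classes assigned to this worker.
  // $test_id: an integer grouping the results of one run for later reporting.
  foreach ($test_classes as $test_class) {
    $test = new $test_class($test_id);
    $test->run();
  }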

cwgordon7’s picture

db_query("INSERT INTO {simpletest} (test_id, test_class, status, message, message_group, caller, line, file) VALUES (%d, '%s', '%s', '%s', '%s', '%s', '%s', '%s')", $this->test_id, get_class($this), $status, $message, $group, $function['function'], $function['line'], $function['file']);

... where is this {simpletest} table supposed to be coming from?

chx’s picture

Many improvements: messages got an id so that assertions stay in order. Database schema added.
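
For reference, a {simpletest} table matching the INSERT quoted a couple of comments above, plus the new message id, would be declared along these lines in simpletest.install (the column types here are guesses, not necessarily the committed schema):

  function simpletest_schema() {
    $schema['simpletest'] = array(
      'fields' => array(
        'message_id'    => array('type' => 'serial', 'not null' => TRUE),
        'test_id'       => array('type' => 'int', 'not null' => TRUE, 'default' => 0),
        'test_class'    => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE, 'default' => ''),
        'status'        => array('type' => 'varchar', 'length' => 9, 'not null' => TRUE, 'default' => ''),
        'message'       => array('type' => 'text', 'not null' => TRUE),
        'message_group' => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE, 'default' => ''),
        'caller'        => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE, 'default' => ''),
        'line'          => array('type' => 'int', 'not null' => TRUE, 'default' => 0),
        'file'          => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE, 'default' => ''),
      ),
      'primary key' => array('message_id'),
    );
    return $schema;
  }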

webchick’s picture

Every single function needs to be (thoroughly) PHPDoced. This is Drupal now, so there's no excuse.

chx’s picture

Doxygen will come... but here is something...

dmitrig01’s picture

The latest version.

alpritt’s picture

Status: Needs review » Needs work

Fresh install. Individual tests work (not tried them all though). Multiple tests at once work. Running all tests produces the following error:

[edit: this is patch #80]

An error occurred. /head/batch?id=2&op=do
Fatal error: _simpletest_batch_operation() [function.-simpletest-batch-operation]: The script tried to execute a method or access a property of an incomplete object. Please ensure that the class definition "unknown" of the object you are trying to operate on was loaded _before_ unserialize() gets called or provide a __autoload() function to load the class definition in /Users/alanpritt/Sites/sites/mine/drupal.head/www/modules/simpletest/simpletest.module on line 367

dmitrig01’s picture

Here's the next patch. It's incomplete, and I'm asking cwgordon7 to fix it.

yched’s picture

Unable to test right now. A few remarks:

- Unless I'm mistaken, $test_id stays the same throughout the multiple batch iterations, so the "$test_id_init / $test_id = $context['results']" dance doesn't seem needed anymore; we can directly take the incoming value from _simpletest_batch_operation()'s params.
(It looks like storing it in $context['results'] is still needed, though, so that it can be retrieved in _simpletest_batch_finished().)

- Instead of fetching messages for all the tests that were run so far on each iteration, the current sum of passes / fails / exceptions could be stored in $context['sandbox'] and simply updated by adding the values relevant to the current test (see the sketch after this list).

- I'd find it more instructive if the # of passes / fails / exceptions displayed in front of each test name was relevant to *that* test instead of cumulative, but that can probably be argued. We could have the overall #s displayed on the second line of results (below the "Processed test %test (remaining: @count of @max)" message?)
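
As referenced in the second remark, a sketch of keeping running totals in the sandbox (assuming the test object exposes its per-run counts as an array like the one below; this is illustrative, not patch code):

  // Initialize the totals once, then add only the counts of the test that
  // just ran instead of re-querying {simpletest} for everything so far.
  if (!isset($context['sandbox']['totals'])) {
    $context['sandbox']['totals'] = array('#pass' => 0, '#fail' => 0, '#exception' => 0);
  }
  foreach ($test->_results as $status => $count) {
    $context['sandbox']['totals'][$status] += $count;
  }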

chx’s picture

FileSize
62.76 KB

Here we are. simpletest.test needs to be reworked still, but that's now a trivial matter because there are results (YAY!) to be parsed. This was hard. The trick is variable_set('simpletest_test_id', $this->test_id); when setting up, and then $test_id = isset($_SESSION['test_id']) ? $_SESSION['test_id'] : variable_get('simpletest_test_id', 0); when reporting. I worked more than a week on getting to this point, so I would be glad if the followups would include a patch, either fixing something I am unaware of or, best of all, a simpletest.test that works. If you have other things to say, then keep it until the patch is CNR.

chx’s picture

Status: Needs work » Needs review
FileSize
68.2 KB

New patch! simpletest.test now passes. cwgordon7 added awesome new CSS to the Batch API. dmitri and I added a TON of doxygen and comments. Compared to #84, this is a much simplified, cleaned-up version.

webchick’s picture

This patch fixes a couple bugs I found while testing. The fixes come from chx. Please do not credit me in the commit message.

yched’s picture

+$this->banana = $banana; ? :-)

catch’s picture

It's looking good.

I get the following failures from running all tests; a couple seem to be introduced by this patch, most aren't. Output from the tests looks lovely.

Fails both patched and unpatched (this is a good thing):
BlogAPI (no change, known issue)
Filter (no change, known issue)
Forum (no change)
System (no change, known issue)
Path (no change, known issue)
User (no change)

edit: New failure, but from a broken test:
PHP

Fails due to the patch:
Contact
9 fails vs. 4 unpatched - the four existing errors are still fixed by http://drupal.org/node/253506. The 5 additional errors are from line 141 of contact.test

So just those five errors in contact.module, leaving this at needs review for some more eyes.

webchick’s picture

+$this->banana = $banana; ? :-)

CRAP. :P I was introducing an exception to see how the system handled it. ;) Someone remove in the next patch, please. :D

boombatower’s picture

I removed the banana line and ran all the tests.

The status message that says what test case was just completed doesn't update all the time, just twice for me.

This is looking great otherwise, but I think we should first get the outstanding patches for several of the tests that are already RTBC committed, so we can confirm that this patch isn't causing any additional issues.

boombatower’s picture

After applying all the patches listed above that would apply I get the following results (some of the patches need to be committed and this patch re-rolled).
Fail:

BlogAPITestCase (1 exception)
  Undefined property: stdClass::$is_date	Notice	xmlrpc.inc	70
ContactTestCase
ForumTestCase (1 exception)
  Duplicate entry '1' for key 1 query: INSERT INTO simpletest777042vocabulary VALUES (1, 'Tags', 'Tags are used to group your articles into different categories.', 'Enter a comma separated list of words.', 0, 0, 0, 0,	User warning	database.mysqli.inc	130	
NodeRevisionsTestCase (8 exceptions)
  Undefined property: stdClass::$vid	Notice	node.module	918   (8 times)
PathTestCase
PHPTestCase

The BlogAPI exception relates to #245961: XMLRPC value_calculate_type throws exception.

The NodeRevisions exception is in node_save()

elseif (!empty($node->revision)) {
  $node->old_vid = $node->vid;
}

The processed status only ends up displaying the following list of test cases:

AddFeedTestCase
UpdateFeedTestCase
UploadTestCase
UserRegistrationTestCase
UserValidationTestCase

boombatower’s picture

Status: Needs review » Needs work

Based on the processed status not updating correctly I would say this needs work; otherwise we need to work on getting the RTBC patches into core and re-rolling this patch.

catch’s picture

One more patch to apply for the forum.test breakage (sorry everyone): http://drupal.org/node/261869

Status updates worked perfectly for me every time I ran tests (FF3 on XP), fwiw.

boombatower’s picture

Another fix is good...tests are doing their job...found numerous issues this way. :)

I get the pass/fail stats displayed in bullets; I just don't get the "Processed .... (5 of 4,000,000 remaining)" line updated correctly on FF3 / SUSE 11.
The "y of x remaining" part works, but the last-run test case doesn't update.

catch’s picture

Status: Needs work » Needs review

Just to note I get both 'remaining x of y' and the percentage displaying fine on my system. Will try on ff3/ubuntu later to see if I can reproduce there.

Did you get 11 errors or 4 on contact? Since we might need to track down which browsers/OSes this is a problem on, I'm going to mark back to needs review for some more testing.

yched’s picture

Status: Needs review » Needs work

About the "Processed test fooTest..." message :
there's actually a bug, the name of the test is always the last test in the list below (which is sorted alphabetically, not in order of actual execution). That's because '%test' => $class should be '%test' => $test_class in the line $message = t('Processed test @num - %test)....

Patch coming, along with a few changes related to my comments in #83

yched’s picture

Status: Needs work » Needs review
FileSize
68.28 KB

Updated patch:
- addresses the first 2 points in my comment #83 above
- fixes the progress message errors reported by boombatower, thus setting back to CNR
- slightly changes how the new 'css' batch property is processed : the batch API lets several form submit handlers add their own batch 'sets', executed sequentially (even though actual use cases should be pretty rare). Since the progress page itself gets loaded only once in JS mode, we should include the css for all of them, not just the first one.
- updated PHPdoc for batch_set() with the 'css' property

Open questions:
- The {simpletest} and {simpletest_test_id} tables are currently never emptied, and simply grow.
Do we want to truncate them with the 'clean environment' button ?
And / or on cron (entries older than x days, using an additional 'timestamp' column) ?

- We select on {simpletest} using a GROUP BY status (was test_class, status in chx's patch #83). Doesn't this deserve an index on the 'status' column ?

- Instead of querying the {simpletest} table to get the number of passes / fails in the test that has just run, wouldn't we be better off with $test->_fails, $test->_passes counters ?

cwgordon7’s picture

Status: Needs review » Needs work

Note that there is a change in behavior here: we now only setUp() and tearDown() once per test class, whereas before it was done for each test* method. As a result, tests such as contact and php may now fail - the solution is to move the test* methods that require their own databases into their own classes. This also clarifies the design pattern a bit - before, it was a bit unclear whether you should have many test classes or one test class and many test methods. Contact.test and php.test fixes to come.
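
Schematically, that refactoring looks like this (made-up class names, and the base class shown is only illustrative):

  // Before: two test* methods share one setUp(), and with this patch they now
  // also share one installed test environment.
  class ContactTestCase extends DrupalWebTestCase {
    function testSiteWideContact() { /* ... */ }
    function testPersonalContact() { /* ... */ }
  }

  // After: each method that needs a pristine environment moves into its own
  // class, so it gets its own setUp()/tearDown() cycle.
  class ContactSiteWideTestCase extends DrupalWebTestCase {
    function testSiteWideContact() { /* ... */ }
  }

  class ContactPersonalTestCase extends DrupalWebTestCase {
    function testPersonalContact() { /* ... */ }
  }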

cwgordon7’s picture

Status: Needs work » Needs review

cross-posted

yched’s picture

I gave a try to my last remark in #97.
Attached patch stores test results in $test->_results = array('#pass' => 112, '#fail' => 10, '#exception' => 0), avoiding the query on {simpletest} in _simpletest_batch_operation() (and the need for an index on the 'status' column).

There might be good reasons chx went that way, though, and he might object...
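
A sketch of how such counters might be kept up to date wherever an assertion result is recorded (illustrative only; the attached patch may do this differently):

  // Alongside writing the assertion row, bump the in-memory counter for this
  // test object, keyed by the same status used in the {simpletest} table.
  if (!isset($this->_results)) {
    $this->_results = array('#pass' => 0, '#fail' => 0, '#exception' => 0);
  }
  $this->_results['#' . $status]++;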

cwgordon7’s picture

Attached patch incorporates yched's #97 and separates out contact and php tests.

cwgordon7’s picture

Attached patch incorporates yched's #97 and separates out contact and php tests.

boombatower’s picture

The only thing, which I have said before, that is annoying about classes instead of methods is the redundant getInfo() data. My preference is still methods, but the change is fine.

Since that constitutes a somewhat major API change I think it is very important for the other patches to be committed so we can confirm that this doesn't cause any extra issues.

catch’s picture

yched’s picture

#100 and #101 crossposted - attached patch merges the two.

cwgordon7’s picture

Sorry about the cross-posting.

@boombatower - how could this cause extra issues? We've identified the tests with more fails, and even if one has escaped our notice, it is a piece of cake to move the test function into its own class. So, not seeing the issue there, I think this is still at cnr, unblocked.

chx’s picture

I am fine with fewer queries. I do not think those issues are blockers either.

boombatower’s picture

Even if you don't consider them blockers, they are simple patches that have been RTBC for a while and should be committed, and this patch re-rolled. Obviously you can do it either way, but a large part of the tests is to find bugs, so having them passing before making changes just fits the philosophy.

Either way this is great work, thanks to all involved both testing/review and patching.

dmitrig01’s picture

I fixed the CLI script

cwgordon7’s picture

Minor update to #108, in which the script was critically broken.

catch’s picture

Ran all tests via the web interface and got this:

Blog API functionality: 35 passes, 10 fails, 1 exception - known issue, RTBC patch
Site-wide contact form: 120 passes, 4 fails, 0 exceptions - known issue, RTBC patch
Core filters: 48 passes, 12 fails, 0 exceptions - known core issues, RTBC patch
Forum functionality: 386 passes, 0 fails, 1 exception - known core issue, patch
Path alias functionality: 66 passes, 4 fails, 0 exceptions - known core issue, RTBC patch

Since this is essential for running all tests at once without failures caused by static variables, I'd strongly support committing it regardless of the other pending patches - since otherwise it's necessary to run each test individually if you want to absolutely ensure that a patch doesn't break anything - which is resulting in new breakages as time goes on.

I've only taken a quick look through the patch, but there's lots of documentation and it generally looks nice and tidy. Looks ready to me but would be nice to get one more review.

yched’s picture

from #97 :
The {simpletest} and (less importantly) {simpletest_test_id} tables are currently never emptied, and simply grow.
Do we want to truncate them with the 'clean environment' button ?
And / or on cron (entries older than x days, using an additional 'timestamp' column) ?

I'd vote for both, since the 'clean environment' button is advertised as 'when something went wrong'.
(Obviously, this can go in a followup patch if committing this one fast is important to progress on the tests front)

catch’s picture

@yched, definitely truncate on 'clean environment', cron sounds good too. I think this could be a followup patch though as you say.

yched’s picture

Heck, that's only a few lines; we might as well do it now...
Updated patch cleans the {simpletest} table on cron for entries older than 24 hours, and truncates the whole table when hitting 'clean environment'.
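
The cron half of that is only a few lines, roughly (illustrative, and assuming the 'timestamp' column suggested earlier exists):

  function simpletest_cron() {
    // Discard stored test results older than 24 hours.
    db_query("DELETE FROM {simpletest} WHERE timestamp < %d", time() - 86400);
  }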

R.Muilwijk’s picture

This patch is a real help for people who are active with testing. Being able to see the status of your tests is just great!
For a follow-up it might be a nice addition to add grouping by module in the status, maybe like this:

Aggregator: Add feed functionality: 24 passes, 0 fails, 0 exceptions

instead of

Add feed functionality: 24 passes, 0 fails, 0 exceptions

For now this patch is such an addition to Simpletest that it would be very nice to get it committed soon. Follow ups can be done later.

I ran the tests and didn't find anything strange. Installation is also still working fine.

chx’s picture

I recommend going with #110. Cleanup needs more thought. Edit: it's also feature creep on an already *huge* patch. More thought, like: after each run, clean up automatically only if it was successful; otherwise clean up on cron, but be VERY careful not to clean up a running test! Given how long those can run, this needs care -- followup patch.

catch’s picture

Status: Needs review » Reviewed & tested by the community

Went through this with R.Muilwijk on IRC and I also agree that #110 is good. We can add cleanup and small features like module names next to batch reporting in followups - but they aren't necessary to fix current test failures, which the rest of the patch is.

R.Muilwijk’s picture

Status: Reviewed & tested by the community » Needs review

My tests were run on patch #110. I just ran the tests for #114, which breaks everything. Clean-up should, like chx said, be tackled later, and we should go with patch #110. Btw, I ran all tests for #110 and #114.

R.Muilwijk’s picture

Status: Needs review » Reviewed & tested by the community

Hmmz, guess I had the form on screen when catch was commenting too... back to reviewed and tested.

cwgordon7’s picture

Status: Reviewed & tested by the community » Needs work

I would like to include #114 and #115 in this patch as they are both trivial and add to the awesomeness.

chx’s picture

Assigned: chx » Unassigned

I am done here. People add and add stuff until it no longer works! Why? Tell me why!

cwgordon7’s picture

Status: Needs work » Reviewed & tested by the community

chx has convinced me: #114 and #115 can go in a followup, and this remains RTBC.

chx’s picture

Status: Reviewed & tested by the community » Needs review
FileSize
82.83 KB

Fixed assertPattern and fixed the UI too, as per Dries' request: removed the # after "Line" in the reporter table and changed the ongoing indicator to "Processed test @num of @max - %test".

chx’s picture

Assigned: Unassigned » chx
Status: Needs review » Reviewed & tested by the community

Crossposted.

webchick’s picture

Just want to confirm that I've run this patch through its paces, and apart from the bug that was fixed in #123, I found no issues. Also tested the upgrade path.

Running all tests, the failures were (all of which have a pre-existing issue in the queue):
Blog API functionality: 35 passes, 10 fails, 5 exceptions
Site-wide contact form: 120 passes, 4 fails, 0 exceptions
Core filters: 48 passes, 12 fails, 0 exceptions
Node revisions: 64 passes, 0 fails, 8 exceptions

Because:
1. This patch fixes some tests that didn't previously pass
2. It actually makes SimpleTest tolerable for running all tests at once
3. It makes testing FUN! (really!)

I'm a proponent of committing this patch first, and then getting the other issues mopped up afterwards.

I'll also note that this is a complete 180-degree reversal of my previous position: the patch is that good.

Dries’s picture

Assigned: chx » Unassigned
Status: Reviewed & tested by the community » Fixed

Committed to CVS HEAD. Thanks all. This was a heroic effort. :)

pwolanin’s picture

Status: Fixed » Needs work
FileSize
5.5 KB

The last commit to scripts/run-tests.sh (labeled a roll-back) broke it completely.

Getting it back to how it was in rev 1.2 makes it work partially (--list is broken), which is what the attached patch does.

pwolanin’s picture

Actually, all the tests are broken for me on the CLI, though it's working for dimitri, so that needs more investigation. Also, the #! line indicating the PHP path got changed to MAMP's.

dmitrig01’s picture

Here's the fixed version.

catch’s picture

Status: Needs work » Needs review

catch’s picture

Status: Needs review » Needs work

3 out of 5 hunks fail.

chx’s picture

Status: Needs work » Fixed

Followup issues are followups; please post them elsewhere.

pwolanin’s picture

follow-up issue for CLI script: http://drupal.org/node/274794

Anonymous’s picture

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for two weeks with no activity.