Closed (duplicate)
Project:
Project Issue File Review
Version:
6.x-2.x-dev
Component:
Code
Priority:
Normal
Category:
Task
Assigned:
Unassigned
Reporter:
Created:
18 Nov 2009 at 21:38 UTC
Updated:
5 Feb 2010 at 04:41 UTC
Let's merge discussion of multiple-database testing here.
Since there is talk of bringing PostgreSQL testing online soon, we need one of the following: enough clients to keep up with MySQL, or to run PostgreSQL only on Drupal HEAD testing (triggered on commit).
We can set it so that PostgreSQL is not required to pass, but a test will not be marked as having a result until PostgreSQL finishes.
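The gating rule described above (an optional environment cannot fail a patch, but no final result is reported until every environment finishes) can be sketched as follows. This is an illustrative Python model only, not PIFR's actual code; all names are invented:

```python
# Aggregate per-environment results into one patch status.
# An optional environment cannot fail the patch, but the patch has no
# final result until every environment (optional included) reports.

def overall(results: dict, required: set) -> str:
    """results maps env name -> 'pass' | 'fail' | None (still running)."""
    if any(v is None for v in results.values()):
        return "pending"          # wait for every environment to finish
    if any(results[env] == "fail" for env in required):
        return "fail"             # only required environments can fail it
    return "pass"

print(overall({"mysql": "pass", "pgsql": None}, {"mysql"}))    # pending
print(overall({"mysql": "pass", "pgsql": "fail"}, {"mysql"}))  # pass
```

Under this model a PostgreSQL failure would still be visible on the per-environment tabs, it just would not flip the issue to "needs work".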
Comments
Comment #1
deekayen commented:
If a patch passes MySQL but not PostgreSQL, what would the message look like on the patch comment? I get that it wouldn't mark the issue "needs work", but I can't really picture past that unless it has something like a result line for each environment.
Comment #2
boombatower commented:
On the details page (qa.d.o) there are tabs for each environment.
Currently the summary will just say "passed on all (required) environments". It doesn't say "required", but that is the thinking.
There is talk about revamping the summary message that is sent back to d.o: #632212: Report # of passes even on success.
Comment #3
josh waihi commented:
It's important for people who are writing patches to know if it's PostgreSQL they are breaking. I'm happy for PostgreSQL to just test HEAD for now. PostgreSQL will be a lot slower than MySQL, so it will need more slaves than MySQL.
Comment #4
deekayen commented:
Can we first establish that pgsql clients work, that they get a clean test of core, and how long they take to finish a test?
Comment #5
boombatower commented:
I need to look into how to cleanly do that. Currently, if we add a PostgreSQL environment it will want to run; let me take a look.
Comment #6
deekayen commented:
I never heard back from Kieran about wiping tdo, so it's still in its end-stage state from the PIFR 2 upgrade. Maybe it could be a staging server with D6 to test pgsql.
Comment #7
boombatower commented:
We could do that; I'll flip it out of maintenance mode.
Comment #8
boombatower commented:
Never mind... the server looks dead.
Comment #9
josh waihi commented:
I can give you a heads-up now: PostgreSQL tests take a really long time. There may be a memory leak somewhere, as for some reason PHP needs more memory too. We won't be able to run the tests clean until PostgreSQL actually passes them (it's not at the moment). I'm working on the show-stoppers at the moment.
Comment #10
deekayen commented:
How long is a really long time? An hour?
Comment #11
josh waihi commented:
Heh, that would be quick. I currently can't finish the test run due to memory woes, but it's taking 8 hours (ish). The fastest I've had it down to is 2.5 hours. I'm hoping a dedicated box will help make it go faster.
Comment #12
andypost commented:
I've been trying to run D7 under pgsql 8.4 (deb64), but still without success.
Comment #13
josh waihi commented:
For starters, you should be running 8.1; it's the stable version on Debian Lenny. #633678: Make sequence API work on non-MySQL databases needs to be fixed before you can even consider running PostgreSQL tests.
Comment #14
andypost commented:
@Josh: Trying the stable version of Debian Lenny with pgsql 8.3; the problem is mostly in code.
UPD: This is all because cache->set() uses db_merge, which seems broken for PostgreSQL.
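For context, a merge ("upsert") such as db_merge has to be emulated on databases with no native MERGE statement, typically as an UPDATE followed by an INSERT when no row matched. A minimal Python sketch of that fallback, using an in-memory dict in place of a real table (this is an illustrative model, not Drupal's actual MergeQuery implementation; all names here are invented):

```python
# Illustrative upsert emulation: UPDATE first, INSERT if no row matched.
# The table is mocked as a dict keyed by primary key; a real driver
# would issue both statements inside one transaction to avoid a race.

def merge(table: dict, key, fields: dict) -> str:
    """Update the row identified by `key`, or insert it if absent."""
    if key in table:               # corresponds to: UPDATE ... WHERE pk = ?
        table[key].update(fields)
        return "updated"
    table[key] = dict(fields)      # corresponds to: INSERT INTO ...
    return "inserted"

cache = {}
merge(cache, "page:front", {"data": "...", "expire": 0})  # first call inserts
merge(cache, "page:front", {"data": "new"})               # second call updates
```

If either half of that emulation misbehaves on a given backend, every caller of the merge API (such as the cache layer) breaks with it, which matches the symptom described above.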
Comment #15
Crell commented:
Subscribing. MergeQuery should be found either way, even if the PostgreSQL version doesn't work. It's in the same file as UpdateQuery and InsertQuery, which presumably do work.
The main slow part of the Postgres tests, when last I was asking about it, was that Postgres DDL operations are really really slow. That means that the 50+ fresh installs of Drupal that unit tests run are going to be much much slower on Postgres than on MySQL or SQLite. Is there any way we can reduce the number of fresh installs? chx mentioned to me once an alternate parent class for testing that could be used that doesn't do a full fresh install, but I forget what. :-)
Also, do we want to bring an SQLite testing server online?
Comment #16
Crell commented:
Although, to be honest, I'm debating whether the current breakout of files in the DB layer is even optimal. Originally I wanted to break off the query builders so that we had less code to load at any given time and shorter files, which helps DX. However, with cache_set() using a merge query and the menu system using a built select query (although I'm not convinced it needs to), we're loading all of those files anyway. I don't want to end up with zillion-line files, but should we look into force-loading more files than just database.inc?
Comment #17
deekayen commented:
I think it's desirable to bring SQLite online for testing, but not yet practical. Kieran is still tracking down enough servers to support testing contrib on just MySQL, and the last time I heard, SQLite didn't support PIFR's concurrency, but that was months ago.
Comment #18
andypost commented:
@Crell: I suppose the bug with the merge query is deeper than what I know of Drupal autoload. The class registry should still be in core, so why is it not working for the DB? Maybe the class registry uses db_merge too...
Comment #19
andypost commented:
Today I tested current HEAD with PostgreSQL 8.3 under deb32; it works fine!
Comment #20
boombatower commented:
A more up-to-date issue: #697220: Run tests on all supported database servers.