Let's merge discussion of multiple-database testing here.

Since there is talk of bringing PostgreSQL testing online soon, we need one of the following: enough clients to keep up with MySQL, or PostgreSQL running only on Drupal HEAD testing (triggered on commit).

We can set it so it does not require postgres to pass, but it will not mark a test as having a result until postgres finishes.

Comments

deekayen’s picture

If a patch passes MySQL, but not PostgreSQL, what would the message look like on the patch comment? I get that it wouldn't mark the issue "needs work", but can't really picture past that unless it has something like a result line for each environment.

boombatower’s picture

On the details page (qa.d.o) it has tabs for each environment.

Currently the summary just says it passed on all (required) environments. It doesn't actually say "required", but that is the thinking.

There is talk about revamping the summary message that is sent back to d.o: #632212: Report # of passes even on success.

josh waihi’s picture

It's important for people who are writing patches to know if it's PostgreSQL they are breaking. I'm happy for PostgreSQL to just test HEAD for now. PostgreSQL will be a lot slower than MySQL, so it will need more slaves than MySQL.

deekayen’s picture

Can we first establish that pgsql clients work, that they get a clean test of core, and how long they take to finish a test?

boombatower’s picture

I need to look into how to cleanly do that. Currently, if we add a PostgreSQL environment it will want to run; let me take a look.

deekayen’s picture

I never heard back from Kieran about wiping tdo, so it's still in its end-stage state from the PIFR 2 upgrade. Maybe it could be a staging server with D6 to test pgsql.

boombatower’s picture

We could do that; I'll flip it out of maintenance mode.

boombatower’s picture

Never mind... the server looks dead.

josh waihi’s picture

I can give you a heads-up now: PostgreSQL tests take a really long time. There may be a memory leak somewhere, as for some reason PHP needs more memory too. We won't be able to run the tests clean until PostgreSQL actually passes them (it doesn't at the moment). I'm working on the showstoppers now.

deekayen’s picture

How long is a really long time? An hour?

josh waihi’s picture

Heh, that would be quick. I currently can't finish a test run due to memory woes, but it takes 8 hours (ish). The fastest I've had it down to is 2.5 hours. I'm hoping a dedicated box will help make it go faster.

andypost’s picture

I've been trying to run D7 under pgsql 8.4 (deb64), but still without success.

josh waihi’s picture

For starters, you should be running 8.1; it's the stable version on Debian Lenny. #633678: Make sequence API work on non-MySQL databases needs to be fixed before you can even consider running PostgreSQL tests.

andypost’s picture

@Josh: I'm trying the stable version of Debian Lenny with pgsql 8.3; the problem is mostly in the code.

Debug Error: /var/www/d7/includes/database/database.inc line 728 - Class 'MergeQuery' not found
  public function merge($table, array $options = array()) {
    if (empty($this->mergeClass)) {
      $this->mergeClass = 'MergeQuery_' . $this->driver();
      if (!class_exists($this->mergeClass)) {
        $this->mergeClass = 'MergeQuery';
      }
    }
    $class = $this->mergeClass;
    return new $class($this, $table, $options); // It looks like database/query.inc is not being included.
  }

UPD: This is all because cache_set() uses db_merge(), which seems broken for PostgreSQL.
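For context, the fallback logic in the snippet above can be boiled down to a standalone sketch (the class and function names here are illustrative stand-ins, not Drupal's actual classes): the driver-suffixed class is preferred, and the generic class is the fallback — but that fallback only helps if the file defining the generic class was loaded at all.

```php
<?php
// Standalone sketch of the lazy driver-class lookup performed by
// DatabaseConnection::merge() above. Illustrative names, not Drupal code.

class MergeQueryBase {}
class MergeQueryBase_pgsql extends MergeQueryBase {}

function resolve_query_class($base, $driver) {
  $class = $base . '_' . $driver;
  if (!class_exists($class)) {
    // If the file defining $base was never included, instantiating this
    // fallback fails with "Class '...' not found" -- the error reported above.
    $class = $base;
  }
  return $class;
}

echo resolve_query_class('MergeQueryBase', 'pgsql'), "\n";  // MergeQueryBase_pgsql
echo resolve_query_class('MergeQueryBase', 'mysql'), "\n";  // MergeQueryBase
```

The pattern itself is sound; the failure mode is purely a file-loading one, which is why the error surfaces only on code paths (like a pgsql cache_set()) that reach the merge builder before the query-builder file is included.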

Crell’s picture

Subscribing. MergeQuery should be found either way, even if the Postgres version doesn't work. It's in the same file as UpdateQuery and InsertQuery, which presumably do work.

The main slow part of the Postgres tests, when last I was asking about it, was that Postgres DDL operations are really really slow. That means that the 50+ fresh installs of Drupal that unit tests run are going to be much much slower on Postgres than on MySQL or SQLite. Is there any way we can reduce the number of fresh installs? chx mentioned to me once an alternate parent class for testing that could be used that doesn't do a full fresh install, but I forget what. :-)
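One candidate for that alternate parent class is DrupalUnitTestCase, which exists in Drupal 7 HEAD alongside DrupalWebTestCase. This is a hedged sketch, assuming that API (it cannot run outside a Drupal checkout): DrupalWebTestCase::setUp() performs a full fresh install per test class, which is where the slow DDL cost is paid; DrupalUnitTestCase skips the install entirely, so its tests must avoid the database.

  // Hedged sketch, requires a Drupal 7 checkout; class name is hypothetical.
  class ExampleFastUnitTest extends DrupalUnitTestCase {
    function testCheckPlain() {
      // Pure-PHP assertion: no modules installed, no tables created, no DDL.
      $this->assertEqual(check_plain('<b>'), '&lt;b&gt;');
    }
  }

Only tests that never touch the database can be moved to such a base class, so this reduces, rather than eliminates, the number of fresh installs.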

Also, do we want to bring an SQLite testing server online?

Crell’s picture

Although, to be honest, I'm debating if the current breakout of files in the DB layer is even optimal. Originally I wanted to break off the query builders so that we had less code to load at any given time and shorter files, which helps DX. However, with cache_set() using merge query and the menu system using a built select query (although I'm not convinced it needs to), we're loading all of those files anyway. I don't want to end up with zillion-line files, but should we look into just force-loading more files than database.inc?

deekayen’s picture

I think it's desirable to bring SQLite online for testing, but not yet practical. Kieran is still tracking down servers enough to support testing contrib on just MySQL and last time I heard, SQLite didn't support PIFR's concurrency, but that was months ago.

andypost’s picture

@Crell: I suppose the bug with the merge query is deeper than Drupal autoloading as I understand it. The class registry should still be in core, so why isn't it working for the DB layer? Maybe the class registry uses db_merge() too...

andypost’s picture

Today I tested current HEAD with PostgreSQL 8.3 under deb32, and it works fine!

boombatower’s picture

Status: Active » Closed (duplicate)