I have often used Migrate to move content from a legacy DB to a new Drupal system, ensuring all content IDs are migrated. It's great!

I have a new project with lots of generic documents, and the client envisions adding ~5,000 documents daily, in a variety of source formats (email, documents, pamphlets, databases, CSV, XML), into a master document type in Drupal.

I cannot think of key(s) to create for this data for Migrate to work against -- and the client doesn't want to have to create fancy keys to track source(s), or to give every item a unique key within its source set. E.g., one day there could be 5,000 email entries to ingest; another day 5,000 poll entries; the next day 5,000 text transcripts from podcasts; the next day 5,000 more emails (unrelated to our first batch of 5,000 emails a few days earlier), and so on.
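One way around hand-made keys (sketched in Python here just to show the idea, not as Drupal/PHP code) is to derive a synthetic source ID by hashing each row's content together with a batch label. Every row gets a stable, unique key without anyone inventing one; the `row_key` function and the sample rows below are hypothetical illustrations, not anything from the Migrate API.

```python
import hashlib
import json

def row_key(batch, row):
    """Derive a stable synthetic source ID from a batch label plus the
    row's content. The same row always hashes to the same key, so
    re-running a batch stays idempotent with no hand-made keys.
    (Illustrative sketch only -- the names here are hypothetical.)"""
    payload = json.dumps(row, sort_keys=True)
    return hashlib.sha1((batch + "\n" + payload).encode("utf-8")).hexdigest()

# Two hypothetical days of email ingests:
a = row_key("emails-day1", {"subject": "Hello", "body": "First message"})
b = row_key("emails-day1", {"subject": "Hello", "body": "First message"})
c = row_key("emails-day4", {"subject": "Hello", "body": "First message"})

print(a == b)  # True: same batch + same content -> same key
print(a == c)  # False: a later, unrelated batch gets distinct keys
```

The batch label keeps unrelated batches (like the two separate email dumps above) from colliding, while identical rows within one batch dedupe themselves for free.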

So: is there a sane setup for Migrate where batches of content can be ingested without the migration tracking highwater marks or source IDs forever? Essentially, I mean massive one-off imports.

I haven't done much with dynamic migrations before -- is this basically a case for a dynamic migration? But again, I don't want Migrate to track the history of these imports (drush ms should not list the last 20-odd import attempts I've made).

Is this outside the scope of the Migrate module? I'd otherwise opt to use Feeds, but it seems really, really slow, and I'm very comfortable with Migrate and its community tooling.

Comments

tenken created an issue.