Simple DCAT export

Purpose of this module

DCAT is becoming the de facto metadata standard for describing "datasets" (e.g. easily processable CSV, XML, or XLS files) on open data portals.
However, copying the (meta)data from "normal" websites to open data portals is often a manual process, which is both time-consuming and error-prone.

The purpose of this module is to extract metadata (title, description, links) from existing nodes and generate a DCAT file that can be processed by open data portals and crawlers, without the need for specific content types or "heavy" modules.
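As an illustration of the target output format, a minimal DCAT description of a single dataset might look like the following Turtle fragment. The URIs and values here are placeholders for illustration, not actual output of this module:

```turtle
@prefix dcat: <http://www.w3.org/ns/dcat#> .
@prefix dct:  <http://purl.org/dc/terms/> .

# One dataset entry, built from a node's title, description, and file link.
<http://example.org/dataset/1> a dcat:Dataset ;
    dct:title "Example dataset" ;
    dct:description "Metadata extracted from an existing node." ;
    dcat:distribution [
        a dcat:Distribution ;
        dcat:downloadURL <http://example.org/files/data.csv> ;
        dct:format "text/csv"
    ] .
```

A harvester on an open data portal can then crawl this file and register the dataset without any manual copying.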


An importing/migration module that separates remote data retrieval from Drupal entity creation/updating via a queue.

Following the approach of Feeds, with some concepts from the Migrate module, this module could grow to complement both by becoming a source of data for either. The point of this module is to split the migration of Drupal entities into two stages: first retrieving source data and metadata and storing it in queue items, which can then be used for CRUD operations on Drupal entities.
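The two-stage separation described above can be sketched as follows. This is a minimal illustration of the pattern, not the module's real API; all names (`SourceItem`, `fetch_source`, `process_queue`) are hypothetical, and the HTTP fetch is stubbed out:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class SourceItem:
    """A queued item holding raw source data plus metadata."""
    source_id: str
    data: dict

def fetch_source(urls):
    """Stage 1: retrieve remote data and enqueue it unprocessed."""
    queue = deque()
    for url in urls:
        # A real module would fetch over HTTP here; stubbed for illustration.
        queue.append(SourceItem(source_id=url, data={"url": url}))
    return queue

def process_queue(queue, store):
    """Stage 2: drain the queue and run CRUD operations on entities."""
    while queue:
        item = queue.popleft()
        store[item.source_id] = item.data  # create or update the "entity"

store = {}
process_queue(fetch_source(["a", "b"]), store)
```

Keeping the fetched data in queue items means retrieval failures and entity-creation failures can be retried independently, which is the main benefit of the separation.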

Feeds JSON Parser

Provides a Feeds JSON parser.

Tumblr Migrate

This is a Migrate module for importing a Tumblr blog into Drupal.

I experimented with Feeds first, but even with Feeds Tamper, special field handling (e.g. URLs embedded inside HTML) required extra tweaking and code. Since the Migrate module accesses data more directly to begin with (and automatically produces a reusable migration module), I switched to Migrate.

Tumblr XML Sources

I used two feeds to pull data, since Tumblr’s API made some fields more accessible than others (Audio) or provided them inconsistently (Title).

Migrate Example MSSQL

Example module for MSSQL content migration.

Migrate Alias

Migrates URL aliases for nodes, users, and taxonomy terms.
