This project is not covered by Drupal’s security advisory policy.

Feeds Crawler is a Feeds fetcher plugin for paging through a site or a feed. It is useful for building powerful web scrapers using Feeds.

Features

  • Can paginate URL patterns using a start index and a maximum number of results
  • Can automatically find the next link for RSS, Atom, and some HTML sites
  • Can fall back to a user-provided XPath query for the next link

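The next-link discovery described above can be sketched as follows. This is an illustrative Python sketch of the general technique, not the module's actual PHP implementation; the function name `find_next_link` and the `fallback_xpath` parameter are hypothetical.

```python
import xml.etree.ElementTree as ET


def find_next_link(root, fallback_xpath=None):
    """Return the URL of the next page in a feed or document, or None.

    `root` is a parsed ElementTree element; `fallback_xpath` is a
    user-provided query tried only when no rel="next" link is found.
    """
    # Atom pagination (RFC 5005): <link rel="next" href="..."/>
    for link in root.iter("{http://www.w3.org/2005/Atom}link"):
        if link.get("rel") == "next":
            return link.get("href")
    # HTML-style pagination: <link rel="next"> or <a rel="next">
    for tag in ("link", "a"):
        for el in root.iter(tag):
            if el.get("rel") == "next":
                return el.get("href")
    # Fall back to a user-provided XPath (ElementTree's limited subset)
    if fallback_xpath:
        el = root.find(fallback_xpath)
        if el is not None:
            return el.get("href")
    return None


feed = ET.fromstring(
    '<feed xmlns="http://www.w3.org/2005/Atom">'
    '<link rel="self" href="http://example.com/feed"/>'
    '<link rel="next" href="http://example.com/feed?page=2"/>'
    '</feed>'
)
print(find_next_link(feed))  # http://example.com/feed?page=2
```

The XPath fallback covers sites whose pagination links carry no `rel="next"` attribute, e.g. `find_next_link(doc, ".//a[@class='next-link']")`.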

This module depends on the Feeds module.

Related Modules

Feeds Spider
