CommonFloor.com is one of India’s leading real estate portals, offering home seekers, sellers and real estate professionals an extensive online real estate ecosystem. Apart from a robust search mechanism, the portal maintains a vast repository of well-researched content on property-related matters, ranging from legal issues to maintenance, facility management and home insurance. One of the main problems the team faced was how to repurpose this content alongside property data – their core product – and make it available across different platforms, e.g. iOS and Android, where the user experience differs significantly from desktop devices.
Initially, the client team was reluctant to use Drupal to serve the content. The repository at the time was powered by WordPress, which gave them little flexibility. The organisation's content strategy was evolving in ways the framework could not support, which compelled them to adopt a service-oriented architecture (SOA) for more control over how content is delivered.
They were initially inclined towards WordPress, for which they had an internal team available. But once they saw the proof of concept using Drupal as the backend and Angular for consuming the content, they were impressed by how easily and quickly the team could create new services as well as customize existing ones.
The bout between the two contenders was won by Drupal, mainly because its modular architecture delivered the flexibility they required. Drupal's scalable, extensible environment and the out-of-the-box features of the Services module - which became the main pillar of the project - convinced them to use Drupal as the platform of choice for their content repository application.
CommonFloor.com decided to keep content related to real estate research separate from their core website. This research content could be articles, images, videos, audio or structured reports. The goal of the CMS is to provide an end-to-end workflow for the real estate research team to create different types of content, review it for accuracy and completeness, tag it (for SEO and context) and publish it on the CommonFloor.com website. Providing a separate CMS - that is, decoupling it from the core CommonFloor.com website - eases maintenance, provides additional security and keeps the system flexible enough to be extended to suit any need of the research team. The CMS provides APIs through which content can be fetched for display on multiple devices (desktop and mobile). The APIs were optimized per device, so that a request can return the entire article with images, only a snapshot of an article, or articles filtered by tag; based on the device, different APIs are called.
The CMS had to be decoupled from the core CommonFloor.com by hosting it on a separate server (or domain, cms.commonfloor.com). CommonFloor accesses the CMS via the provided APIs for content that has been published and approved by the CommonFloor editorial team.
The CMS needs a complete end-to-end workflow that enables content writers to write articles and structured reports and upload images/videos. Once created, the writers can submit these articles so that reviewers can check the content for accuracy and completeness. If the content is found to be incorrect, the reviewers can reject it so that the writers can correct and resubmit it. The reviewers can then pass the content to SEO specialists, who review it, provide SEO tags and publish it directly to the CommonFloor website.
Per User Dashboard
Based on the user's persona and permissions, the dashboard serves only content they have access to. Users can view and manage content from the dashboard. The CMS also notifies all personas in the system when content is submitted, rejected or published. The dashboard allows them to view content, edit it, submit/publish it and view notifications whenever content changes status.
The existing portal ran on WordPress, so a data migration was required to transfer all the content from WordPress to Drupal, along with the respective SEO tags and categories. Apart from content, migrating users was also a priority.
Custom REST APIs to serve data
To meet the complexity of the data requests, custom REST APIs were created using the Services module. The Services module provides a hook (hook_services_resources) for defining custom RESTful resources, which was used extensively in place of the Services Views module. This gave better control over building the requested APIs and provided specific endpoints for the requested data.
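As a rough illustration, a custom resource defined via this hook looks like the following sketch for Drupal 7's Services module. The module name (cf_api), resource name and callback are hypothetical, not the project's actual code:

```php
<?php
/**
 * Implements hook_services_resources().
 *
 * Defines a hypothetical "article" resource with a retrieve operation,
 * exposed by the Services module as a REST endpoint.
 */
function cf_api_services_resources() {
  return array(
    'article' => array(
      'retrieve' => array(
        'help' => 'Returns a published article by node ID.',
        'callback' => 'cf_api_article_retrieve',
        'access callback' => 'user_access',
        'access arguments' => array('access content'),
        'args' => array(
          array(
            'name' => 'nid',
            'type' => 'int',
            'description' => 'The node ID of the article.',
            'source' => array('path' => 0),
          ),
        ),
      ),
    ),
  );
}

/**
 * Retrieve callback: load the node and return only the fields
 * the client needs, rather than the full node object.
 */
function cf_api_article_retrieve($nid) {
  $node = node_load($nid);
  if (!$node || $node->status != NODE_PUBLISHED) {
    return services_error(t('Article not found.'), 404);
  }
  return array(
    'title' => $node->title,
    'body'  => field_get_items('node', $node, 'body'),
  );
}
```

Defining the resource directly in code like this, rather than through Services Views, is what gives the fine-grained control over the response shape described above.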
For authentication, the Services API Key Authentication module was used. With it, each JSON request carries an access key that is matched against the client's secret key, providing a simple token-based authentication scheme for the API consumers.
One of the challenges faced was the response time of the requested JSON data. With the Drupal Cache API we were able to cache the data and cut the response time drastically. The approach was simple: after building the output array, we cached the whole dataset with cache_set for a period of time chosen according to how frequently the requested data changes.
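The caching pattern described above can be sketched with Drupal 7's cache_get()/cache_set() pair. The function names, cache ID and the five-minute lifetime below are illustrative assumptions:

```php
<?php
/**
 * Returns the article list for a device, serving a cached copy
 * when one is available (a sketch; names are hypothetical).
 */
function cf_api_article_list($device) {
  $cid = 'cf_api:article_list:' . $device;

  // Serve from cache when a fresh copy exists.
  if ($cached = cache_get($cid, 'cache')) {
    return $cached->data;
  }

  // Otherwise build the expensive output array...
  $output = cf_api_build_article_list($device);

  // ...and cache it. The expiry here is five minutes; in practice
  // the lifetime would vary with how often the data changes.
  cache_set($cid, $output, 'cache', REQUEST_TIME + 300);

  return $output;
}
```

Keying the cache ID on the device lets each device-specific endpoint keep its own optimized payload cached independently.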
Wordpress to Drupal Data Migration
The existing WordPress portal had a huge volume of data that needed to be migrated to the new Drupal 7 site. Since Drupal 7 already has a module for WordPress-to-Drupal migration, content migration was completed with ease, including tags, categories, comments and users. However, the module does not support migrating data from WordPress custom fields, which were being used to hold the content's metadata. So, to migrate the custom fields, a custom script was written to parse the XML generated for migration, extract the metadata and attach it to the appropriate content.
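A minimal sketch of such a script, assuming the standard WordPress WXR export format, where custom-field values live in wp:postmeta elements (the filename and the attachment step are hypothetical):

```php
<?php
/**
 * Parse a WordPress WXR export with SimpleXML and collect the
 * custom-field metadata (wp:postmeta) for each post.
 */
$xml = simplexml_load_file('wordpress-export.xml');
$namespaces = $xml->getNamespaces(TRUE);

foreach ($xml->channel->item as $item) {
  // Elements in the "wp" namespace hold post ID and custom fields.
  $wp = $item->children($namespaces['wp']);
  $post_id = (string) $wp->post_id;

  $metadata = array();
  foreach ($wp->postmeta as $meta) {
    $metadata[(string) $meta->meta_key] = (string) $meta->meta_value;
  }

  // Attach $metadata to the migrated node, e.g. by matching the
  // original WordPress post ID recorded during migration.
}
```

The key point is that the WXR file already carries the custom-field data the migration module skips, so a post-processing pass over the same XML is enough to recover it.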
An Easy-to-Use Working Environment
Since the portal was going to be used heavily by content writers and editors, the theme was kept simple. As there was no anonymous-user-facing front end, a single theme was used across the site as both the admin theme and the default theme to maintain design consistency. Panels was used heavily to create customized layouts for content viewing and node forms.
Customized User Dashboard per Role
A per-role user dashboard was one of the key features used by the site's editors. Workbench Moderation was used for content moderation, and the user dashboards were built on top of its moderation dashboard. Additional moderation states were added, and corresponding views were built to serve as the moderation tabs in the dashboard. These tabs were then tied to the specific user roles responsible for content moderation at each stage.
The content auto-tagging module was an innovative feature we built that automatically fills in the tags and category fields when a node is submitted. The experience was kept simple for authors, but the backend logic was not trivial: the module reads the title and body of the node and matches them against the term names in the category/tags vocabularies; whenever a term name matches, the corresponding taxonomy field is populated.
The same functionality was also needed for bulk auto-tagging of the migrated content, so we batch-processed the content and applied the same matching logic.
Auto-tagging suggestions - the module can also fill autocomplete term fields from other text fields, such as the body or title. All available terms found in the selected source fields are filled into the chosen destination term field.
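The matching logic described above can be sketched as a Drupal 7 hook_node_presave() implementation. The module name (cf_autotag), content type, vocabulary machine name ('tags') and field name (field_tags) are assumptions for illustration:

```php
<?php
/**
 * Implements hook_node_presave().
 *
 * Scans the node title and body for existing term names and
 * attaches every matching term to the node's tags field.
 */
function cf_autotag_node_presave($node) {
  if ($node->type != 'article') {
    return;
  }

  // Build a lowercase haystack from the title and body text.
  $body = field_get_items('node', $node, 'body');
  $haystack = drupal_strtolower(
    $node->title . ' ' . (isset($body[0]['value']) ? $body[0]['value'] : '')
  );

  // Load every term in the tags vocabulary and look for matches.
  $vocabulary = taxonomy_vocabulary_machine_name_load('tags');
  foreach (taxonomy_get_tree($vocabulary->vid) as $term) {
    if (strpos($haystack, drupal_strtolower($term->name)) !== FALSE) {
      $node->field_tags[LANGUAGE_NONE][] = array('tid' => $term->tid);
    }
  }
}
```

For the bulk case mentioned above, the same matching function can be run from a Batch API operation over the migrated node IDs, saving each node after its terms are attached.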