Hi there,

I was wondering if anyone was hosting multiple real estate websites but only using 1 feed/file repository. I am working on my second real estate site and can't help but think it's silly to download the relevant listings twice, once for each site. Is there any easier way to do this, or a way to download the files to a common folder and then have everyone pull from that collection of data, using filters to show only the relevant agent or office listings?

Just curious if this has been addressed by anyone and if they have any suggestions.

thanks in advance,

Comments

duntuk’s picture

If these are separate owner sites, then your best bet is to set up feeds. Create a feeds export from one site, and then have all the other sites import the exported feed (via Feeds import).

This should work fine for updates as well, since Feeds keeps track of imported content via a unique key (which you would set to the IDX key or MLS #); only entities that were actually updated get processed, which keeps the overhead to a minimum.
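To illustrate the unique-key idea outside of Feeds (the file name, columns, and MLS numbers below are made up for the example), here is a small shell sketch that keeps only the last row per MLS # in an exported CSV; conceptually this is what matching on a unique key buys you, since later rows (updates) win over earlier ones:

```shell
# Hypothetical export: mls_number,price,status
cat > listings.csv <<'EOF'
mls_number,price,status
201500123,250000,active
201500456,310000,active
201500123,245000,reduced
EOF

# Keep the header, then keep only the LAST occurrence of each MLS#:
# reverse the data rows, keep the first sighting of each key, reverse back.
{ head -n 1 listings.csv
  tail -n +2 listings.csv | tac | awk -F, '!seen[$1]++' | tac
} > deduped.csv

cat deduped.csv
```

Feeds does this bookkeeping for you across import runs, so you never end up with duplicate listings when the same MLS # comes through again.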

I have only tried this with nodes, but I'm guessing it should work with entities as well.

If this is a shared account (same owner), then your best bet is to use the Domain module, and share drealty content across all your sites without having to import it separately for each site.

droddis’s picture

Thanks very much for this, great things to investigate. I'll have a closer look and see what I can figure out.

So far we are only bringing in a small number of records, the records for each user as opposed to the entire Nova Scotia MLS feed. Should things change, I think I would have to switch to this approach just to make sure I don't overwhelm the server.

will_kelly’s picture

The one challenge I think you would have is the feed not being secure: any site could pull that feed, and most MLSs would not allow that under their user agreements.

I think you could accomplish the same thing by pulling all the listings into a single database with drealty and then using a custom module to connect to the main database.

duntuk’s picture

You can always put the feeds in a private location, and have feeds pull it locally.

e.g.


mkdir -p /home/shared/drealty
# create the feeds export, e.g. call the file 'drealty-res-properties.csv', and tell it to use the above dir
# make the file readable by the web server user (644 = owner read/write, everyone else on the server read-only)
chmod 644 /home/shared/drealty/drealty-res-properties.csv
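If 644 (readable by every local user) is too loose for your MLS agreement, a common alternative is to restrict the export to a shared group that the web server user belongs to. The group name varies by setup (often www-data or apache); the sketch below uses the current user's group so it runs anywhere, and a temporary path instead of /home/shared:

```shell
SHARED_DIR=/tmp/shared/drealty     # use your real shared dir in practice
WEB_GROUP=$(id -gn)                # e.g. www-data; current group here so the sketch runs as-is

mkdir -p "$SHARED_DIR"
touch "$SHARED_DIR/drealty-res-properties.csv"

# owner read/write, group read-only, no access for anyone else
chgrp "$WEB_GROUP" "$SHARED_DIR/drealty-res-properties.csv"
chmod 640 "$SHARED_DIR/drealty-res-properties.csv"

stat -c '%a' "$SHARED_DIR/drealty-res-properties.csv"   # → 640
```

That way the feed file is never world-readable, but each site's web process can still pull it locally.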

And, yeah, I was thinking the same thing about pulling it from a shared database. But as you said, that would require a custom module (development of this module is not very active, so adding that feature may be a long way down the road).

In addition, by sharing a database you would have to block any writes/modifications to that database, which in turn means the user loses the ability to customize a listing. E.g. say the user wanted to mark a certain property as "featured" and upload a high-res photo (RETS photos max out at what, only 640px?), or wanted to change the description a bit, etc.

However, on this last point about customization, I'm not sure that is the best-practice approach either, as drealty may overwrite customizations on the next RETS import (I haven't tried this yet, but am planning to). I believe I saw a module or a method where entities are imported as nodes; again, I will look into this later.

What I think this project is missing is proper documentation or how-to videos (not that I'm volunteering). The official video was a good start, but for whatever reason it was never followed up by the author. Personally, I have this module figured out for the most part, but there are still certain "best practice" hurdles I need to work through, e.g. Google geocoding's 2,500-requests-per-day limit; this module has no native way of respecting that.
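For the geocoding cap, one low-tech workaround (this is just a sketch, not a drealty feature; the counter path and where you hook it in are up to you) is to keep a per-day request counter and stop geocoding once it hits the limit:

```shell
LIMIT=2500
COUNTER=/tmp/geocode-count-$(date +%F)   # one counter file per day; stale ones can be cron-cleaned

# Returns success (and bumps the counter) if we are still under today's limit.
geocode_allowed() {
  local count
  count=$(cat "$COUNTER" 2>/dev/null || echo 0)
  [ "$count" -lt "$LIMIT" ] || return 1   # over the daily limit: refuse
  echo $((count + 1)) > "$COUNTER"        # record this request
}

# wrap each geocode call:
if geocode_allowed; then
  echo "would geocode this listing"
else
  echo "daily limit reached; deferring until tomorrow"
fi
```

Anything refused today can simply be retried on the next cron run, once the date (and therefore the counter file) rolls over.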

shauntyndall’s picture

Status: Active » Fixed

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.