Drupal is up and running but how do I ...?

Unified authentication across different sites?

Hey all,
So, say I have four sites: site1, site2, etc. They are all subdomains of the same domain (so site1.domain.com, site2.domain.com, etc.). They all have different database prefixes but a shared user database (so registering at site1 also registers you at site2, etc.).
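For anyone wondering how that kind of setup is wired up: Drupal's settings.php lets $db_prefix be an array, so most tables get a per-site prefix while the user-related tables point at a shared prefix. A minimal sketch (the prefix names and credentials below are made-up examples, not from the original post):

```php
<?php
// sites/site1.domain.com/settings.php (sketch, not a drop-in file)
$db_url = 'mysql://user:password@localhost/drupal';
$db_prefix = array(
  'default'  => 'site1_',   // this site's own tables
  'users'    => 'shared_',  // user accounts shared across all subdomains
  'sessions' => 'shared_',
  'role'     => 'shared_',
  'authmap'  => 'shared_',
);
```

Each subdomain's settings.php would repeat this with its own 'default' prefix but the same 'shared_' entries, so one registration shows up everywhere.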

How to: "Allow individual users to customize the visibility of this block in their account settings."?

I read everywhere that individual users can customize the visibility of individual blocks in their account settings, but I cannot find this option in my Drupal site's user account settings page.
What am I missing? Do I have to enable something?

Thanks

Lukas

Managing comments

Does anyone know of an easy way to convert a comment and any dependent comments to a new thread? Discussions sometimes get unwieldy, people set off in new directions, and it would be great to be able to prune branches and plant them elsewhere. I've done this directly by editing the database but it's a pain in the neck.
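In case it helps to see what the manual database edit amounts to: a rough sketch of a branch move, assuming Drupal's {comments} schema (cid, pid, nid). This is hypothetical helper code, not a real module; it also glosses over re-computing the thread column and the node_comment_statistics counts, which is exactly the painful part the post mentions.

```php
<?php
// Sketch only: repoint one comment (and all its replies) to a new node.
function move_comment_branch($cid, $new_nid, $detach = TRUE) {
  if ($detach) {
    // The branch root becomes a top-level comment on the new node.
    db_query("UPDATE {comments} SET nid = %d, pid = 0 WHERE cid = %d", $new_nid, $cid);
  }
  else {
    // Replies keep their parent comment; only the node changes.
    db_query("UPDATE {comments} SET nid = %d WHERE cid = %d", $new_nid, $cid);
  }
  // Recurse over direct replies.
  $result = db_query("SELECT cid FROM {comments} WHERE pid = %d", $cid);
  while ($child = db_fetch_object($result)) {
    move_comment_branch($child->cid, $new_nid, FALSE);
  }
}
```

Even with that, the thread field (which controls display order) would still need rebuilding, which is probably why a ready-made module for this would be so welcome.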

How to make a Site access only from Primary Links

Hi - how do I make a page accessible only from the Primary Links menu? It should not appear in the Navigation menu, and not on the start page among the posted content (articles, stories, etc.).

soundbites

path module, duplicate URIs in search indexes and robots.txt

I use the path and pathauto modules on my blog so that the URIs are human-friendly. When the path module is used, each post (node) has two URIs: one of the form http://example.com/node/<node-id> (I will call this the normal URI) and another created by pathauto (let's call this the human-friendly URI).

Usually this is not a problem, since only the human-friendly URIs are exposed (on the site pages or through the sitemap) and search engines crawl only those. However, on my site the search engines somehow stumbled upon the normal URIs too, with the result that many nodes (not all) are listed twice in the index. This is not a good thing, since search engines usually penalize duplicate content appearing on a site; it might even be interpreted as blackhat SEO.

I am thinking of putting a line in my robots.txt file that disallows crawlers from crawling URIs of the form http://example.com/node/*. Over time, the normal URIs should be eliminated from the indexes.
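For reference, a minimal robots.txt sketch of that idea. Note that standard robots.txt has no * wildcard in paths (some crawlers support it as an extension); a plain path prefix is enough here, since Disallow values are matched as prefixes:

```
User-agent: *
Disallow: /node/
```

The trailing slash keeps the rule from also matching unrelated paths that merely begin with "node".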

Is this a working solution? Does it have any side effects I am not aware of? I would like some advice from more experienced people out there before I follow through with this.

I was also wondering how this happened. Any ideas? My hunch is that not all modules use human-friendly URIs, and these may have exposed the normal URIs. For example, the print module generates URIs derived from the normal ones.

Thanks. Have a nice day.

Acidfree album has large gray boxes around images

I have installed the Acidfree module and managed to get it all running. What I don't understand is why, when looking at an album, there are large gray boxes around each of the images - seems like a lot of wasted space. Is there a way to turn this off?

You can see an example of the behavior here: http://www.look-up.com/avimages

Thanks!
df
