
Sitemaps in Ruby on Rails

Helping search engines by adding a sitemap to my website.

Sitemaps are a straightforward way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a site along with added metadata about each URL (when it was last updated, how often it usually changes, and how important it is compared to other URLs on the site) so that search engines can crawl the site more intelligently.
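
For reference, a minimal sitemap for a single URL looks like this (the values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>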

Sitemaps are widely supported, including by Google, Yahoo! and Microsoft, so I thought it would be a good idea to integrate one into my website. A few posts later I was tired of updating the XML file manually every time I changed something in the URL scheme, so why not pass the task to Ruby on Rails and build the file automatically?

The controller

First, you’ll need a method to collect the data. I chose the application controller, as the sitemap doesn’t really belong anywhere else. The pages controller would be a better choice if you have one that manages all of your site’s URLs, but that’s entirely up to you.

class ApplicationController < ActionController::Base
  # Collect all pages and render the sitemap template without the site layout
  def sitemap
    @pages = Page.find(:all)
    render_without_layout :template => "layouts/sitemap"
  end
end

The “render_without_layout” part calls the view. My view lives in the views/layouts folder but again, this can be anywhere you want.
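
Note that render_without_layout was deprecated in later Rails versions. If you are on a newer release (an assumption about your setup), the equivalent is to render the template with the layout switched off:

def sitemap
  @pages = Page.find(:all)
  render :template => "layouts/sitemap", :layout => false
end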

The view

As specified in the sitemap method above, we need a view that renders the data as XML. Create the sitemap template in the folder you chose above (views/layouts in my case) and call it “sitemap.rxml”. Now build the structure of the sitemap:

xml.instruct!
xml.urlset('xmlns'=>'http://www.sitemaps.org/schemas/sitemap/0.9',
'xmlns:xsi'=>'http://www.w3.org/2001/XMLSchema-instance',
'xsi:schemaLocation'=>'http://www.sitemaps.org/schemas/sitemap/0.9
http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd') {
  for page in @pages
    xml.url {
      xml.loc("http://" + request.env["HTTP_HOST"] + "/" + page.permalink + "/")
      xml.lastmod(page.updated_at.strftime('%Y-%m-%d'))
      xml.changefreq("weekly")
      xml.priority("0.7")
    }
  end
}

This snippet assumes your page object has a permalink and an updated_at attribute; change these if your model looks different.
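
For completeness, here is a hypothetical migration for such a Page model; the columns are purely illustrative and your existing model will probably look different (updated_at is maintained automatically by ActiveRecord):

class CreatePages < ActiveRecord::Migration
  def self.up
    create_table :pages do |t|
      t.column :title,      :string
      t.column :permalink,  :string   # used to build the URL in the sitemap
      t.column :body,       :text
      t.column :updated_at, :datetime # set by ActiveRecord on every save
    end
  end

  def self.down
    drop_table :pages
  end
end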

There are a few things you need to know about the sitemap elements: ‘loc’ is the only required element, so you can drop ‘lastmod’, ‘changefreq’ and ‘priority’ if you don't have any useful data for them. See the official sitemap protocol definition for a full description of each element.

The route

You now have an automatically generated sitemap but no way to reach it. Tell Rails to call your sitemap in the routes.rb file by adding the following mapping (change the controller if you chose a different one above):

map.connect 'sitemap.xml', :controller => 'application', :action => 'sitemap'

Request your new sitemap with http://www.example.com/sitemap.xml. You may need to restart your Rails server to enable the new route.
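
If you are on a newer Rails version with the block-style router (again an assumption about your setup), the equivalent mapping inside the routes.draw block would be:

get 'sitemap.xml', :to => 'application#sitemap'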

The robots.txt

Almost done. The sitemap should be working by now, but how does a crawler (like the Googlebot) know where to look for it? That's what the robots.txt file is for. Every crawler should request the robots.txt file first to see what it may or may not index, so this is the ideal place to advertise our sitemap. Add the following line to the robots.txt file in your public directory and make sure to use the full URL to your sitemap (including the domain):

Sitemap: http://www.example.com/sitemap.xml
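
A minimal robots.txt that allows everything and advertises the sitemap would look like this:

User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml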

The next time a crawler visits your site it will pick up the sitemap and use it to index your pages.
