Google’s guidelines to SEO – Concise version
To help webmasters, Google has published a 32-page SEO guide (you can download it here). A guide from the search giant itself is a must-read for every web developer, but 32 pages is a lot, right? So, to save you time, here is a compilation of what the guide says.
Unique page title for every page
Page titles help users and search engines understand what is inside a page. The page title is also what the search engine displays in its search results. Creating a unique page title for every page is therefore helpful.
Make the title tag descriptive yet brief.
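As a sketch, a unique, descriptive and brief title for a hypothetical shop page might look like:

```html
<!-- Hypothetical page: the title names the page's content and the site -->
<html>
<head>
  <title>Baseball Cards - Buy, Sell and Trade | Brandon's Cards</title>
</head>
<body>
  ...
</body>
</html>
```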
“Description” meta tag
Apart from the page title, Google also displays some text below it in the search result. This text is either the description of your site from the Open Directory Project (dmoz.org), a snippet of your page content relevant to the query, or the description you provide in the meta tag.
It is good practice to have a description of your page that summarizes the page content. This helps users and search engines understand what your page is about. Again, unique, descriptive and brief meta descriptions are a good choice!
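A minimal sketch of such a description meta tag (the wording and page are hypothetical):

```html
<head>
  <title>Baseball Cards - Buy, Sell and Trade | Brandon's Cards</title>
  <!-- One or two sentences summarizing the page's content -->
  <meta name="description"
        content="Buy, sell and trade baseball cards. Price guides, news and a community for collectors.">
</head>
```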
Improve the structure of URLs
Simple and descriptive URLs not only help you organize your file system but also give more information to search engines and users, and URLs with descriptive words help search engines crawl your pages more easily. Beyond SEO, they also make it easier for other users to link to your page. For example, Wikipedia uses URLs of the format http://en.wikipedia.org/wiki/Article_name. If it instead used something like http://www.en.wikipedia.org/wiki?articleid=12345, would that be as easy for you to read, remember and share?
Refer to SearchEnabler's on-page analysis for optimizing all tags and comparing against top-ranking URLs.
Site maps for users and search engines
A site map is helpful when users have trouble navigating through a site. The guide suggests creating two site maps: one for users and one for search engines. Search engines use it to get good crawl coverage of your site. Google has helped build an open source tool to create a site map for your site.
The link is here: http://code.google.com/p/googlesitemapgenerator/
The guide suggests an HTML site map for users and an XML file for search engines. Submitting a sitemap to search engines makes their work easier.
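A minimal XML sitemap for a hypothetical site might look like this (one `<url>` entry per page; the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
  </url>
</urlset>
```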
Have a useful 404 page
A user reaches a 404 page when they try to visit a page that does not exist on your server. It is always helpful to have a custom 404 page with, at the very least, a link to the site's root (home) page. You should also avoid letting search engines index your 404 pages.
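A sketch of such a custom 404 page, using a robots meta tag as one way to keep it out of the index:

```html
<!-- Minimal custom 404 page: explains what happened, links back home,
     and asks search engines not to index it -->
<html>
<head>
  <title>Page not found</title>
  <meta name="robots" content="noindex">
</head>
<body>
  <h1>Sorry, we can't find that page.</h1>
  <p>Try starting again from our <a href="/">home page</a>.</p>
</body>
</html>
```

Also make sure the server actually returns a 404 HTTP status code for missing pages rather than 200, so search engines know the page truly does not exist.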
Offer quality content and services
“Interesting sites will increase their recognition on their own,” the guide very rightly says. None of the points discussed here will be as effective as good quality content. There is a reason Wikipedia is everybody’s choice: sites that people love find it easy to rise in the search results. Search engines do not favor sites with copied content.
Anticipating differences in search behavior and accounting for them when writing your content (using a good mix of keyword phrases) can produce positive results. Avoid putting your text inside images, as search engines cannot read text within an image.
Optimized use of images
As said in the above point, do not present text in the form of images. But there will hardly be a web page without any images. If an image is a prime part of your content, it is useful to use the image’s ‘alt’ attribute to give a proper description of the image. The guide also suggests providing an image sitemap to the Googlebot.
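For example, descriptive alt text on a hypothetical image might look like:

```html
<!-- The alt text describes the image's content; it is read by search
     engines and screen readers, and shown if the image fails to load -->
<img src="/images/1952-topps-card.jpg"
     alt="1952 Topps baseball card in a protective sleeve">
```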
Make effective use of robots.txt
Robots.txt is a file that tells search engines which parts of your site they may and may not access. It has to be named robots.txt and it must be placed in the root directory of your site. A sample is shown over here:
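A minimal sketch of such a file, with hypothetical directory names:

```text
# Applies to all crawlers: keep them out of /images/ and /search,
# everything else remains crawlable
User-agent: *
Disallow: /images/
Disallow: /search
```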
Keep in mind that each subdomain of your site needs its own separate robots.txt. One more thing to remember: disallowing search engines from crawling sensitive content is not enough. Rogue search engines (yes, they exist!) may simply ignore the access rules, some search engines may not crawl the pages but still show the links without any title or content, and some users may open your robots.txt themselves and try to access the very files you listed.
For truly sensitive content, a safer option than robots.txt is password-protecting directories, for example using .htaccess.
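A sketch of such protection on an Apache server (the paths are hypothetical; the .htpasswd file must be created separately, e.g. with the htpasswd tool):

```apacheconf
# Hypothetical .htaccess placed inside the directory to protect:
# require a valid username/password before serving anything from it
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```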
“Nofollow” to spammers
Adding rel=”nofollow” to a link tells search engines not to follow that link or pass your site’s reputation to it. When a search engine follows a link from your site, the reputation of the linked page can affect the ranking of your site as well, so it is wise not to let search engines crawl from your pages to spam sites. You can automatically add rel=”nofollow” to all anchor tags in user comments and other user-generated content, as several sites do; otherwise you will have to add it manually.
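For example, a link inside a user-submitted comment (hypothetical URL) would be marked like this:

```html
<!-- nofollow tells search engines not to pass reputation to the target -->
<p>Check out <a href="http://www.example.com/some-page" rel="nofollow">my site</a>!</p>
```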
Optimize for mobile
Not all sites have mobile versions. Generally the Googlebot automatically figures out whether or not a site is suitable for mobile viewing and indexes it accordingly. You can explicitly declare that your page is meant for mobile viewing by adding the appropriate declaration at the top of your page.
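At the time the guide was written, mobile pages were commonly declared with the XHTML Mobile Profile doctype; a minimal sketch of such a page:

```html
<!-- Hypothetical mobile page: the Mobile XHTML doctype is one way
     Googlebot-Mobile can recognize a page as mobile-oriented -->
<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
    "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Mobile home</title></head>
  <body><p>Welcome to the mobile version of our site.</p></body>
</html>
```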
Make sure you guide users to the correct version of your site. If a visitor comes through a desktop browser, or is the Googlebot, redirect them to the desktop version; a mobile user, or Googlebot-Mobile, should be redirected to the mobile version. Also make sure the redirect points to the corresponding page: the desktop version of a product page should lead to the mobile version of that same product page, not to the mobile home page.
- Use heading tags properly. Use them to reflect the structure of your content; do not scatter them anywhere and everywhere, and avoid using heading tags where tags like <em> and <strong> would do.
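A sketch of headings mirroring a hypothetical article's structure, with <strong> used for mere emphasis inside the text:

```html
<h1>Caring for Vintage Baseball Cards</h1>
  <h2>Storage</h2>
    <h3>Sleeves and top loaders</h3>
  <h2>Handling</h2>
<!-- emphasis, not structure, so no heading tag here -->
<p>Always hold a card by its <strong>edges</strong>.</p>
```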
- Promote your website on social media in the right ways. We have already posted blogs about the best promotional techniques for social media; they will be very helpful here.