SiteRank is a concept every SEO should be aware of, as it is likely a factor in modern search engine rankings. The basic idea is that a high-PageRank page from a site full of well-respected pages will do better than a high-PageRank page from a site with few well-respected pages. Respect, in this case, means lots of links from lots of different sites, with rich, varied anchor text.
The lesson is that you will do better creating a quality, popular site full of content than trying to game the search engines by SEO'ing your homepage to the max. You can SEO all you want, but when the engines sense a strong case of genuine "link love" across a site, that site's pages will wind up in more SERPs.
What is SiteRank?
If PageRank measures the importance of an individual page, SiteRank measures the quality of a whole site. Major factors that could measure the quality of a site may include (one way they might combine is sketched after this list):
- strength of content (the amount of related content across the site's pages),
- quality of links (links from diverse sites, with varied anchor text, pointing to many different pages),
- freshness of content (how regularly content is updated),
- uniqueness of content (a lower percentage of duplicate content),
- age of the site,
- outgoing links (fewer dead links and more relevant links), and
- PageRank.
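Here is a minimal sketch of how these factors might blend into a single site-level score. Every weight, field name, and the five-year age cap is a made-up assumption for illustration, not a known Google formula:

```python
from dataclasses import dataclass

@dataclass
class SiteSignals:
    content_strength: float  # 0..1, amount of related content on the site
    link_quality: float      # 0..1, diversity of linking sites and anchors
    freshness: float         # 0..1, how regularly content is updated
    uniqueness: float        # 0..1, share of non-duplicate content
    age_years: float         # site age in years
    outlink_quality: float   # 0..1, live and relevant outgoing links
    avg_pagerank: float      # 0..1, normalized average PageRank of pages

def site_rank(s: SiteSignals) -> float:
    """Weighted blend of site-level signals (illustrative weights only)."""
    age_factor = min(s.age_years / 5.0, 1.0)  # assume age saturates ~5 years
    return (0.20 * s.content_strength +
            0.25 * s.link_quality +
            0.10 * s.freshness +
            0.15 * s.uniqueness +
            0.10 * age_factor +
            0.10 * s.outlink_quality +
            0.10 * s.avg_pagerank)
```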
Why SiteRank, and How Does It Work?
When Sergey Brin and Larry Page (Google's founders) weren't happy with the search results from the early search engines (Lycos, Excite, etc.), they tried Latent Semantic Indexing (LSI) to improve the quality of the results. LSI didn't work very well. One of the things they noticed was that one-sentence pages could rank #1 for very competitive search terms. So they introduced PageRank, borrowed from graph theory. PageRank drastically improved both the quality of search results and the performance of search: a search engine can serve the majority of searches using a small subset of its documents. It (may) work like this (a toy sketch follows the list):
- if the search terms aren't specific, look only at pages with higher PageRank (PR4 or higher?)
- if it can't find enough matching pages, search pages with lower PRs as well.
- if the search terms are very specific, search both higher- and lower-PR pages.
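A toy version of that tiered lookup. The PR4 threshold, the "specific query" test, the result floor, and the index layout are all guesses from the description above, not documented Google behavior:

```python
# `index` maps each URL to a (pagerank, text) pair.
def tiered_search(index, query, high_pr=4, min_results=10):
    words = query.lower().split()
    specific = len(words) >= 3  # crude proxy for a "very specific" query

    def matches(text):
        return all(w in text.lower() for w in words)

    # First pass: only high-PR pages.
    high = [url for url, (pr, text) in index.items()
            if pr >= high_pr and matches(text)]
    # Fall through to low-PR pages if the query is specific
    # or the high-PR tier came up short.
    if specific or len(high) < min_results:
        high += [url for url, (pr, text) in index.items()
                 if pr < high_pr and matches(text)]
    return high
```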
That was when Google had a few million pages in its index. Now, with billions of pages in the index, the new heuristic algorithm may work like this (sketched again after the list):
- if the search terms aren't specific, look only at pages from sites with higher SiteRank (SR4 or higher?)
- if it can't find enough matching pages, search pages from sites with lower SiteRanks as well.
- if the search terms are very specific (a higher number of search words), search pages from both higher- and lower-SiteRank sites.
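The same tiers, keyed by the site rather than the page: the only change from the sketch above is looking up a (hypothetical) SiteRank for the page's host instead of the page's own PageRank:

```python
from urllib.parse import urlparse

# `site_rank_of` is a hypothetical hostname -> SiteRank table;
# `index` maps URL -> page text.
def site_tiered_search(index, site_rank_of, query, high_sr=4, min_results=10):
    words = query.lower().split()
    specific = len(words) >= 3  # "more # of search words"

    def matches(text):
        return all(w in text.lower() for w in words)

    def sr(url):
        return site_rank_of.get(urlparse(url).hostname, 0)

    high = [url for url, text in index.items()
            if sr(url) >= high_sr and matches(text)]
    if specific or len(high) < min_results:
        high += [url for url, text in index.items()
                 if sr(url) < high_sr and matches(text)]
    return high
```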
Observations of SiteRank
- If your site gets a lot of traffic from keywords that appear only once in a page (not even in the title or anchor text), your site has a very good SiteRank.
- If your site gets the majority of its traffic from keywords that appear in titles and anchor text, your site has an average or reasonable SiteRank.
- If pages that link to your pages rank higher than your pages, your site has a low SiteRank.
- Sandbox or Google Penalty - a very low SiteRank. Why can adding many garbage words to your search terms turn off the Sandbox? Remember: "if the search terms are very specific (a higher number of search words), search pages from both higher and lower SiteRanks" (see the example after this list).
- Google Ban - a zero SiteRank. Google de-indexes the whole domain, not just a sub-domain or a few directories. The algorithm works at the site level.
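The "garbage words" observation falls out of the toy searcher above: padding the query makes it count as "specific", so pages from low-SiteRank sites re-enter the candidate pool. The URLs and ranks here are invented for illustration:

```python
index = {"http://bigsite.com/a": "blue widget reviews",
         "http://newsite.com/b": "blue widget reviews"}
ranks = {"bigsite.com": 6, "newsite.com": 1}

# Two-word query: not "specific", so the high-SiteRank tier suffices.
print(site_tiered_search(index, ranks, "blue widget", min_results=1))
# Padded, more specific query: both tiers are searched,
# and the low-SiteRank (sandboxed?) site shows up again.
print(site_tiered_search(index, ranks, "blue widget reviews", min_results=1))
```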