I love collecting lists of the top marketing tools. Below are some of my favorites.
- Create a site with unique, rich content.
- Ensure that both content and page markup are well-formatted and have no errors.
- Google will naturally favor bigger brands, and push them to the top of search results.
- This means that site branding is more important than ever.
- The ‘keywords’ meta tag is never used by Google for determining search ranking.
- Have 5-10 specific keywords or keyword phrases been identified to increase targeted traffic?
- Have keywords been analyzed for search popularity?
- Have selected keywords been associated with specific pages on the site?
- Have selected keywords been inserted in content with 7% keyword density?
- Are keywords leveraged in URL folders and file names, including image files?
- Is there a keyword or key phrase in the URL?
- Is there a keyword or key phrase in the domain name?
- Is there a keyword or key phrase in the title tag?
- Is there a keyword or key phrase in the meta description tag?
- Is there a keyword or key phrase in the meta keywords tag?
- Is there a keyword or key phrase in the H1, H2, and H3 tags?
- Is there a keyword or key phrase in bold and strong tags?
- Is a keyword or key phrase used in the content?
- Is the keyword or key phrase prominent in the webpage code?
- Is there a keyword or key phrase in the alt text for images?
- Is a keyword or key phrase used in anchor text?
- Are hyphens used to separate keywords in URLs?
- Are keyword density and keyword prominence employed?
- Does the site properly use a robots.txt file?
- Does the site have an XML-based Sitemap deployed?
- Does the site have a text-based, standard sitemap?
- Does the site avoid serving duplicate content on multiple URLs?
- Are 301 redirects in place for content residing in alternate URL domains?
- Does the site limit user input forms required to access content?
- Are Query-string URLs avoided throughout the site?
- Does the site consist of 5 or fewer subdirectory levels?
- Do site URLs generally avoid parameter strings in excess of 4 variables?
- Is Flash-only content avoided?
- Is there a good tree-like internal navigation structure?
- Does the HTML sitemap link to all pages?
- Are there any RSS Feeds?
- Have we validated webpage markup against W3C standards?
- Are we compressing web page file sizes?
- Do we avoid excessive use of JavaScript?
- Are we using CSS for styling?
- Are we using external CSS and JavaScript files?
- Are we avoiding long URLs?
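Several of the on-page checks above (keyword in the title tag, keyword in the H1, keyword density) can be scripted. A minimal sketch in Python using only the standard library; the class and helper names are illustrative, not part of any real SEO tool, and the sample HTML is invented:

```python
import re
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the title text, H1 text, and all words from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._stack = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack:
            if self._stack[-1] == "title":
                self.title += data
            elif self._stack[-1] == "h1":
                self.h1 += data
        self.words += re.findall(r"[a-z0-9']+", data.lower())

def keyword_density(words, keyword):
    """Fraction of words matching the keyword (the checklist's target is ~7%)."""
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Invented sample page for illustration:
html = ("<html><head><title>Blue Widgets Shop</title></head>"
        "<body><h1>Blue Widgets</h1>"
        "<p>Buy blue widgets here. Widgets ship free.</p></body></html>")
checker = OnPageChecker()
checker.feed(html)
print("widgets" in checker.title.lower())  # keyword in title tag?
print("widgets" in checker.h1.lower())     # keyword in H1?
print(round(keyword_density(checker.words, "widgets"), 2))
```

A real checker would also inspect the meta description, alt text, and anchor text, but the pattern is the same: parse, collect, compare against the checklist item.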
Off-Page Ranking Factors
- What’s the age of the page the backlink is coming from?
- Is there frequent change in the anchor text?
- How popular is the referring page?
- What’s the position of the link on the page?
- Is the site listed in Dmoz.org?
- Is the site in the Yahoo Directory?
- Is the site listed in Wikipedia?
- Is the site listed in other reputable directories?
- Have we submitted the site to classified directories?
- Are we getting backlinks from social media sites?
- Have we submitted articles to article directories frequently?
- Have we submitted press releases frequently?
- Are we increasing site and web page traffic?
- Do we have a method for increasing CTR?
- Do we have a method for decreasing the site’s bounce rate?
- Do we have a method for increasing time spent on page and site?
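The engagement metrics in the last three items are simple ratios. A sketch of the arithmetic (the traffic numbers are invented for illustration):

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

def bounce_rate(single_page_sessions, total_sessions):
    """Share of sessions that viewed only one page before leaving."""
    return single_page_sessions / total_sessions if total_sessions else 0.0

# Hypothetical numbers for illustration:
print(f"CTR: {ctr(120, 4000):.1%}")                  # 120 clicks on 4000 impressions
print(f"Bounce rate: {bounce_rate(300, 500):.1%}")   # 300 of 500 sessions bounced
```

Analytics tools report these directly, but knowing the definitions makes it clear what "decreasing bounce rate" actually asks you to change.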
Because Google penalizes sites for being low-content gateway or thin referral sites (see the -30 penalty below), every site must be a destination site that retains users for as long as possible. There are a number of common strategies for adding value to a site and preventing it from being marked as a gateway or referral site, though the most important thing is always to add content and retain users for longer. Some common strategies include:
- Adding a blog to the site. Rumor has it that Google loves blogs, but the blog must have regular content updates – just its presence won’t help!
- Writing content for humans, not search engines.
There are a number of additional theories circulating that have not been verified. These are nothing more than ideas, but they may be valuable.
- Every page should be content-rich; any pages that are not will only detract from a site’s PageRank.
- Each page should have a single focus. If a page has multiple targets or foci, it should be split into separate, more focused pages.
These penalties are speculative, and the penalty numbers are not an exact measure of the ranking change involved. A penalty of -N means that a site will drop approximately N positions in Google’s returned results, equivalent to falling N/10 pages from the front page.
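The position-to-page arithmetic can be made concrete. A sketch, assuming ten results per page (the function names are illustrative):

```python
import math

def penalized_position(current_position, penalty):
    """New search position after a -N penalty: drop roughly N places."""
    return current_position + penalty

def result_page(position, results_per_page=10):
    """1-indexed results page on which a given position appears."""
    return math.ceil(position / results_per_page)

# A site ranked #3 hit by the -30 penalty:
pos = penalized_position(3, 30)
print(pos, result_page(pos))  # position 33, i.e. page 4
```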
| | Trigger | Penalty | Remedy |
|---|---|---|---|
| High | Site is overly optimized for a certain keyword | -6 | Do not overly stress a single keyword. |
| Medium | Backlinks to the site have over-optimized or uniform anchor text | -50 | Use natural language, vary your backlink anchor text, and use ‘nofollow’ where appropriate to prevent Google from crawling a link. |
| Medium | Site appears to be involved in a link-farming scheme | -60 | Avoid site-wide links, links from ‘bad neighborhoods’, paid links, and links from low-quality directory sites while building quality links with varied anchor text. |
| Low | Site appears to be a spam site (a meta-penalty taking into account all the above penalties and more) | -950 | Avoid the other penalties while maintaining a content-rich, clean site with good HTML. Avoid over-optimization. |
| Low | Site is hacked or is purely a spam or scam site | delisted | Failure. Start over. |
Each penalty below lists its type, when it appeared, the details, and actions you can take.

Google Vince Update (March 2009)
A Googler named Vince created this change, hence the name. This is not a penalty but an update to Google’s algorithm. The Vince update seems to favor bigger brands and has pushed some of these big-name sites further up the rankings. Google’s explanation is that it is more about factoring trust into the algorithm for more generic queries. From what Matt Cutts has said, this update probably looks at the overall weight and trust of a site (and the big brands have spent enough marketing pounds to win here) and the theme of the site.
Actions you can take: Work on site awareness and brand promotion in addition to traditional SEO.

Google -6 Penalty (Late 2006)
Google didn’t admit doing this to sites, per Matt Cutts. One possible trigger is that many of the affected sites had highly optimized pages tightly focused on a single core phrase or keyword. Google now argues that the effect was caused by a glitch in the system and that an attempt to filter out bad sites had caught good sites in the process. Most sites should get their original rankings back soon.
Actions you can take: Do not overly stress a single keyword.

Google -30 Penalty (Introduced in late 2006; aggressively enforced from mid-2009)
A penalty widely speculated to be applied to thin affiliate, referral, or doorway sites that do not add much value for visitors. However, many non-affiliate sites have also reported this penalty. Sites with excessive low-quality inbound or outbound links and lots of non-unique content may have a minus-30 ranking penalty applied.
Syndrome: your well-ranked keywords (first page) suddenly drop 30 positions.
Some of the practices below may help trigger the -30 filter:
- Guestbook spamming: if you try to get inbound links by spamming guest books and blogs, Google might apply the filter to your website.
- Doorway pages: Google doesn’t like doorway pages. If you must use special landing pages for PPC and other ads, make sure these pages cannot be spidered by Google and other search engines; you can use robots.txt to do that.
Actions you can take: The only way to avoid this penalty is to have unique content on your site, get links from well-trusted sites, and link to high-quality sites. For detailed information, refer to Google’s Webmaster Guidelines. Clean up the site first, then submit a reconsideration request to Google.

Google -50 Penalty (September 2009)
Over-optimized anchor text in link building. This is the most recent penalty and has generated discussion among webmaster and SEO sites.
Syndrome: your well-ranked keywords (first page) suddenly drop 50+ positions.
Actions you can take: De-optimize the anchor text of your backlinks. Use natural language, not keywords screaming SEO. If you’re link building, make sure your anchor text varies on each site that links to you; you do not need keywords stuffed into every single link. Use “nofollow” at times. Clean up the bad practice first, then submit a reconsideration request to Google.

Google -60 Penalty (Mid-2008)
Bad backlinking practice: spam backlinking or potential link farming. It looks like Google applies this penalty to websites that buy links. Many of the websites that seem to have been penalized had many inbound links from websites that linked to them from every single page (so-called site-wide links). Site-wide links are an indicator of paid links, which Google sees as an unwanted way to artificially inflate search engine rankings. The head of Google’s anti-spam team, Matt Cutts, has often said that websites that buy paid links will be penalized, and it looks as if Google is trying to do the job properly. If this penalty for paid links really exists, then even websites that follow Google’s rules can get in trouble: your competitors could harm your website simply by buying links or by creating mini-net websites with site-wide links to your website.
Syndrome: keyword rank drops 60 positions.
Actions you can take: Avoid site-wide linking. Sever links from bad neighborhoods, low-quality directory sites, and link farms. Avoid paid links. Build quality links from relevant and well-trusted sites, and use varied, descriptive anchor text on links that point back to your site.

Google -950 Penalty (January 2007)
The Spam Penalty, or Over-Optimization Penalty: a much-dreaded drop of 950 positions for a site or keyword. Triggers include spam linking, spam documentation, content duplication, and sloppy HTML that generates many validation errors. Overall, the -950 penalty is Google’s means of discouraging webmasters from engaging in any kind of spam activity, subtly directing them toward ideal SEO. Speculation: it is possibly related to the spam detection patent invented by Googler Anna Lynn Patterson.
Actions you can take: Stop link farming. Provide unique, value-adding content. Clean up first, then submit a reconsideration request to Google.

Delisted by Google
A hacked or pure spam site will be delisted by Google, meaning the site will be excluded from search results. Some of the proven reasons why a site gets delisted:
1) Repeated spelling and syntactical errors. If your website repeatedly contains a particular misspelled word, or is primarily made up of junk content (such as computer-generated content), you are at high risk of being delisted from Google search.
2) Adding a large number of external links in a short time. One possible scenario is that your server is hacked and spammers add lots of links to your website without your knowledge. Most of these links are hidden; you won’t see them unless you study the source code. Another possible scenario is being too active in link exchange. Take link directories, for example: most will offer an option for you to link back to them. If you spend one whole day exchanging links with 200 link directories, your website is at risk.
3) Sitemap errors.
4) Hidden links and hidden text. Excessive use of either can get your website delisted from Google; it amounts to cloaking.
5) Doorway pages that redirect visitors without their knowledge use some form of cloaking. This is against Google’s principle: “Don’t deceive your users or present different content to search engines than you display to users.”
Actions you can take: You have violated virtually every Google webmaster guideline. Start from scratch and rebuild your site. Clean up first, then submit a re-inclusion request to Google.
- Link and backlink quality matters more than link quantity. Site quality is determined by Google’s PageRank (see below).
- Abrupt changes in link/backlink count can actually damage a site’s PageRank.
- Excessive linking between two websites can damage the PageRank of both websites.
- Backlinks that look ‘generated’ (as if the site were involved in a link farming scheme) can damage a site’s PageRank. For example, thousands of backlinks with the same anchor text can look suspicious.
- It is believed that links from relevant sites are more heavily weighted than those from irrelevant sites.
- Links that are in larger or bolder font are weighted more heavily than those in smaller or regular fonts.
- Anchor text is associated not only with the page on which it is found, but also the page at which it points.
- This would lend credence to the idea that anchor text should be ‘natural’ and relevant to the content of the site in question.
- Is text-based navigation used throughout the site?
- Does the Link anchor text contain keywords?
- Do most inbound links refer to canonical URL versions?
- Does the site leverage header, footer, or breadcrumb navigation links?
- Are backlinks from quality web pages?
- Are backlinks from relevant sites?
- Are backlinks from pages with high PageRank?
- Do backlinks have keywords in their anchor text?
- Are there backlinks from dofollow sites?
- Are there backlinks from different IP addresses?
- Are there backlinks from big or branded sites?
- Have all links been validated?
- Are there fewer than 100 links on a web page?
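Some of the link checks above (fewer than 100 links per page, reviewing anchor text for keywords) can be automated. A rough sketch using only Python’s standard library; the class name and sample HTML are invented for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects (href, anchor text) pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []      # list of [href, anchor_text]
        self._in_a = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            self.links.append([href, ""])
            self._in_a = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_a = False

    def handle_data(self, data):
        if self._in_a and self.links:
            self.links[-1][1] += data

# Invented sample page:
html = ('<p><a href="/widgets">blue widgets</a> and '
        '<a href="/about">about us</a></p>')
collector = LinkCollector()
collector.feed(html)
print(len(collector.links) < 100)             # under the 100-link guideline?
print([text for _, text in collector.links])  # anchor texts to review
```

The same collector could feed a backlink audit: fetch each referring page, collect its links, and inspect the anchor text pointing at your site.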
- MajesticSEO, a free backlinks and anchor text database for analyzing site linking.
- Google webmaster tools, a site for managing Google’s crawling of your site and reviewing feedback from Google.
- Google Analytics, an analytics site with a wealth of information about your site’s traffic patterns and more.
- ComScore, a provider of digital market intelligence and measurement.
- Xenu’s Link Sleuth, a program for checking a website for broken links and generating link reports.
Articles and Sources
- Google webmaster guidelines, the first place to start, always.
- Matt Cutts’ blog, a Google engineer who blogs about SEO.
- A brief overview of Googlebot, which is used by Google to index the internet.
This is a semi-technical description of PageRank and how it is calculated. If you want to skip this section, just read about the consequences below.
PageRank is calculated with the following formula: PR(A) = (1 − d) + d(PR(T1) / C(T1) + … + PR(Tn) / C(Tn)), where PR(A) is the PageRank of page A, T1…Tn are the pages linking to A, C(T) is the number of links going out of page T, and d is a damping factor. The damping factor reduces the value of all links by a set amount, modeling a user who periodically jumps directly to a new page without following a link. It is commonly assumed that d = 0.85 in practice.
This has a number of consequences:
- A page’s quality determines the value of its links (higher quality = more value per link).
- A page’s number of links determines the value of each of its links (more links = less value per link).
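The formula can be computed by fixed-point iteration: start every page at PR = 1 and repeatedly apply the formula until the values settle. A sketch on an invented three-page graph, using the assumed d = 0.85:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively applies PR(A) = (1 - d) + d * sum(PR(T) / C(T))
    over the pages T that link to A, where C(T) is T's outlink count."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}
    for _ in range(iterations):
        new = {}
        for page in pages:
            inbound = (pr[t] / len(links[t]) for t in pages if page in links[t])
            new[page] = (1 - d) + d * sum(inbound)
        pr = new
    return pr

# Toy graph: A links to B and C, B links to C, C links back to A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(links)
print({p: round(r, 3) for p, r in sorted(ranks.items())})
```

Note how the consequences above show up in the result: B ranks lowest because its only inbound link comes from a page (A) that splits its vote two ways, while C collects a full vote from B plus half of A’s.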