Is Your SEO Strategy Crippling Your Site Traffic?
By Christine | Last Updated: April 8, 2016 | 6 min read
Besides knowing how to improve your search engine optimization, every company needs to know which SEO practices to avoid. Some practices that were popular in the past can now trigger penalties that harm your site’s search rankings for weeks, months, or even years.
This can happen accidentally: an average website owner can easily make a mistake when optimizing pages, structuring site architecture, acquiring links, or marking up data. The content on your site needs to be relevant to your business and your customers, and you should not try to gamble with Google’s algorithms.
What follows are several SEO practices which you should avoid if you are concerned with your site’s search rankings.
Thin or Shallow Content
Google launched its Panda algorithm update in 2011, and that is when search engines started to penalize sites whose content was too thin or shallow. Such content is typically poorly written, lacks supporting references, or focuses on a topic irrelevant to the site’s audience or field of expertise. Search engines favor sites with high-quality, original content that is relevant to their target audiences and provides a positive user experience.
Nowadays, even a small amount of shallow or thin content can affect a site’s ranking. To avoid this, make sure all of your site’s content is properly researched, skillfully edited, original rather than copied, and related to your field of expertise. Do not obsess over squeezing in the right keywords; search engines do not reward effort for its own sake, and providing original content that is valuable to your visitors matters far more.
A writer or content manager who is up to date with SEO should be able to create content for your site that avoids thin-content penalties. You can take a look at the case studies page of Neil Patel’s online marketing agency to see how a well-executed SEO strategy can benefit a website or a business.
Duplicate Content
Matt Cutts, head of Google’s webspam team, says that at least 25 percent of all content on the web is duplicated. Duplicate content hurts SEO in several ways, all stemming from search engines’ efforts to determine which version of the content to index and rank. It can cause a drop in traffic and less relevant search results.
To keep duplicate content penalties from affecting your site, make sure that each piece of content is always indexed under the same URL, and that any other URL serving the same page carries a canonical tag. When multiple pages display the same content, a canonical tag placed in the head of each duplicate points search engines to the preferred version, signaling that it should take priority over the others.
This system is ideal for ecommerce websites, which often automatically create multiple versions of the same page for a certain product: .com/tshirt, /tshirt-blue, /tshirt-red, etc. In this case the canonical tag tells search engines that the /tshirt page is the original one and should be prioritized in search results.

Another option for dealing with duplicate content is setting up a 301 redirect, which sends users directly to the “parent” page no matter which version they open. According to Moz, pages reached through 301 redirects do not lose much link juice; Moz researchers concluded that they retain 91–99% of the link equity a page without redirects would get. Whenever content moves, set up a 301 redirect from the old URL to the new page, so that their combined ranking signals work together to improve your search ranking.
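As a rough sketch, a canonical tag is a single line in the head of each duplicate page. The domain below is a hypothetical placeholder; the /tshirt paths follow the example above:

```html
<!-- Placed in the <head> of the duplicate pages (e.g. /tshirt-blue, /tshirt-red).
     It tells search engines that /tshirt is the preferred version to index.
     Replace example.com with your own domain. -->
<link rel="canonical" href="https://www.example.com/tshirt" />
```

The preferred page itself can also carry a self-referencing canonical tag, which guards against stray URL parameters creating accidental duplicates.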
Unnatural Links
In previous years, a typical SEO practice was to solicit inbound links with keyword-rich anchor text from a variety of sites, blogs, and forums. The purpose was to exploit the ranking factor called PageRank, which estimates a page’s value from the number and quality of its inbound links.
But some underhanded SEO agencies gamed this ranking factor by buying links or spamming blogs and forums to artificially inflate rankings. In 2012, Google launched the Penguin algorithm update to penalize sites for this black-hat practice.
Some sites still suffer from this problem for various reasons. Maybe a member of your team with outdated SEO knowledge is unknowingly adding spammy links, or a link-building package bought years ago created unnatural links that are still active. It is also possible that someone has hacked your site to add links to a site they want to promote, or that a competitor is running a negative SEO campaign against you.
To protect yourself from unnatural links, use the reports in Google Webmaster Tools to monitor the quantity and quality of links to your site. Review your links at least once per month, watching for warning signs such as a sharp, unexpected drop in organic traffic or a “manual action” message in Webmaster Tools.
If either of these occurs, hire an SEO expert who specializes in recovering from search engine penalties caused by unnatural links. Hiring an expert is important because, otherwise, the recovery process can drag on for months.
Overloading on anchor text links
An anchor text link is a particular keyword or phrase on your site that is hyperlinked to a URL. For example, the phrase “great website about cats” might link to TheCatSite.com.
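In HTML, that example looks something like the following; the visible phrase is the anchor text, and the href is the destination URL:

```html
<!-- The anchor text is the clickable phrase between the tags;
     search engines use it as a signal about the linked page's topic. -->
<a href="https://www.thecatsite.com/">great website about cats</a>
```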
Although keyword links can be good for SEO, you need to use them cautiously. There was a time when webmasters could build thousands of anchor text links containing a certain keyword and rank for that keyword as a result, but those days are gone.
Your anchor text distribution needs to look natural if you want to build the best rankings. In other words, if you have five thousand keyword links for the one phrase you want to rank for and none for any other phrase, Google will become suspicious. The solution is to build keyword links for a variety of related phrases. So, if you want to rank for “I want to buy a cat,” your anchor text links should be something like:
I want to buy a cat
Tips to buy a cat
Owning a cat
Cute breeds of cats
How much do cats cost
Smartest cats to buy
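In markup, the varied phrases above might all point to the same page; the domain and path here are hypothetical placeholders:

```html
<!-- A natural-looking link profile: different anchor phrases
     pointing at the same target page, rather than one exact-match
     phrase repeated thousands of times. -->
<a href="https://www.example.com/buy-a-cat">I want to buy a cat</a>
<a href="https://www.example.com/buy-a-cat">tips to buy a cat</a>
<a href="https://www.example.com/buy-a-cat">how much do cats cost</a>
```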
All of these phrases relate to what customers may be interested in when they search “I want to buy a cat,” but each one says something slightly different. Make sure you do not build too many links for a single phrase, because that pattern does not look natural. Keywords today are built not for search engines, but for people and user intent.
Analysis is Safety
As even the best SEO intentions can prove very costly, you should have people on your team whose job is to monitor the evolution of search engines and apply the insights they gather. A big-picture perspective combined with an understanding of the details is crucial for creating a safe and prosperous strategy.
Bio: Marcus Jensen is an IT professional. He is an Editor-in-Chief of technivorz.com, and writes about technology, business and marketing.