11 Search Engine Issues & SEO Mistakes That Could Impact Your Website Rankings (and how to avoid them)

I think you’ll agree with me when I say:

Search engine traffic is essential to building a profitable website.

Studies have long shown that organic search delivers the best ROI to businesses.

The most relevant visitors are the ones most likely to end up on your website via a search engine.

However, search engines will only reward you if you follow their guidelines.

In this article, you’ll find various search engine issues you should be aware of and SEO mistakes you need to avoid.

If you’ve been involved in the search industry for a while, many of these may not be new to you. Nevertheless, you’ll still find tips to safeguard your website rankings.

Exact Match & Keyword-Rich Anchor Texts

According to Moz:

“Anchor text is the visible characters and words that hyperlinks display when linking to another document or location on the web”.

Anchor text matters in SEO because search engines continue to use link relevancy as a ranking factor. This link relevancy can be determined both by the content of the linking page and by the anchor text itself.

An ‘exact match anchor text’ is simply an anchor that exactly matches the keyword you’re targeting.

Example: if “green red roses” was your target keyword, then an exact match anchor would be “green red roses”.

So, how can exact match anchor texts hurt your website rankings?


In the days before Google launched its Penguin algorithm update, you could use 100% exact match anchor text in all your backlinks and still get away with high rankings.

As you would expect, many webmasters over-optimized their anchors in a bid to gain quick rankings. That was until Google came up with the Penguin update, which targeted low-quality, artificial, or spammy link building tactics.

Today, Google can easily determine whether or not a site should be punished just by analyzing anchor text.

For example, if you optimized your on-page content for “green red roses” and 100% of your anchor texts are “green red roses”, you will most likely get penalized because Google’s algorithm can easily tell that:

  1. you’re trying to rank for “green red roses”
  2. you’re building links unnaturally (it is very unlikely that people would naturally link to you with exact match, keyword-rich anchor text 100% of the time)

To avoid getting penalized, you should try to vary your anchor texts and use exact match anchors less than 1% of the time.

Gotch SEO has a massive guide that explains everything you’ll ever need to know about how to use anchor text in a tricky post-Penguin world.
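To get a feel for your own anchor profile, you can tally anchor texts from a backlink export yourself. The sketch below is a minimal illustration in Python, assuming you already have a list of anchor texts; the sample data and the 1% ceiling mirror the rough guideline above, not an official Google threshold:

```python
from collections import Counter

# Hypothetical sample of anchor texts pulled from a backlink report
# (the data and the 1% threshold below are assumptions for illustration).
anchors = [
    "green red roses", "this guide", "example.com", "click here",
    "flower care tips", "green red roses", "https://example.com/roses",
    "read more", "Example Florist", "roses",
]

TARGET_KEYWORD = "green red roses"

counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())
exact_matches = counts[TARGET_KEYWORD.lower()]
ratio = exact_matches / total

print(f"Exact-match anchors: {exact_matches}/{total} ({ratio:.1%})")
if ratio > 0.01:  # the article's suggested ceiling of roughly 1%
    print("Warning: exact-match anchor ratio looks unnaturally high.")
```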


Manipulative Link Acquisition

Manipulative links are links intended to deviously influence a site’s ranking in search engine results.

These are considered a violation of Google’s Webmaster Guidelines and comprise any behavior that manipulates links to your site or outgoing links from your site.

Video: Google Webmaster Guidelines
Manipulative link acquisition continues to be one of the most popular forms of web spam, as many people, in a bid to artificially improve their web visibility, still attempt to exploit the search engines’ use of link popularity in their ranking algorithms.

Because they come in many forms, search engines have often found it difficult to completely rid the web of deceptive link building techniques.

From the official Google Quality Guidelines, the following are examples of manipulative linking which might damage your site’s ranking in search results:

  • Paid links: exchanging money, gifts, or products for links, or paying for sponsored posts that contain links
  • Link networks: networks of sites built purely as link sources to deceptively inflate popularity, otherwise called link farms
  • Reciprocal link exchanges, such as partner pages that exist exclusively for the sake of cross-linking
  • Mass article marketing or guest posting campaigns with keyword-rich anchor text links
  • Use of automated programs or services to create links to your site
  • Text advertisements that pass link juice
  • Advertorials where payment is received for articles that include links that pass link juice
  • Links with optimized anchor text in articles or press releases distributed on other sites
  • Low-quality directory or bookmark site links
  • Keyword-rich, hidden, or low-quality links embedded in widgets that are distributed across various sites
  • Widely distributed links in the footers or templates of various sites
  • Forum comments with optimized links in the post or signature

If you constantly engage in the manipulative activities described above, you might damage your website rankings in search engines or even have your site banned from their index.

The best way, according to Google, is to create unique, useful, and relevant content, and then make that content visible to high-quality, relevant sites that are likely to find it valuable to their readers and link to it.


Duplicate Content

Ever since the Panda algorithm update was first launched, Google has been aggressively fighting to eliminate duplicate content from its search engine results pages.

According to Google’s webmaster guidelines:

[Image: excerpt from Google’s webmaster guidelines on duplicate content]

Duplicate content on websites is something that Google and other search engines frown upon.

From a search engine’s perspective, duplicate content leads to a bad user experience when a visitor sees the same content repeated within a set of search engine results.

Therefore, search engines try hard to index and display pages with distinct information.

If your website contains multiple pages with largely identical information, and search engines suspect you’re trying to manipulate rankings and deceive search users, your site’s rankings might drop or your site might be de-indexed, in which case it will no longer appear in search results.

For eCommerce sites, especially, having a large amount of duplicate content is a serious SEO issue that could spell disaster in search engine rankings.

This guide by Dan Kern at Inflow digs deep into a wide variety of duplicate content issues found on eCommerce websites.

Google provides some steps you can take to address duplicate content issues on your website. These include:

  • using 301 redirects (see the sketch after this list)
  • maintaining consistency in your internal linking
  • using top-level domains to handle country-specific content
  • ensuring that each site on which your content is syndicated includes a link back to your original article
  • using Search Console to tell Google your preferred domain
  • minimizing boilerplate repetition
  • avoiding ‘empty’ placeholder pages, i.e. don’t publish pages for which you don’t yet have real content
  • being familiar with how your CMS handles your content
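For the first item on that list, here is a minimal sketch of a 301 redirect, assuming a Python/Flask application and made-up URLs; if your site runs on Apache or nginx, the equivalent is a redirect rule in the server configuration:

```python
# A minimal sketch of a 301 redirect using Flask (third-party: pip install flask).
# The URLs are hypothetical examples.
from flask import Flask, redirect

app = Flask(__name__)

# Consolidate a duplicate URL onto the canonical one with a permanent (301)
# redirect, so search engines transfer signals to the page you want indexed.
@app.route("/products/green-red-roses-copy")
def duplicate_product_page():
    return redirect("/products/green-red-roses", code=301)

if __name__ == "__main__":
    app.run()
```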


Scraped & Thin Content

If you use content taken from other websites without adding any original content or value, you seriously need to re-think your website strategy.

Google states:

Purely scraped content, even from high-quality sources, may not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases.

Examples of scraping include:

  • copying and republishing content from other sites without adding any original content or value
  • copying content from other websites, modifying it slightly, and then republishing it
  • reproducing content feeds from other sites without providing some type of unique organization or benefit to the user
  • embedding content such as videos, images, or other media from other sites without adding substantial value for the user

If your site does not have enough unique content that differentiates it from other sites on the web, it might suffer in Google’s search rankings.

According to Google, such websites lack substance and provide very little additional value to web users.

Thin content issues are prevalent on affiliate sites, especially those that mainly feature content from affiliate networks – some affiliate programs distribute the same content to several hundred affiliates.

Google gives this example of a thin affiliate site:

“Pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value.”

If you participate in an affiliate program, Google has several recommendations that can help your site stand out and rank well:

  • Offer original product reviews, ratings, and product comparisons
  • Make affiliate program content only a small part of your website’s content
  • Ensure your site adds substantial value beyond simply republishing content available from the original merchant
  • When selecting an affiliate program, choose a product category appropriate for your intended audience
  • Keep your content updated and relevant
  • Use your website to build a community among your users

Bottom line: Focus on unique, relevant content that provides value to your visitors and distinguishes your site from other websites. This will make it more likely to rank well in Google search results.
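If you run affiliate pages, one informal way to sanity-check “added value” is to measure how much of your page simply repeats the merchant’s copy. The sketch below is not a Google method; it just compares 3-word shingles with Jaccard similarity, and the sample texts and 50% threshold are illustrative assumptions:

```python
# Rough estimate of how much an affiliate page mirrors the merchant's description.
def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two shingle sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

merchant_copy = "Handmade custom cigar humidor crafted from Spanish cedar with brass hinges."
my_page = ("Handmade custom cigar humidor crafted from Spanish cedar with brass hinges. "
           "We tested humidity retention for 30 days and compared it with two rival models.")

overlap = jaccard(shingles(merchant_copy), shingles(my_page))
print(f"Overlap with merchant copy: {overlap:.0%}")
if overlap > 0.5:  # illustrative threshold, not an official limit
    print("Most of this page mirrors the merchant description; add original analysis.")
```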


Keyword Stuffing

I know…I’m also amazed that people still engage in keyword stuffing today despite Google having made it clear that irrelevant keywords result in a negative user experience and can harm your site’s ranking.

From Google’s Quality Guidelines:

[Image: excerpt from Google’s quality guidelines on keyword stuffing]

A good keyword stuffing example is repeating the same words or phrases so often that it sounds unnatural.

Like this:

We sell ‘custom cigar humidors’. Our ‘custom cigar humidors’ are handmade. If you’re thinking of buying a ‘custom cigar humidor’, please contact our ‘custom cigar humidor’ specialists at [email protected]

Rather than load your webpage with keywords, focus on creating useful, information-rich content using keywords appropriately as Google advises.
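As a quick sanity check, you can measure how often a phrase dominates your copy. The sketch below runs an informal keyword-density check on the example above; the 3% threshold is an illustrative assumption, not an official limit:

```python
import re

# A quick, informal keyword-density check on the stuffed copy above.
text = ("We sell custom cigar humidors. Our custom cigar humidors are handmade. "
        "If you're thinking of buying a custom cigar humidor, please contact our "
        "custom cigar humidor specialists.")
phrase = "custom cigar humidor"

words = re.findall(r"[a-z']+", text.lower())
phrase_hits = len(re.findall(re.escape(phrase), text.lower()))
density = phrase_hits * len(phrase.split()) / len(words)

print(f"'{phrase}' appears {phrase_hits} times ({density:.0%} of all words)")
if density > 0.03:  # illustrative threshold, not an official "safe" density
    print("This copy reads as keyword-stuffed; rewrite it to sound natural.")
```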


User-Generated Spam

Do you allow user-generated content on your site?

If yes, you need to closely monitor what those users are posting.

Spammy user-generated content can pollute Google search results.

So, if Google detects user-generated spam on your site, you may receive a warning, or Google may take manual action on your whole site if the spam is widespread.

 

Having user-generated spam doesn’t mean your site is low quality.

Google says, “such spam can be generated on good sites by malicious visitors and users.”

In fact, trusted sites like Mozilla have been penalized for spam in the past.

User-generated spam could include:

  • spammy posts on forum threads
  • spammy user profiles
  • spammy blog comments
  • spammy guestbook comments
  • spammy accounts on free hosts

To stay safe, Google recommends that you actively monitor and remove this type of spam from your site, and take preventive measures such as comment moderation, anti-spam tools, and adding the “nofollow” attribute to user-posted links.
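If you accept comments or forum posts, even a crude pre-moderation filter helps. The sketch below is a minimal illustration that holds link-heavy or obviously spammy submissions for manual review; the thresholds and phrase list are assumptions, not a Google specification:

```python
import re

# Very simple moderation heuristic: hold comments with many links or known
# spam phrases for manual review (thresholds and phrases are illustrative).
SPAM_PHRASES = {"buy now", "cheap pills", "work from home", "casino"}
MAX_LINKS = 2

def looks_spammy(comment: str) -> bool:
    links = len(re.findall(r"https?://", comment, flags=re.IGNORECASE))
    text = comment.lower()
    return links > MAX_LINKS or any(p in text for p in SPAM_PHRASES)

comments = [
    "Great post, the anchor text section really helped me.",
    "Buy now!!! http://spam.example http://spam.example http://spam.example",
]
for c in comments:
    print(("HOLD FOR REVIEW" if looks_spammy(c) else "PUBLISH"), "-", c[:50])
```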


Use of Automatically Generated Content

There are many software tools on the internet that can help you put your site on auto-pilot mode.

Such tools may use feeds to auto-generate content, and some can even stitch content from different websites to form new content.

A good example of this is a job board where the job listings are auto-generated using a script that pulls information from other sites around the internet.

If Google determines that the content on that site is not original and does not add significant value, then it likely won’t rank well in Google results.

Other examples include sites made up entirely of feed displays, affiliate product feeds, and product descriptions generated from other websites.

From Google’s perspective, if a searcher lands on an auto-generated page that has very little value and doesn’t find what they’re looking for, that results in a bad user experience, which is why Google would take action against such sites.

 

Google has several examples of auto-generated content:

  • Text translated by an automated tool without human review or curation before publishing
  • Text generated through automated processes, such as Markov chains
  • Text generated using automated synonymizing or obfuscation techniques
  • Text generated from scraping Atom/RSS feeds or search results
  • Stitching or combining content from different web pages without adding sufficient value


Is Your Site Fast Enough?

Everyone loves fast sites.

So do search engines.

In fact, Google considers site speed vital to a good user experience.

Since 2010, Google has used site speed as a signal in its ranking algorithms (see the official announcement, “Using site speed in web search ranking”).

A Google patent published in February 2014 details how web pages that load faster can receive a ranking boost while slower pages can be demoted:

“A search result for a resource having a short load time relative to resources having longer load times can be promoted in a presentation order, and search results for the resources having longer load times can be demoted.”

So, while site speed may not carry as much weight as other factors like page relevancy, a slow page speed means search engines will spend more resources to crawl your pages, which could negatively impact your indexation.

Page speed is also crucial to user experience. And we all know how Google rates user experience highly.

Pages that take too long to load tend to have higher bounce rates and lower average time on page.

Therefore, your best bet is to ensure your website loads fast in all countries, across all devices and browsers.
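For a quick spot check, you can time how fast your server returns the HTML. The sketch below only measures server response time, not full page rendering, so treat it as a first signal and use a tool like Google PageSpeed Insights for a full audit; the URL and the one-second threshold are illustrative assumptions:

```python
import time

import requests  # third-party: pip install requests

# Rough speed check: measures how long the server takes to return the HTML.
URL = "https://example.com/"

start = time.perf_counter()
response = requests.get(URL, timeout=10)
elapsed = time.perf_counter() - start

print(f"{URL} responded with {response.status_code} in {elapsed:.2f}s")
if elapsed > 1.0:  # illustrative threshold
    print("Server response is slow; investigate caching, compression, and hosting.")
```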


Non-Unique Title Tags and Meta Descriptions

Title and description tags are the most important meta tags recognized by Google.

Within SEO circles, title tags are often considered the most important on-page element.

They are a strong relevancy signal for search engines because they tell them what your web page is really about.

So, while it’s important to include your keyword in the title, it’s equally vital that each of your pages has a unique title.

With duplicate titles, search engines will have trouble determining which of your website’s pages is relevant to a particular search query.

Therefore, pages with duplicate titles have a lower chance of ranking high and might even hurt the rankings of other pages on your website.

According to Matt Cutts, the former head of Google’s webspam team, it is better to have unique meta descriptions, or even no meta descriptions at all, than to have duplicate meta descriptions across your pages.

Even though meta descriptions are believed to have little or no SEO value, it’s still vital to have unique meta descriptions (if you choose to include them).

The more original, unique, and compelling your meta descriptions are, the more likely they are to lead to higher click-through rates.

Duplicate meta descriptions might also confuse users and lead to a bad user experience, which might in turn impact your rankings.
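A simple way to catch duplicates is to crawl a list of your URLs and compare their titles and meta descriptions. The sketch below assumes a small, hypothetical URL list and uses the requests and BeautifulSoup libraries; in practice, a dedicated crawler or a CMS export would supply the full list:

```python
from collections import defaultdict

import requests  # third-party: pip install requests beautifulsoup4
from bs4 import BeautifulSoup

# Flag duplicate <title> tags and meta descriptions across a small URL list.
URLS = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contact",
]

titles, descriptions = defaultdict(list), defaultdict(list)
for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta["content"].strip() if meta and meta.get("content") else ""
    titles[title].append(url)
    descriptions[desc].append(url)

for label, bucket in (("title", titles), ("meta description", descriptions)):
    for value, urls in bucket.items():
        if value and len(urls) > 1:
            print(f"Duplicate {label} '{value}' on: {', '.join(urls)}")
```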

Google Will Take Further Action Against Sites That Repeatedly Violate Its Webmaster Guidelines

Do you violate Google’s webmaster guidelines over and over again?

Google may have stiffer penalties for you.

In a recent post on the Google Webmaster blog, the Google Search Quality team said Google may take ‘further action’ against sites that repeatedly violate its webmaster guidelines.

Google will also make “a successful reconsideration process more difficult to achieve” for such websites.

The image below shows an example as given in that post:

[Image: example from the Google Webmaster blog post on repeat violations]

Check the Google Webmaster blog to learn more about what Google has in store for such violations.

Avoiding Google Penalties

As a website owner who relies on search engine traffic, you always want to avoid getting a Google penalty.

You therefore need to make sure you aren’t breaking search engine rules.

To help you stay on Google’s safe side, Neil Patel at Quick Sprout not long ago created a gem-packed infographic that shows you what you should and shouldn’t do.

Below is a quick summary of the guidelines recommended in that infographic.

Avoid:

  • link farms or link networks
  • use of hidden text
  • use of unauthorized computerized and automated programs for page and site submissions
  • heavy use of keywords that are not in the context of the web page
  • duplicate or thin content
  • keyword-rich anchor text

Do:

  • use guest posting sparingly; keep guest post links below 20% of your backlink profile and add contextual links in your guest posts
  • when building backlinks, focus on the relevance of the linking domain
  • consider using the “nofollow” tag for paid links (see the sketch after this list)
  • diversify your anchor texts to replicate a natural link profile
  • build brand signals
  • build real social signals and avoid fake followers
  • focus on improving your site’s trust
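As an illustration of the “nofollow” recommendation above, the sketch below marks links to known paid or affiliate domains with rel="nofollow" before publishing, using BeautifulSoup; the domain list and HTML are made-up examples:

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# Hypothetical list of domains you link to for payment or affiliate commissions.
PAID_DOMAINS = {"sponsor.example", "affiliate-network.example"}

html = '<p>Try <a href="https://sponsor.example/deal">this product</a> today.</p>'
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    if any(domain in link["href"] for domain in PAID_DOMAINS):
        link["rel"] = "nofollow"  # tells search engines not to pass ranking credit

print(soup)
```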

Here’s a link to that infographic: How to Avoid a Google Penalty

Now it’s Your Turn

You’ve just seen several search engine issues you need to be aware of and guidelines to help you avoid Google penalties.

Now it’s time to hear from you.

How do you adhere to search engine guidelines?

Do you make any SEO mistakes?

Your comments and social shares are highly welcome.

Need SEO Services?

Want to gain more qualified traffic from Google and other search engines?

Learn how our SEO services can skyrocket your business


Norbert Juma

Norbert is a Kenyan SEO and online marketing pro. He has a background in computer science and owns several websites, including BiasharaOnline.co.ke.
