I think you’ll agree with me when I say:
Search engine traffic is essential to building a profitable website.
Studies have long shown that organic search delivers the best ROI to businesses.
The most relevant people are more likely to end up on your website via a search engine.
However, search engines will only reward you if you follow their guidelines.
In this article, you’ll find various search engine issues you should be aware of and SEO mistakes you need to avoid.
If you’ve been involved in the search industry for a while, many of these may not be new to you. Nevertheless, you’ll still find tips to safeguard your website rankings.
Exact Match & Keyword-Rich Anchor Texts
According to Moz:
“Anchor text is the visible characters and words that hyperlinks display when linking to another document or location on the web”.
Anchor texts always matter in SEO because search engines continue to use link relevancy as a ranking factor. This link relevancy can be determined by both the content of the linking page and the content of the anchor text.
An ‘exact match anchor text’ is simply an exact match of whatever keyword you’re targeting.
Example: if “green red roses” was your target keyword, then an exact match anchor would be “green red roses”.
So, how can exact match anchor texts hurt your website rankings?
In the days before Google launched its Penguin Algorithm update, you could literally use 100% exact match anchor texts in all your backlinks and you would get away with high rankings.
As you would expect, many webmasters over-optimized their anchors in a bid to gain quick rankings. That was until Google rolled out the Penguin update, which targeted low-quality, artificial, or spammy link building tactics.
Today, Google can easily determine whether or not a site should be punished just by analyzing anchor text.
For example, if you optimized your on-page content for “green red roses” and 100% of your anchor texts are “green red roses”, you will most likely get penalized because Google’s algorithm can easily tell that:
- you’re trying to rank for “green red roses”
- you’re building links unnaturally (it is very unlikely that people would naturally link to you with exact match keyword-rich anchor text 100% of the time)
To avoid getting penalized, you should try to vary your anchor texts and use exact match anchors less than 1% of the time.
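To see what "varying your anchor texts" looks like in practice, here is a minimal Python sketch that computes the share each anchor text holds in a backlink profile. The backlink data and the "Example Florist" brand are invented for illustration; the 60/20/19/1 split is just one plausible natural-looking mix, not a prescribed formula:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

# Hypothetical 100-link profile for a site targeting "green red roses".
backlinks = (
    ["Example Florist"] * 60        # branded anchors
    + ["https://example.com"] * 20  # naked URLs
    + ["this guide to roses"] * 19  # natural partial-match phrases
    + ["green red roses"] * 1       # exact match kept rare
)

shares = anchor_distribution(backlinks)
exact_share = shares.get("green red roses", 0.0)
print(f"Exact-match share: {exact_share:.0%}")  # 1%
```

Running the same calculation over your real backlink export makes it easy to spot when one exact-match phrase is creeping toward a suspicious share of your profile.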
Gotch SEO has a massive guide that explains everything you’ll ever need to know about how to use anchor text in a tricky post-Penguin world.
- 5 SEO Mistakes That Even Experts Miss (By Neil Patel at Quick Sprout)
- Post-Penguin Anchor Text Case Study (By Court Tuttle at Moz)
- How to Avoid Over-Optimizing Your Website (By Neil Patel at KISSMetrics)
- What Is a “Good Link Profile” and How Do You Get One?
Manipulative Link Acquisition
Manipulative links are links intended to deviously influence a site’s ranking in search engine results.
These are considered a violation of Google’s Webmaster Guidelines and comprise any behavior that manipulates links to your site or outgoing links from your site.
Because they come in many forms, search engines have often found it difficult to completely rid the web of deceptive link building techniques.
From the official Google Quality Guidelines, the following are examples of manipulative linking which might damage your site’s ranking in search results:
- Paid Links: exchanging money, gifts, or products for links, including sponsored posts that contain links
- Link Networks: networks of sites built purely as link sources to deceptively inflate popularity; otherwise known as link farms
- Reciprocal link exchanges, such as partner pages created exclusively for the sake of cross-linking
- Mass article marketing or guest posting campaigns with keyword-rich anchor text links
- Use of automated programs or services to create links to your site
- Text advertisements that pass link juice
- Advertorials where payment is received for articles that include links that pass link juice
- Links with optimized anchor text in articles or press releases distributed on other sites.
- Low-quality directory or bookmark site links
- Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites
- Widely distributed links in the footers or templates of various sites
- Forum comments with optimized links in the post or signature
If you constantly engage in the manipulative activities described above, you might damage your website rankings in search engines or even have your site banned from their index.
The best way, according to Google, is to create unique, useful, and relevant content, and then make that content visible to high-quality, relevant sites that are likely to find it valuable to their readers and link to it.
Duplicate Content
Ever since the Panda algorithm update was first launched, Google has been aggressively fighting to eliminate duplicate content from its search engine results pages.
According to Google’s webmaster guidelines:
Duplicate content on websites is something that Google and other search engines frown upon.
From a search engine’s perspective, duplicate content leads to a bad user experience when a visitor sees the same content repeated within a set of search engine results.
Therefore, search engines try hard to index and display pages with distinct information.
If your website contains multiple pages with largely identical information, and search engines think you’re trying to manipulate rankings and deceive searchers, your site’s rankings might drop, or your site might be de-indexed, in which case it will no longer appear in search results.
For eCommerce sites, especially, having a large amount of duplicate content is a serious SEO issue that could spell disaster in search engine rankings.
This guide by Dan Kern at Inflow digs deep into a wide variety of duplicate content issues found on eCommerce websites.
Google provides some steps you can take to address duplicate content issues on your website. These include:
- using 301 redirects
- maintaining consistency in your internal linking
- using top-level domains to handle country-specific content
- ensuring that each site on which your content is syndicated includes a link back to your original article
- using Search Console to tell Google your preferred domain
- minimizing boilerplate repetitions
- avoiding ‘empty’ placeholder pages, i.e., don’t publish pages for which you don’t yet have real content
- being familiar with how your CMS handles your content
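Several of these steps boil down to one idea: every piece of content should live at exactly one URL. As a rough illustration (not an official Google tool), here is a Python sketch that normalizes URLs so internal links always point at a single version. The tracking-parameter list and the https/no-www policy are assumptions you would adapt to your own site:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters that spawn duplicate indexable URLs (assumed list; adapt it).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def canonicalize(url):
    """Map every variant of a URL to one canonical form:
    https scheme, no www, no trailing slash, no tracking parameters."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunsplit(("https", netloc, path, urlencode(kept), ""))

# Two variants of the same page collapse to a single internal-link target:
print(canonicalize("http://WWW.Example.com/blog/?utm_source=newsletter"))
print(canonicalize("https://example.com/blog"))
```

Both calls print the same canonical URL, which is exactly the consistency Google asks for in your internal linking; on the server side, a 301 redirect from each variant to the canonical URL completes the picture.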
- Google’s Official Guidelines on Duplicate Content
- Duplicate Content in a Post Panda World (by Dr. Pete Meyers at Moz)
- The Complete Guide to Mastering Duplicate Content Issues (by Stoney G’deGeyter via SEJ)
- Thin & Duplicate Content: eCommerce SEO (by Dan Kern of Inflow)
- Duplicate Content SEO Advice From Google (by Shaun Anderson at hobo internet marketing)
Scraped & Thin Content
If you use content taken from other websites without adding any original content or value, you seriously need to re-think your website strategy.
Purely scraped content, even from high-quality sources, may not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases.
Examples of scraping include:
- copying and republishing content from other sites without adding any original content or value
- copying content from other websites, modifying it slightly, and then republishing it
- reproducing content feeds from other sites without providing some type of unique organization or benefit to the user
- embedding content such as videos, images, or other media from other sites without substantial added value to the user
If your site does not have enough unique content that differentiates it from other sites on the web, it might suffer in Google’s search rankings.
According to Google, such websites ‘lack flesh’ and provide very little additional value to web users.
Thin content issues are prevalent in affiliate sites, especially those that mainly feature content from affiliate networks – some affiliate programs distribute the same content to several hundred affiliates.
Google gives an example of a thin affiliate site:
“pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value.”
If you participate in an affiliate program, Google has several recommendations that can help your site stand out and rank well:
- Offer original product reviews, ratings, and product comparisons
- Affiliate program content should form only a small part of your website’s content
- Ensure your site adds substantial value beyond simply republishing content available from the original merchant.
- When selecting an affiliate program, choose a product category appropriate for your intended audience.
- Keep your content updated and relevant
- Use your website to build community among your users
Bottom line: Focus on unique, relevant content that provides value to your visitors and distinguishes your site from other websites. This will make it more likely to rank well in Google search results.
Keyword Stuffing
I know… I’m amazed too that people still engage in keyword stuffing today, despite Google having made it clear that irrelevant keywords result in a negative user experience and can harm your site’s ranking.
From Google’s Quality Guidelines:
A common example of keyword stuffing is repeating the same words or phrases so often that the text sounds unnatural:
“We sell ‘custom cigar humidors’. Our ‘custom cigar humidors’ are handmade. If you’re thinking of buying a ‘custom cigar humidor’, please contact our ‘custom cigar humidor’ specialists at [email protected]”
Rather than load your webpage with keywords, focus on creating useful, information-rich content using keywords appropriately as Google advises.
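If you want a rough sanity check on a page, a simple density calculation can flag obvious stuffing. This Python sketch is a naive illustration only: it counts exact phrase repetitions, and there is no official “safe” density threshold. The sample text is adapted from Google’s humidor example above:

```python
import re

def keyword_density(text, phrase):
    """Share of the page's words consumed by exact repetitions of a phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return hits * n / len(words) if words else 0.0

# Sample text adapted from Google's humidor example above.
stuffed = ("We sell custom cigar humidors. Our custom cigar humidors are "
           "handmade. Buying a custom cigar humidor? Contact our custom "
           "cigar humidor specialists.")
print(f"{keyword_density(stuffed, 'custom cigar humidor'):.1%}")
```

A double-digit percentage for a single phrase in a short passage, as here, is the kind of unnatural repetition Google warns against.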
- The Dangers of SEO Keyword Stuffing (by Megan Marrs at WordStream)
- Several Forms of Keyword Stuffing that Should be Avoided (on BruceClay.com)
- Myths and Misconceptions about Search Engines (by Rand Fishkin and Moz Staff)
- How much keyword density for Google? (Moz Community)
- What Is The IDEAL Keyword Density Percentage To Improve Rankings? (by Shaun Anderson at hobo internet marketing)
User-Generated Spam
Do you allow user-generated content on your site?
If yes, you need to closely monitor what those users are posting.
Spammy user-generated content can pollute Google search results.
So, if Google detects user-generated spam on your site, you may receive a warning, or Google may take manual action on your whole site if the spam is widespread.
Having user-generated spam doesn’t mean your site is low quality.
Google says, “such spam can be generated on good sites by malicious visitors and users.”
User-generated spam could include:
- spammy posts on forum threads
- spammy user profiles
- spammy blog comments
- spammy guestbook comments
- spammy accounts on free hosts
To stay safe, Google recommends you actively monitor and remove this type of spam from your site. You should also consider implementing these measures to prevent user-generated spam.
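Automated pre-filtering can lighten the moderation load. The sketch below is a deliberately crude Python heuristic, not a production spam filter; the phrase list and link threshold are arbitrary assumptions you would tune for your own community:

```python
import re

LINK_RE = re.compile(r"https?://\S+")
# Illustrative phrase list only; real filters use much richer signals.
SPAM_PHRASES = ("buy now", "free pills", "casino bonus")

def looks_spammy(comment, max_links=2):
    """Crude moderation heuristic: too many links, or a known spam phrase."""
    if len(LINK_RE.findall(comment)) > max_links:
        return True
    lowered = comment.lower()
    return any(phrase in lowered for phrase in SPAM_PHRASES)

print(looks_spammy("Great post, thanks!"))                      # False
print(looks_spammy("buy now http://a.x http://b.x http://c.x")) # True
```

Flagged comments would go to a moderation queue rather than being published automatically, which keeps the final judgment with a human.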
- Official Google Guidelines on User-Generated Spam
- Google Manual Actions: User-Generated Spam
- Google Hits Mozilla With Spam Penalty Over User Generated Content (by Danny Sullivan at Search Engine Land)
- How to Avoid User-Generated Spam (by Warren Lee at Adobe)
- Google: Spam Is A Hard Problem For User Generated Content Platforms (by Barry Schwartz at Search Engine Roundtable)
- User Generated Spam Penalty Recovery (by Tim Hill at DOODLEDdoes)
Use of Automatically Generated Content
There are many software tools on the internet that can help you put your site on auto-pilot mode.
Such tools may use feeds to auto-generate content, and some can even stitch together content from different websites to form new content.
A good example of this is a job board where the job listings are auto-generated using a script that pulls information from other sites around the internet.
Other examples include sites made up entirely of feed displays, affiliate product feeds, and product descriptions generated from other websites.
From Google’s perspective, if a searcher lands on an auto-generated page that has very little value and doesn’t find what they’re looking for, that results in a bad user experience, which is why Google would take action against such sites.
Google has several examples of auto-generated content:
- Text translated by an automated tool without human review or curation before publishing
- Text generated through automated processes, such as Markov chains
- Text generated using automated synonymizing or obfuscation techniques
- Text generated from scraping Atom/RSS feeds or search results
- Stitching or combining content from different web pages without adding sufficient value
- Google Doesn’t Want Auto Generated Content in Their Results (by Patrick Sexton at Varvy)
- Matt Cutts on Auto-Generated Content: Google Will Take Action (by Jennifer Slegg of TheSEMPost)
- Official Google Quality Guidelines on Auto Generated Content
Is Your Site Fast Enough?
Everyone loves fast sites.
So do search engines.
In fact, Google considers site speed vital to a good user experience.
Since 2010, Google has used site speed as a signal in their ranking algorithms – Using site speed in web search ranking.
This patent, published in February 2014, details how web pages that load faster can receive a ranking boost while slower pages can be demoted:
“A search result for a resource having a short load time relative to resources having longer load times can be promoted in a presentation order, and search results for the resources having longer load times can be demoted.”
So, while site speed may not carry as much weight as factors like page relevancy, slow pages force search engines to spend more resources crawling your site, which could negatively impact your indexation.
Page speed is also crucial to user experience, and Google values user experience highly.
Pages that take a long time to load tend to have higher bounce rates and lower average time on page.
Therefore, your best bet is to ensure your website loads fast in all countries, across all devices and browsers.
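If you want a quick first measurement before reaching for tools like PageSpeed Insights, you can time a request yourself. This Python sketch measures a rough time-to-first-byte using only the standard library; the `data:` URL below is an offline stand-in you would replace with your own page’s URL:

```python
import time
from urllib.request import urlopen

def time_to_first_byte(url, timeout=10):
    """Rough TTFB: seconds from issuing the request until the body starts."""
    start = time.perf_counter()
    with urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # block until the first byte of the response arrives
    return time.perf_counter() - start

# Offline stand-in; replace with your own page, e.g. "https://example.com".
ttfb = time_to_first_byte("data:text/plain,stand-in")
print(f"TTFB: {ttfb:.3f}s")
```

Run it from several regions and devices’ networks, since the goal stated above is a site that loads fast for everyone, not just from your own desk.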
- 5 SEO Tips To Boost Page Speed (by Daniel Cristo at Search Engine Land)
- The Advanced Guide to SEO: Site Speed and Performance (By Neil Patel and Sujan Patel at Quick Sprout)
- How Website Speed Actually Impacts Search Ranking (By Billy Hoffman at MOZ)
- SEO 101: How Important is Site Speed (by Albert Costill at Search Engine Journal)
Non-Unique Title Tags and Meta Descriptions
Title and description tags are the most important meta tags recognized by Google.
Within SEO circles, title tags are often considered the most important on-page element.
They are a strong relevancy signal for search engines because they tell them what your web page is really about.
So, while it’s important to include your keyword in the title, it’s equally vital that each of your pages has a unique title.
With duplicate titles, search engines will have trouble determining which of your website’s pages is relevant to a particular search query.
Therefore, pages with duplicate titles are less likely to rank high and might even hurt the rankings of other pages on your website.
According to Matt Cutts, Google’s former head of search spam, it is better to have unique meta descriptions or even no meta descriptions at all, than to have duplicate meta descriptions across your pages.
Even though meta descriptions are believed to have little or no SEO value, it’s still vital to have unique meta descriptions (if you choose to include them).
The more original, unique, and compelling your meta descriptions are, the more likely they are to earn higher click-through rates.
Duplicate meta descriptions might confuse users and lead to bad user experience which might impact your rankings.
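Duplicate titles are easy to detect programmatically. The Python sketch below extracts `<title>` tags with the standard library and flags any title shared by more than one page; the page data and “Example Florist” titles are invented for illustration:

```python
from collections import Counter
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Pull the <title> text out of an HTML document."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_title(html):
    grabber = TitleGrabber()
    grabber.feed(html)
    return grabber.title.strip()

# Hypothetical crawl results: URL -> raw HTML.
pages = {
    "/red-roses":  "<title>Red Roses | Example Florist</title>",
    "/pink-roses": "<title>Red Roses | Example Florist</title>",  # duplicate
    "/tulips":     "<title>Tulips | Example Florist</title>",
}
titles = Counter(extract_title(html) for html in pages.values())
duplicates = [t for t, n in titles.items() if n > 1]
print(duplicates)  # titles shared by more than one page
```

The same Counter approach works for meta descriptions; run it over a full crawl of your site and every flagged title or description is a page that needs a rewrite.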
Google Will Take Further Action Against Sites That Repeatedly Violate Its Webmaster Guidelines
Do you violate Google’s webmaster guidelines over and over again?
Google may have stiffer penalties for you.
In a recent post on the Google Webmaster blog, the Google Search Quality team said Google may take ‘further action’ against sites that repeatedly violate its webmaster guidelines.
Google will also make “a successful reconsideration process more difficult to achieve” for such websites.
The image below shows an example as given in that post:
Check the Google Webmaster blog to learn more about what Google has in store for such violations.
Avoiding Google Penalties
As a website owner who relies on search engine traffic, you always want to avoid getting a Google penalty.
You therefore need to make sure you aren’t breaking search engine rules.
To help you stay on Google’s safe side, Neil Patel at Quick Sprout not long ago created a gem-packed infographic that shows you what you should and shouldn’t do.
Below is a quick summary of the guidelines recommended in that infographic. First, the practices to avoid:
- link farms or link networks
- use of hidden text
- use of unauthorized computerized and automated programs for page and site submissions
- heavy use of keywords that are not in the context of the web page
- duplicate or thin content
- keyword-rich anchor text
And the practices to follow:
- use guest posting sparingly; keep guest post links below 20% of your backlink profile, and add contextual links in your guest posts
- when building backlinks, focus on relevance of the linking domain
- consider using the “nofollow” tag for paid links
- diversify your anchor texts. Replicate a real natural link profile
- build brand signals
- build real social signals. Avoid fake followers
- focus on improving your site’s trust
Here’s a link to that infographic: How to Avoid a Google Penalty
Now it’s Your Turn
You’ve just seen several search engine issues you need to be aware of and guidelines to help you avoid Google penalties.
Now it’s time to hear from you.
How do you adhere to search engine guidelines?
Do you make any SEO mistakes?
Your comments and social shares are highly welcome.
Need SEO Services?
Want to gain more qualified traffic from Google and other search engines?