When it comes to getting more traffic to a website, many people have tried techniques meant to trick the Search Engine Robots. Unfortunately for them, those robots are becoming more and more intelligent and, as a result, such tricks now prevent those websites from getting any traffic at all.
1. Search Engine Cloaking
Cloaking is one of the worst things you can do. If you are a beginner, you probably have no idea what this technique is. Either way, I think it is essential for you to know about it so you can keep it away from your website. It could happen if you hire someone to help you with your SEO and they suggest this method as a valid way to get you to rank quickly.
Search Engine Cloaking happens when a page serves a different result to the search engine than it does to regular visitors of your website. Say that your website talks about flowers. You gain a lot of backlinks and, since you talk about all the flowers one can find at the flower shop, the site gains great popularity, raising your rank to the high spheres.
Now that website owner decides to sell his ranking: Google still gets to see the existing old content about flowers, but when a regular user comes to check out the page, they are instead redirected to another site selling lingerie. The redirect is not expected by Google and, obviously, the user is going to be disappointed with the results, since he searched for flowers and ended up on a site selling various types of underwear…
This is a 100% sure way to get your domain banned.
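If you did hire someone for your SEO, you can spot-check for cloaking yourself: request one of your pages twice, once pretending to be Googlebot and once as a regular browser, then compare the two responses. Below is a minimal sketch of that idea, assuming the Python requests library is available; the URL and the user agent strings are just placeholders. Keep in mind that serious cloakers key off Googlebot's IP addresses rather than the user agent, so this only catches the naive cases.

import requests

URL = "https://www.example.com/flowers"  # replace with one of your own pages

HEADERS_BOT = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}
HEADERS_USER = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

# Fetch the same page as "Googlebot" and as a regular browser.
as_bot = requests.get(URL, headers=HEADERS_BOT, allow_redirects=True)
as_user = requests.get(URL, headers=HEADERS_USER, allow_redirects=True)

# A different final URL or a very different amount of content is a red flag.
print("Bot landed on: ", as_bot.url)
print("User landed on:", as_user.url)
print("Bot content length: ", len(as_bot.text))
print("User content length:", len(as_user.text))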
2. Keyword Manipulation
2.1 Keyword Stuffing
All of these keyword manipulations can be viewed as forms of Keyword Stuffing, so I wanted to include a definition:
The act of adding a keyword over and over again even when it does not make sense in the sentence. For example, if I were to write Niche Website or Internet Affiliate 10 times in a row, that would be keyword stuffing. Another example would be having 100 keywords (say, 10 variations of each of my 10 main keywords) and making sure to include all 100 of them on every single one of my pages.
I would like to point out that it has been over 10 years since this black hat trick stopped working.
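If you want a rough idea of whether one of your own pages leans toward stuffing, you can compute the density of a keyword: how many times it appears versus the total number of words. The sketch below is only an illustration; the file name is a placeholder and the 3% threshold is an arbitrary rule of thumb, not an official Google number.

import re

def keyword_density(text, keyword):
    # Return the keyword's share of the total word count, as a rough percentage.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = text.lower().count(keyword.lower())
    return 100.0 * hits / len(words)

article = open("my-post.txt").read()   # hypothetical file holding your post
density = keyword_density(article, "niche website")
if density > 3.0:                      # arbitrary threshold, tune to taste
    print(f"Warning: the keyword makes up {density:.1f}% of the text, that looks stuffed")
else:
    print(f"The keyword makes up {density:.1f}% of the text, that looks natural")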
2.2 Invisible or Minuscule Keywords
Google, at some point, decided to render the pages that they find. The reason is simple: in some cases, people would put their keywords on all their Niche Website pages. Because these keywords would not naturally fit the content, they would make them appear white on white (or black on black). In effect, they would be invisible.
Once Google said they were capable of detecting such tricks, people decided to shrink the font instead of changing the color. The keywords would still be there, but they would appear so small that the reader would skip right over them.
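You can scan your own pages for the most naive version of this trick: text hidden with inline styles such as white text or a near-zero font size. A minimal sketch, assuming the requests and BeautifulSoup (bs4) libraries; it only inspects inline style attributes, so anything hidden through an external stylesheet would need a fuller check.

import re
import requests
from bs4 import BeautifulSoup

page = requests.get("https://www.example.com/")   # replace with your page
soup = BeautifulSoup(page.text, "html.parser")

for tag in soup.find_all(style=True):
    style = tag["style"].lower().replace(" ", "")
    # Flag white text (suspicious on a white background) and tiny font sizes.
    hidden_color = "color:#fff" in style or "color:white" in style
    match = re.search(r"font-size:(\d+)", style)
    tiny_font = match is not None and int(match.group(1)) <= 2
    if hidden_color or tiny_font:
        print("Possible hidden text:", tag.get_text(strip=True)[:60])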
2.3 Competitor Keywords Hijack
In the early 2000s, some so-called SEO experts noticed that it was possible to get clicks from search engines for certain popular terms without any real need to talk about them. The only thing that was needed was for that keyword to appear on your page. The example I saw back then was about Pepsi Cola. The page in question had this brand name repeated 1,000 times, making Google (at the time) think that the page was about the drink; as a result, it would send many people to the page whenever they searched for Pepsi Cola…
The page itself, however, had nothing to do with drinks, cola, or Pepsi… Outside of that stuffed block, its content did not mention any of these keywords even once.
If you attempt to do that today, you can be sure that your website will quickly disappear from the ranks of the living and kicking.
2.4 Title Stacking
Title stacking means adding a title, a sub-title, a sub-sub-title… just to cram all your keywords into your page <title> tag.
A title can be long (I like long titles), but it should not be repeating things over and over again. Actually, it should not repeat anything unless really appropriate.
It is much better to write a little more content to get your keywords on your page. Or, if you have a sidebar with your info, stick your keywords in that sidebar. At the same time, don’t write 4,000 words there… that’s not wise. Put a link to a page where you can talk about yourself at length instead.
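A quick way to see whether one of your titles is stacked is to pull the <title> tag and look for words that repeat. A small sketch, again assuming requests and BeautifulSoup; the URL is a placeholder and the repeat threshold is just a suggestion.

from collections import Counter

import requests
from bs4 import BeautifulSoup

page = requests.get("https://www.example.com/")   # replace with your page
soup = BeautifulSoup(page.text, "html.parser")
title = soup.title.get_text() if soup.title else ""

# Count meaningful words (longer than 3 letters) that show up more than twice.
words = [w.lower() for w in title.split() if len(w) > 3]
repeats = {w: n for w, n in Counter(words).items() if n > 2}

print("Title:", title)
if repeats:
    print("These words repeat suspiciously often:", repeats)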
3. Massive Amount of Paid Backlinks
From the mid-2000s until June 2012, this was common practice. Overstock, Dun & Bradstreet, JCPenney, and Forbes were all affected when Google cracked down on it. (Yes! Even large, well-known companies got hit for using Black Hat methods.) Note that paid backlink reporting started sooner, around 2007, but that was manual work and most people probably had no idea what it looked like. 2012 is when Google decided that their automatic paid backlink detection system was ready to go live.
Google noticed that many links were not of any real interest and were likely on a website only because its owner got paid (hence the “Paid Backlinks” name) to place them there. Such links are viewed as sponsorships or advertisements and are expected to carry the rel="nofollow" attribute.
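If you do accept a sponsored link on your own site, make sure it carries that attribute. Here is a small sketch that lists every outbound link on a page that is missing rel="nofollow", assuming requests and BeautifulSoup; the domain name is a placeholder.

from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "www.example.com"   # replace with your own domain

page = requests.get(f"https://{MY_DOMAIN}/")
soup = BeautifulSoup(page.text, "html.parser")

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    external = host and host != MY_DOMAIN
    followed = "nofollow" not in (a.get("rel") or [])
    if external and followed:
        print("Followed external link:", a["href"])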
The primary way Google detects this problem is by counting the number of backlinks to your website over time, which tells them whether the growth is natural or not. When backlinks to one website appear by the hundreds or even thousands all at once, that is a sure sign that they are not natural.
Another signal they use is how likely a link is to appear on a given website in the first place. If the content of the two sites is not related at all, links from one to the other are suspicious, to say the least. On top of that, paid links tend to appear quickly, and when both problems occur at about the same time, it is another sure sign that those links are Paid Backlinks.
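You can see that first signal for yourself if you export your backlinks from a tool that records when each link was first seen (most SEO suites can do this). The sketch below only illustrates the idea: it groups hypothetical discovery dates by month and flags a month that suddenly dwarfs the previous one. The dates and the spike factor are invented for the example.

from collections import Counter

# Hypothetical "first seen" dates exported from a backlink tool, one per link.
first_seen = ["2024-03-02", "2024-03-19", "2024-04-07", "2024-05-01",
              "2024-05-03", "2024-05-03", "2024-05-04", "2024-05-04",
              "2024-05-04", "2024-05-05", "2024-05-05", "2024-05-05"]

per_month = Counter(date[:7] for date in first_seen)   # "YYYY-MM" buckets

previous = None
for month in sorted(per_month):
    count = per_month[month]
    if previous and count >= 5 * previous:             # arbitrary spike factor
        print(f"{month}: {count} new links, that spike does not look natural")
    else:
        print(f"{month}: {count} new links")
    previous = count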
3.1 Is This Technique Still Working?
Until about 2014, many websites still allowed external links without the rel="nofollow" attribute. Even Wikipedia allowed such links way back.
With the much heavier use of the rel="nofollow" attribute, the Paid Backlinks technique has nearly faded out, since it has become really difficult to get followed backlinks in an automated way in the first place.
3.2 Isn’t Pinging Creating Backlinks?
I’m glad you asked! Yes. It is.
There are several issues with this technique and I strongly recommend you do not use it. By default, WordPress pings a single service when you publish a post, and you should limit it to that. You could even remove that one ping, because you have a sitemap.xml file and that is enough for you to get noticed by search engines. They will find your page quickly no matter what, and it will get indexed at some point anyway.
However, adding many ping services to your WordPress or any other website won’t help you get more traffic. It won’t help you increase your ranking. It will probably even do the opposite. Having links on many websites that are considered spammy is a really bad idea. Many of those ping services do not attempt to prevent spam, so spammers love them. It’s a good way for them to get many backlinks for free (i.e. just put all those ping-o-matic links in your WordPress system and you get that many backlinks each time you write a new post or make an update!)
Another reason for these systems to be viewed as spam is that each time you make an update, a new ping is sent. I do several updates in a row once I have content on a website. This is because making a change in one place has repercussions and other changes become necessary, so all of a sudden I may end up editing 3 or 4 pages. That’s a lot of pinging if you ask me. Not only that, the WordPress system is not optimized properly, so if you update the same page 10 times in a row, it will send 10 pings. Not the best practice! Now, if you have 30 services x 10 pings, that’s 300 links appearing in a very short period of time… don’t you think Google would see that as spam?
Just in case, the list of websites to ping is found under Settings » Writing in your WordPress admin. It should either be empty or only include the default WordPress entry, which is: http://rpc.pingomatic.com/.
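For the curious, a ping is nothing mysterious: it is a tiny XML-RPC call announcing that your blog was just updated. Roughly speaking, this is what blog software sends to Ping-o-Matic behind the scenes, sketched here with Python's standard xmlrpc client; the blog name and URL are placeholders, and there is no reason to run this by hand since WordPress already does it for you.

import xmlrpc.client

# The standard weblogUpdates ping that blog software sends when a post is published.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
result = server.weblogUpdates.ping("My Niche Website", "https://www.example.com/")
print(result)   # typically a small response saying whether the ping was accepted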
4. Plagiarism / Scraping (Duplicate Page or Entire Websites)
Some people feel no shame in simply going to another website and copying its content verbatim. This is called Plagiarism. It has happened for ages, since people like to make money with the least amount of work. So-called Authors at times simply copy the text of other authors verbatim, which makes their page sound really nice (i.e. if you don’t write as well as that other author…)
Note that you can copy a small section from another website and clearly mark it as a quote. This is legal in journals, books, and magazines too, and Google does not penalize you for doing so.
Here is an apropos quote:
“Intentionally using the quotes of others without author attribution is plagiarism and contributes to illiteracy.”
Rain Bojangles
Scraping is plagiarism done automatically by a program running on your server. The idea is pretty simple: you go to many websites, grab their content, and copy it onto your own website, again verbatim.
By scraping, you get tons of content. This worked in part because websites offered Atom and RSS feeds, a really easy way to scrape a website, because the actual content appears right there in the feed (as opposed to reading a website's pages and having to parse the HTML to find the actual article, then having to fix that parser each time the HTML changes, which is pretty often on websites worth scraping!)
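To see why feeds made this so easy, here is how little code it takes to pull every article straight out of an RSS feed, with the text already separated from the page layout. A sketch using the third-party feedparser library; the feed URL is a placeholder. I am showing it so you understand the mechanism, not so you go scrape anyone.

import feedparser

feed = feedparser.parse("https://www.example.com/feed/")   # any RSS or Atom feed

for entry in feed.entries:
    print(entry.title)
    print(entry.link)
    # The article text is right there, no HTML layout to fight with.
    print(entry.get("summary", "")[:200])
    print("-" * 40)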
4.1 Is All Duplication Bad?
If your website is being duplicated, then you are actually going to gain backlinks (you do have internal links between all your pages, correct?). The copies only pass small quantities of backlink juice, but nonetheless, the effect is actually positive for you.
Once in a while, another website may even take pole position for one of your posts that it duplicated. In 2014, Google Search offered a form to report that one of your posts had been stolen. That form has since been taken down. It is likely that they took all of that data and used it to feed an automated system based on Artificial Intelligence (AI), so they no longer need human intervention to fix their search index.
4.2 How To Find Duplication?
One solution is to search with Google. Take a paragraph and search for it. Exact duplicates will appear first, then Google shows results that are less and less like your own post.
Another solution is to use a service such as Copyscape. You enter your website URL and it tells you whether it found duplicates in its database. Note that it reports duplication either way (i.e. whether someone duplicated your content or your website includes content copied from another website.)
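To make the Google approach less tedious, you can generate the exact-phrase search links ahead of time and simply click through them. A minimal sketch; the snippets are placeholders for sentences taken from your own posts.

from urllib.parse import quote_plus

# A distinctive sentence from each post you want to check (placeholders here).
snippets = [
    "all the flowers one can find at the flower shop",
    "the most important part of a niche website is its content",
]

for snippet in snippets:
    # Wrapping the phrase in quotes asks Google for exact matches first.
    print(f'https://www.google.com/search?q=%22{quote_plus(snippet)}%22')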
As mentioned above, don’t worry when you discover that small snippets of your content were duplicated. Actually, you can turn it into a positive by asking the other site for a backlink without the rel="nofollow" attribute.
4.3 Is there a way to safely post the same content on two or more sites?
Yes.
I suggest you do not practice this too much. I think it is much better to have unique content for each one of your posts, even if you heavily reuse someone else’s writing by making it your own version in your own words.
Yet, it is possible to safely duplicate a post as long as you specify where the original content is. You simply have to enter the URL of the original in the canonical link tag. This tag looks like this:
<link rel="canonical" href="http://www.example.com/seo-5-black-hat-practices-to-avoid/"/>
Here you go. This way the Search Engines know you have a duplicate, and you tell them legitimately where the original is, so they do not penalize you one bit for having it.
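And if you want to double-check that a republished copy really points back at your original, you can read the canonical tag straight off the duplicate page. A small sketch with requests and BeautifulSoup; both URLs are placeholders.

import requests
from bs4 import BeautifulSoup

DUPLICATE_URL = "https://www.partner-site.example/reposted-article/"
ORIGINAL_URL = "http://www.example.com/seo-5-black-hat-practices-to-avoid/"

page = requests.get(DUPLICATE_URL)
soup = BeautifulSoup(page.text, "html.parser")
canonical = soup.find("link", rel="canonical")

if canonical and canonical.get("href") == ORIGINAL_URL:
    print("Good: the copy declares the original as canonical.")
else:
    print("Problem: missing or wrong canonical tag on the copy.")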
5. Bad Content (Machine Generated)
So, by now you must have understood that the most important part of a Niche Website, in order to get hits, is its content.
Many people, though, want to make a lot of money quickly and not have to write such long articles explaining how things work in their field (or should I dare to say it as: “their” field, ha! ha!)
So some of them looked for a way to write articles in a completely automated manner. Some created robots that would scrape websites and tweak the content with synonyms and antonyms. Needless to say, the resulting pages were real crap. I have seen quite a few of those, and it is hard to believe that anyone would pay for such a tool and not ask for their money back after the first attempt.
Other such robots were built to pad the content. You write very brief sentences and the tool adds fluff (I add my own fluff in my content… but if you’re reading this, you’re probably okay with my fluffy content…) This means inserting all sorts of words between your words. For example, it could change:
“I drive the car”
into
“I love to drive the nice blue car”
This makes the content longer, and it is not bad for a single sentence, but there was no real consistency across the whole website, making it pretty much impossible to read.
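Just to give you an idea of how crude these tools were, the toy sketch below does little more than swap words out of a small synonym table, which is essentially what an article spinner did; the word list is invented for the example, and the awkward output is exactly the point.

import random

# A tiny, invented synonym table; real spinners simply had a much bigger one.
synonyms = {
    "drive": ["drive", "operate", "pilot"],
    "car": ["car", "automobile", "vehicle"],
    "nice": ["nice", "lovely", "pleasant"],
}

def spin(sentence):
    # Replace each known word with a randomly picked synonym.
    words = sentence.split()
    return " ".join(random.choice(synonyms.get(w, [w])) for w in words)

print(spin("I love to drive the nice blue car"))
# e.g. "I love to operate the pleasant blue automobile", awkward on purpose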
I’d bet there are other similar robots that can generate content for you. I am not really interested in them, so I probably missed quite a few. Of course, with Artificial Intelligence becoming better and better at replicating human behavior, such machines could very well become very difficult to detect for Google’s own Artificial Intelligence robots… However, I think we are still a ways away from such robots being able to come up with a good long article the way a human can.