In this article, we’ll cover the most common manipulative, short-term techniques that were once used to build links and improve a website’s ranking on Google, and how they can be applied legitimately today.
Manipulative link building is any action taken solely to obtain a link from a website and thus improve your website’s position in Google search. It is indeed possible to use manipulative techniques to get a website ranked for certain keywords. However, websites that use such link building methods are ultimately penalized by losing positions in the SERP (search engine results pages) or disappearing from Google search entirely. That’s why we consider them short-term techniques.
Be aware that an unreliable web host can hurt your SERP positions, apart from affecting a website’s overall performance. The first sign of an unreliable host is a website that frequently suffers from web server downtime: when a website is down, it doesn’t show up in the search engine. Reliable MySQL hosting can help here. Customers can expect top-notch features when selecting among the options that different hosting plans offer, including the ability to check server uptime with mysqladmin, MySQL’s administration program.
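As a quick illustration, here’s one way to check uptime from the command line. This is a minimal sketch: the status string is hard-coded for demonstration, because on a live server you would instead run `mysqladmin -u root -p status` with your own credentials and parse its real output.

```shell
# On a live server you would run something like:
#   mysqladmin -u root -p status
# which prints a one-line summary. Here we parse a hard-coded sample of
# that output (an assumption, for illustration) to extract the uptime.
sample='Uptime: 86400  Threads: 2  Questions: 500  Slow queries: 0'

# The second field of the status line is the uptime in seconds.
uptime_seconds=$(echo "$sample" | awk '{print $2}')

echo "MySQL server uptime: $((uptime_seconds / 3600)) hours"
```

If the reported uptime drops back to zero more often than your host’s scheduled maintenance explains, that’s the downtime warning sign described above.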
Short-Term Methods of Link Building
To clarify what short-term (manipulative) link building methods are, we’ll first write about Google’s vision and their recommendations for website owners. Then we’ll write about manipulative link building techniques, how they were used in the past, and how they can be used today.
Finally, we’ll direct you to a few quality texts on how it’s desirable to do SEO today and how to approach website optimization for search engines, in general.
Google’s Long-Term Vision
For starters, it’s important to understand why Google wants to rank certain websites and what its goal is. Here are some of the highlights of Google’s philosophy:
- Focus on the user – create the best resource for the end-user, i.e., the reader,
- Democracy on the Internet succeeds. Google has so far used (and still uses) links as the main metric for determining how relevant a specific website, or a page on it, is to a particular topic. This is slowly changing,
- The world increasingly relies on mobile technologies. People want to get information wherever they are, so it’s essential that your website is easy to read on a desktop computer as well as on mobile phones and tablets,
- Do no evil. Money can be earned by legitimate techniques, i.e., those that bring value to readers (see #1).
Google suggests that you follow the recommendations in its “Webmaster quality guidelines” section – a section intended for website owners. Therefore, the website should:
- be made for users,
- not deceive users,
- not use manipulative techniques to improve its position on Google. Ask yourself whether a given action would help your readers, and whether you would leave a link if there were no search engines,
- stand out from other websites. Strive to be unique, to offer value to readers, and to build a community around your content.
You should also react in a timely manner if your website has been hacked, and you shouldn’t allow users to spam your website. Things you should never do: steal content from other websites, use automatically generated text, spam with rich snippets, or use affiliate programs without adding value to the product you’re selling as an affiliate (e.g., an original review text or video of the product).
A Look Into the Past – a History of Spam and Manipulative Link Building Techniques
What follows is an overview of the most common link building techniques used by spammers. These techniques are called black-hat and gray-hat, while the legitimate ones are called white-hat (or ethical techniques).
We’ll also explain, of course, how to use these same black-hat techniques legitimately, so you can position your website on search engines without resorting to spam.
IN THE PAST: Cloaking, or showing one version of content to website visitors and another to search engines, is strongly discouraged. In the past, spammers used this method by inserting a mass of irrelevant expressions and words into a page’s code so that search engines would rank the page for all those words. Readers were shown a page that looked completely normal and never saw the “excessive” words hidden in the page code.
TODAY: Some forms of this practice can still be used legitimately, for example, showing different content to visitors from different countries.
Keyword stuffing is a process of frantically inserting keywords and phrases in the text.
IN THE PAST: At the very beginning of search engine optimization, the page with the most frequently mentioned keyword you wanted to rank for was also ranked for that keyword.
Later, great attention was paid to keyword density: a keyword or phrase had to be mentioned a certain number of times so that it made up roughly 3% to 5% of the article, which supposedly signaled to Google that the text was definitely about that phrase.
For example, if you wanted to rank for a specific word, ideally that keyword would be mentioned three to five times in a 100-word text. By that logic, you would mention a keyword between 30 and 50 times in a quality blog post (any blog post that aims to be relevant should have at least 1,000 words). We think that would destroy such posts, and Google realized that as well.
TODAY: Google increasingly recognizes the connections between certain phrases and words. So today, if you want to rank for a particular phrase, you can also use a descriptive phrase.
For small businesses, it used to matter that their keyword was exactly what visitors typed into Google, which led to websites targeting unnatural, robotic-sounding phrases. Google now increasingly understands the language we use and the intent behind the words we type into a search; that’s why the Hummingbird update was launched a few years ago.
Today, the same page should be optimized for a set of keywords, phrases, and synonyms (by mentioning them in the text), which would be ranked based on the authority of your domain.
IN THE PAST: It used to be important to put a keywords meta tag describing the page into that page’s code, and Google took it into account for SEO.
TODAY: Meta keywords have no impact on SEO. The meta description tag can affect how many people click through to your website, based on what they see in the description. The title tag remains essential for SEO. With the meta robots tag, you can tell search engines not to index a particular page of your website. It’s recommended to include the keywords and phrases you intend to rank for in these tags (except the keywords meta tag, which you can safely omit).
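To make this concrete, here is a minimal sketch of those tags in a page’s head section (the page title, description wording, and topic are invented for illustration):

```html
<head>
  <!-- Title tag: still essential for rankings -->
  <title>Homemade Sourdough Bread – A Step-by-Step Guide</title>

  <!-- Description: doesn't affect ranking directly, but shapes the SERP
       snippet and therefore the click-through rate -->
  <meta name="description" content="Learn how to bake sourdough bread at home with a simple starter and basic kitchen tools.">

  <!-- Robots: tells search engines not to index this particular page -->
  <meta name="robots" content="noindex">

  <!-- A keywords meta tag is ignored by Google and can simply be omitted -->
</head>
```

Note that the robots tag would only go on pages you actually want kept out of the index, such as thin thank-you or internal search pages.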
IN THE PAST: The number of links was crucial. Website owners realized that by cooperating with each other they could exchange links on the principle: “I link to you, you link to me”. The links were mostly placed in the footer or sidebar of the website. This was one of the first practices Google listed as a manipulative technique.
TODAY: If you like a website and follow it regularly, place a link to an article from that website in your text. Make sure the link in your text makes sense and don’t expect the link to come back to you.
IN THE PAST: Mass commenting on blogs was easy and could be automated with various tools. That way, you got a huge number of links to your website, and as a rule, the comments were meaningless, because people who commented for the sake of the link usually didn’t even read the text. At that time, to rank a text for a particular word, all you had to do was put that word as your name when commenting on a blog, and leave a link to your article.
In 2005, Google introduced the nofollow attribute, which can be added to links precisely to combat spam in blog comments. A nofollow attribute on a link signals to Google and other search engines that the website owner doesn’t vouch for that link, and Google reportedly doesn’t count nofollow links.
TODAY: A quality comment on a blog post is still desirable. People will often visit your website because of a thoughtful comment, and blogs get a decent amount of traffic from comments left on other websites. A good rule: consider whether you’d leave the comment even if you couldn’t include a link. If the answer is YES, feel free to comment.
IN THE PAST: 301 redirects have many legitimate purposes: they let you automatically send visitors from one website or page to another. Spammers abused this by redirecting their websites to strong portals like Wikipedia. When Google crawled the spam website, it actually read Wikipedia, so the spam website appeared to inherit the PageRank of the website it redirected to. Spammers would then sell links on those “high PageRank” websites.
TODAY: 301 redirects can be used quite legitimately. For example, you can redirect a website to another domain you own, so that readers who type in the old domain end up on your new blog rather than on a blank page. Affiliate marketers often use 301 redirects, too.
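As a sketch, a domain-to-domain 301 redirect on an Apache server might look like the following (the domain names are hypothetical, and we’re assuming Apache with mod_rewrite enabled; nginx and most hosting control panels offer equivalents):

```apache
# .htaccess on the old domain (hypothetical names, for illustration only)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-blog\.example$ [NC]
RewriteRule ^(.*)$ https://new-blog.example/$1 [R=301,L]
```

The R=301 flag is what marks the redirect as permanent, which tells search engines to transfer the old URLs’ standing to the new domain rather than treating the move as temporary.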
IN THE PAST: Paid links were Google’s Achilles heel. Naturally, Google doesn’t condone paying for links, precisely because it’s a form of manipulation with which you can relatively easily rank for certain phrases, unless Google catches you doing so.
TODAY: This practice is still widespread. If you sell advertising space within the text on your website (native advertising or sponsored text), all links in that text should be marked with the nofollow attribute, so that Google knows the text is sponsored and assigns minimal value to those links. This practice is fine when your goal is to reach visitors who regularly read a given blog, rather than to pass ranking value.
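A sponsored link marked this way might look like the following sketch (the advertiser name and URL are hypothetical; rel="sponsored" is Google’s newer attribute specifically for paid links, while rel="nofollow" remains accepted):

```html
<!-- Inside a sponsored post: the rel attribute tells search engines
     not to pass ranking value through this paid link -->
<p>This post is brought to you by
  <a href="https://advertiser.example/" rel="sponsored nofollow">Example Advertiser</a>.
</p>
```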
Link Schemes or Link Networks
IN THE PAST: In a link network, not every website links to every other website (Google would notice that very easily); instead, everything is designed to look completely natural. Since all link networks exist to manipulate positions on Google, websites that use them are penalized as soon as Google detects them.
Since a large number of links from forum signatures, blog comments, or article directories is disastrous for most websites, webmasters decided to point these low-quality links at pages on high-ranking websites, such as YouTube. Many spammers aim such low-quality links at their YouTube videos to improve their chances of ranking on the first page of Google for the desired keyword.
TODAY: Link schemes are still used, but to a much lesser extent than a couple of years ago, because Google has become very good at detecting them. However, some link networks, mostly private ones, still thrive; their operators have simply made them much bigger than they were before.