What are Google filters, and what can get your site penalized?

Search engine filters are the part of the ranking algorithm responsible for keeping low-quality sites, and sites using prohibited promotion methods, out of the top search results. In other words, they are Google's system of "fines and punishments" for sites it considers bad.

From the site owner's point of view, it is a disaster: a lot of money and effort was invested in the site, and because of a few wrong moves it all went down the drain.

What the filters are, how to diagnose and prevent them, and how to get a site out from under them – read on below.

* * *

Diagnosing filters is complicated by the fact that, whereas they used to be applied at a specific moment and could be detected by a sharp drop in traffic, they are now part of the core search algorithm and simply do not let low-quality sites grow. The site does not suddenly lose traffic; it just stops growing. A sharp drop can therefore only be observed when a fundamentally new filter appears, which later becomes part of the algorithm and acts on the "glass ceiling" principle.

Algorithm changes, by contrast, are changes to the basic ranking formula and to the general ranking principles; they affect all sites and reshuffle the overall search results. While avoiding filters is a tactical task, algorithm changes have to be taken into account when shaping the overall SEO strategy.

* * *

First, you need to understand why a site can suddenly lose traffic and positions. Besides filters, there may be other reasons:

  • the site changed its domain, structure, or URL scheme without setting up correct redirects. The old pages dropped out of the index, while the new ones have not made it in yet (a quick way to verify the redirects is sketched right after this list).
  • the site was unavailable at the hosting level for some time. If the downtime was long, positions and traffic may not return even after the site comes back up.
  • viruses and hacks. If Google detects malware, injected pages, or outbound links on "prohibited" topics, it will remove the site from its index. The longer the site stays infected, the lower the chances of recovery.
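For the redirect point, here is a minimal sketch that checks whether old URLs return a 301 redirect to their new addresses. The URL pairs below are placeholders for your own old-to-new mapping:

```python
# Minimal sketch: verify that old URLs 301-redirect to their new addresses.
# The mapping below is hypothetical; substitute your own old -> new URL pairs.
import requests

redirect_map = {
    "https://example.com/old-page/": "https://example.com/new-page/",
    "https://example.com/catalog?id=5": "https://example.com/catalog/laptops/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    if status == 301 and location == expected:
        print(f"OK      {old_url} -> {location}")
    else:
        print(f"CHECK   {old_url}: status {status}, Location '{location}'")
```

Any "CHECK" line means the old address is either not redirecting at all or points to the wrong page, and link equity is being lost.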

Google Panda filter

Panda is a filter targeting spammy, low-quality content. Sites that use non-unique texts or stuff them with keywords are at risk. The filter is now part of the core search algorithm and applies to the entire site.

A site may be affected if it:

  • copies content from other sites – sanctions can follow both complete copying and insufficient uniqueness
  • stuffs in too many keywords – the usual recommendation is roughly one main key per 300-500 characters (a quick density check is sketched right after this list)
  • uses keywords in unnatural forms – all those "buy a laptop Kiev" constructions
  • carries a large amount of off-topic advertising
  • lacks content structure – the notorious "walls of text that nobody reads"
  • publishes texts full of spelling and punctuation errors
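A rough way to sanity-check the "one main key per 300-500 characters" rule of thumb; the text file and key phrase are placeholders:

```python
# Rough keyword-density check against the "one main key per 300-500 characters"
# guideline. The file name and key phrase here are placeholders.
import re

def keyword_density(text: str, phrase: str) -> None:
    occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
    chars = len(text)
    if occurrences == 0:
        print("Key phrase not found at all")
        return
    chars_per_key = chars / occurrences
    print(f"{occurrences} occurrence(s) in {chars} characters "
          f"(~{chars_per_key:.0f} characters per occurrence)")
    if chars_per_key < 300:
        print("Warning: denser than ~1 key per 300 characters – may look spammy")

keyword_density(open("article.txt", encoding="utf-8").read(), "buy a laptop")
```

This is only a heuristic; the point is natural-sounding text, not hitting an exact number.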

At the same time, Panda affects the whole site: for violations on some pages, Google can pessimize the site completely. For serious violations, such as entirely copied texts, Google can remove the whole site from the index.

One symptom that this filter has been applied is when a fragment of text taken from the site cannot be found in Google at all, even as a broad-match query.

* * *

How not to get hit by Panda (and what to do if you already have)

  • do not copy other people's texts to your site, even for pages like "delivery and payment"
  • if you take someone else's text as a starting point when writing (and who doesn't?), make sure your final text bears no resemblance to the source
  • design structured pages: blocks, subheadings, lists, tables, illustrations
  • vary the internal linking and the anchor texts of internal links
  • check in Google Analytics which pages receive the least traffic, then rework the texts on those pages or remove them entirely (a sketch for pulling these pages out of a GA export follows this list)
  • add new materials and update old texts
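A minimal sketch for the traffic point above, assuming a CSV export of a Google Analytics pages report with "Page" and "Pageviews" columns. Both column names and the file name are assumptions – adjust them to your actual export:

```python
# Sketch: find the pages with the least traffic from a Google Analytics CSV export.
# Column names ("Page", "Pageviews") and the file name are assumptions.
import csv

def weakest_pages(path: str, limit: int = 20):
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    # Sort ascending by pageviews (strip thousands separators first)
    rows.sort(key=lambda r: int(r["Pageviews"].replace(",", "")))
    return [(r["Page"], r["Pageviews"]) for r in rows[:limit]]

for page, views in weakest_pages("analytics_pages.csv"):
    print(f"{views:>8}  {page}")
```

The resulting list is the set of candidate pages to rewrite, merge, or delete.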

Depending on the number of "crimes" on the site, lifting the filter can take anywhere from a week to several months. Most often the change comes only with the next update of the algorithm. If the domain is no longer being fully indexed, a complete domain change will most likely be required.

Hummingbird algorithm

Hummingbird is an algorithm designed to recognize the meaning behind a user's search query and serve the most relevant content. Sites can suffer from it if they artificially optimize their texts for queries they are not actually relevant to.

At the same time, thanks to this algorithm Google can show the user content relevant to the query even when there is no exact-match occurrence in the text – for example, when synonyms are used or the required concept is described in other words.

A further development of Hummingbird was the RankBrain update, which uses machine learning to analyze the semantic content of pages and users' search behavior, and gradually ranks higher the sites that answer users better.

Google Penguin filter

Penguin is a filter that penalizes artificial links pointing to a site. From Google's point of view, any link that the optimizer or the business owner had a hand in is artificial; whether the link was paid for or free does not matter. If Google decides it was not placed by a grateful client or an independent journalist, you are at risk. There is a well-known case of a large American flower shop hit by Penguin after it asked its customers en masse to put links to its website on their own resources.

Since 2016, Penguin no longer acts as a separate filter but as part of the core Google algorithm. For "bad links" a site will most likely simply fail to reach the top, or drop slightly (by 5-10 positions). As a rule, Penguin acts pointwise on the pages with low-quality links rather than on the whole site.

The following link strategies are at risk:

  • anchor links – links whose text contains a keyword
  • links from low-quality sites – sites created not for people but for selling links
  • a link explosion – a sharp increase in link mass for no apparent reason
  • duplicate anchors – identical anchor texts, especially if they contain a keyword
  • links from footers and sidebars – links placed outside the main content area
  • links from off-topic or off-region donors – sites whose topic, region, or language differs significantly from the promoted site
  • links from a PBN – a grid of sites created specifically to pump link mass into your site
  • using services that automatically generate links to your site

If you find low-quality links pointing to your site, we recommend disavowing them with the Disavow tool. The usefulness of this procedure is now debated, since Google representatives claim that low-quality links are simply not counted, but in this matter a precaution never hurts.

How to check the quality of links:

  1. Manually. Export the full list of donor domains via the Ahrefs service, where they are sorted by the Domain Rating (DR) metric by default. Links with DR below 20 fall into the "risk zone". Review them manually and ask the webmasters to remove them if the donor is of poor quality, or disavow them via the Disavow tool (a sketch that filters such an export and builds a disavow file follows this list).

  2. The CheckTrust service. It evaluates the quality of the link mass with its own algorithms. It is worth enabling several types of checks so you can work through the links consistently, from the worst to the best. With the bad links it finds, proceed as in the point above.
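A minimal sketch of the manual approach from point 1, assuming an Ahrefs referring-domains CSV export with "Domain" and "Domain Rating" columns. The column names, the file name, and the DR < 20 threshold are assumptions – adjust them to your own export and judgement:

```python
# Sketch: filter an Ahrefs referring-domains CSV export by Domain Rating and
# build a disavow file. Column names, file names, and the DR threshold are
# assumptions to adjust.
import csv

DR_THRESHOLD = 20

def build_disavow(export_path: str, disavow_path: str = "disavow.txt") -> None:
    with open(export_path, newline="", encoding="utf-8") as f:
        rows = csv.DictReader(f)
        risky = sorted(
            r["Domain"].strip()
            for r in rows
            if float(r["Domain Rating"] or 0) < DR_THRESHOLD
        )
    with open(disavow_path, "w", encoding="utf-8") as out:
        out.write("# Domains reviewed manually and judged low quality\n")
        for domain in risky:
            out.write(f"domain:{domain}\n")  # "domain:" prefix disavows the whole domain
    print(f"{len(risky)} domains written to {disavow_path}")

build_disavow("ahrefs_referring_domains.csv")
```

The resulting disavow.txt follows the format accepted by Google's disavow links tool: one `domain:` line per domain (or a full URL per line), with `#` comments allowed. Upload it only after a manual review – a low DR alone does not make a link harmful.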

In any case, remove or disavow links gradually: removing them abruptly can make the situation worse.

Google Sandbox

The Sandbox is more of a mythical filter than a real part of the algorithm. New sites usually do not get traffic right away, but that is because a new site has yet to build up link mass and earn the search engines' trust.

To get a new website into the Google TOP quickly, we recommend paying attention to two things:

  1. The quality of the content. For a new site to gain an advantage over old ones, its quality must be an order of magnitude higher than that of the sites already occupying the TOP.
  2. Link mass. For a sharp growth in traffic it is better to build the site not on a freshly registered domain but on a drop – a domain with history and existing link mass – after first checking everything for quality.

Your Money Your Life filter, or “medical update”

Your Money Your Life, abbreviated YMYL, is a change in the ranking algorithm for sites in the "especially important to a person" category – resources that can significantly affect someone's life. These are primarily medical sites, which is why the algorithm is somewhat misleadingly called the "Medic update", but it also covers finance, news, social issues, online stores, and other types of resources.

With this update, Google began paying attention not only to relevance but also to the authority of the content. In other words, the main test question to ask when optimizing sites in this niche is: "Can this site be trusted?"

Sites of this type are often assessed manually by Google's quality raters, with particular attention to who authored the content, whether the author is a recognized expert in the industry, and whether the information matches the accepted paradigm.

If we break this requirement down into criteria, we get the following list:

  • content must be verifiable – that is, contain verified data
  • content must include quotes from, or be authored by, recognized industry experts
  • unique, comprehensive information about the topic or product – description, characteristics, reviews, photos, video reviews, etc.
  • the site must contain relevant, up-to-date information – both within a specific article and across the site as a whole
  • convenient technical solutions – navigation, product selection, the purchase process
  • pages should be conveniently and clearly laid out, with a well-developed structure and illustrative material
  • texts should be free of grammatical, stylistic, and spelling errors
  • the site must contain evidence of your expertise – certificates, diplomas, reviews, information about your clients
  • clear and complete contact information, communication channels, purchase and delivery terms, guarantees, etc.
  • contact details for all branches, with the names of managers and other contact persons
  • Google My Business listings for all branches

Some of these factors are hard to fake. For example, an author's expertise is determined by their presence in the media field – publications and mentions on other well-known sites and in social networks.

Another factor is the reputation of the site as a whole – reviews of its work on dedicated review platforms and other resources.

Other reasons for pessimizing sites

Google can also lower a site's rankings for flaws such as:

  1. Too slow page load speed – check it with https://developers.google.com/speed/pagespeed/insights/?hl=RU (a sketch for querying the PageSpeed Insights API directly follows this list)
  2. Low semantic value and lack of novelty in the texts
  3. A large number of broken internal links
  4. Selling links from the site's pages
  5. Cloaking – showing different content to users and to search engines
  6. Fake rating "stars" displayed in the snippet, and other spam in the structured data markup
  7. Lack of a mobile version of the site
  8. Hosting pirated content
  9. Complaints from other users about pirated content on your site (DMCA complaints)
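For point 1, here is a hedged sketch that queries the PageSpeed Insights v5 API instead of the web form. The tested URL is a placeholder; for regular automated checks Google recommends passing an API key via the key parameter:

```python
# Sketch: query the PageSpeed Insights v5 API for the mobile performance score.
# The URL being tested is a placeholder; add a "key" parameter for frequent use.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    resp = requests.get(API, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports the performance category score on a 0..1 scale
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(f"Mobile performance score: {pagespeed_score('https://example.com/'):.0f}/100")
```

Running the same check for both "mobile" and "desktop" strategies gives a fuller picture, since Google evaluates them separately.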

And these are only some of the possible reasons a site can sink. In practice, each site has to be examined individually to understand what is going on.

The severity of the filters also depends on the geo-targeting of the promotion. The toughest market, with the strictest filtering of spam techniques, is the USA and Canada. That is why there are practically no paid link exchanges in those countries, and SEO specialists focus their efforts on creating high-quality content, informative landing pages, and outreach links.

In essence, the filters enforce the principles of "white-hat SEO": make sites for people and do not buy spammy links.

Checking the site for Google filters

In addition to the methods for each type of filter described above, you can use the following services:

  1. A timeline of Google algorithm updates from the analytics company Moz – https://moz.com/google-algorithm-change
  2. The Panguin SEO Tool by Barracuda – https://barracuda.digital/panguin-seo-tool/

Conclusion

Deep conclusions should follow here, but the article has already grown bulky enough. So we will say only that it is much easier to avoid the filters than to "cure" a site of their consequences. Safe promotion is therefore a prerequisite for any SEO specialist who plans to work for the long term and in white-hat niches.
