Why doesn't Yandex index all pages? Why are some pages not indexed? How to check a page in the search engine index

In this article we will explain why Yandex may not index a specific page, a section, or an entire site. The problems described can also affect indexing in Google and any other search engine. Pages dropping out of the index is a serious problem, since it directly affects the amount of traffic a resource can receive from search engines.

1. Yandex does not fully index the site

If you have a new website, it will not appear in search results right away: it can take from 2-3 weeks to a month. As a rule, Google finds and indexes new projects faster, Yandex slower.

If 2-3 weeks have passed and Yandex still does not index the site, perhaps it simply does not know about it:

  1. Submit a link to the site via the special form: http://webmaster.yandex.ru/addurl.xml
  2. Add your site to the Yandex.Webmaster service: http://webmaster.yandex.ru
  3. Install Yandex.Metrica on your website.
  4. Place links to the site on social networks and blogs (search robots actively index new content on social networks and blogs).

1.2. An old site has partially or completely dropped out of the index

You will notice this problem by comparing the number of pages on the site and the number of pages in the search engine index (you can view it through the Yandex.Webmaster service).

If most of the pages have dropped out of the index and only the main page (or the main page and a few internal ones) remains, the problem is most likely the AGS filter: Yandex has judged your site's content to be non-unique or unattractive to users. The content needs to be reworked.

In addition to applying filters, Yandex may not index the site for the following reasons:

  1. Indexing was accidentally disabled in the robots.txt file. Check its contents.
  2. The robot cannot access the site because of unstable hosting, or the robot's IP address was accidentally banned (for example, the site repelled a DDoS attack and the search robot was mistaken for an attacker).
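For reference, here is what an accidental full ban looks like in robots.txt, compared to a harmless configuration (a sketch; the paths and sitemap URL are placeholders):

```
# BAD: this single rule closes the entire site for all robots
User-agent: *
Disallow: /

# OK: only service sections are closed, everything else stays crawlable
User-agent: *
Disallow: /admin/
Disallow: /search/
Sitemap: https://example.com/sitemap.xml
```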

2. Yandex does not index pages on the site

This problem occurs more often than an entire site dropping out of the index. Typical reasons why Yandex may not index a page include:

  1. The search robot does not reach the page because of a confusing site structure or deep nesting. Try to make any page of the site reachable within two clicks. Create a sitemap.
  2. The page has little unique, interesting content, and the search engine does not consider it useful to visitors. Check the uniqueness of the text, rework it, make it more useful.
  3. The number of pages on the site exceeds the limit the search engine allocates to the project. For example, Yandex is ready to index 10 thousand pages from your site, but your project has 15 thousand; in that case, 5 thousand pages will not make it into the index. Try to develop the project to increase the limit, and close unnecessary service pages from indexing to free up room in the index for the pages you promote.
  4. There are no links to the page, so the search engine cannot find it.
  5. The page is prohibited from indexing in the robots.txt file. Check the contents of the file.
  6. The page contains a noindex directive in the robots meta tag, so search engines do not index it. Check the page code.
  7. The site menu is built in Flash, which search engines do not process, so Yandex does not index pages whose links are hidden in Flash. Create an alternative menu whose links are accessible to search robots.
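Point 6 in the list above is easy to verify programmatically. Below is a minimal sketch in Python (standard library only; the function names and the sample page are my own, for illustration) that reports whether a page's robots meta tag blocks indexing:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                content = attrs.get("content") or ""
                self.directives += [d.strip().lower() for d in content.split(",")]

def is_blocked_by_meta(html: str) -> bool:
    """True if the page carries a noindex (or none) robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives or "none" in parser.directives

# Demo on a made-up page fragment:
page = '<head><meta name="robots" content="noindex, nofollow"></head>'
print(is_blocked_by_meta(page))  # True
```

In practice you would fetch the page HTML first and feed it to the same check.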

From SiteClinic.

Poor site indexing is one of the serious problems of webmasters. Why is the site or its individual pages still not in the index? To answer this question, you need to do a little analysis. Below is a list of the main reasons for poor indexing, which can be used as a checklist to solve the problem.

Main reasons

There are 5 main reasons why a site or its individual documents may not be included in the index (or have difficulty doing so). Here they are:

— The robot does not know about the site or document
— The site or part of it is inaccessible to the robot
— The site is blacklisted
— There is a technical error
— Some pages or sections show poor quality

Each item listed above is described in detail below.

1. The robot does not know about the site/document

The robot may not know about the site (not include it in its schedule) for various reasons.

- Not much time has passed

For the robot to learn about a site or a new page, time must pass until it finds a link to it (internal or external), until someone visits the site from a browser with a toolbar installed, or until you deliberately notify the robot about the new document. You can speed up getting a document into the indexing schedule using .

Also, if you can already see from the server logs that a robot has visited the site but the pages have not appeared in the index, you need to wait for an update, which in Yandex happens no more than twice a week (and around holidays can slow to once every 2-4 weeks). Pages can get into the main index within a few hours in Google, but it takes at least 2 weeks in Yandex (if they get in earlier, a quick bot has most likely crawled the document, and the pages may temporarily leave the index before the main robot arrives).

— There are no links to the site/document

If the site is rarely updated, the robot will also visit it rarely. When adding new pages, make sure they are linked from the main page, or place links from other external resources (to prompt the robot about the new documents).

2. The site or part of it is inaccessible to the robot

Even if the search engine already knows about the site, we can consciously or unconsciously deny it access to certain sections and documents.

— The domain is not delegated (or removed from delegation due to a complaint)

Make sure that the domain you purchase is delegated and accessible by domain name not only to you, but also to other network users. Ask your friends from another city to visit the site and check if it opens.

— Slow document upload speed

Slow document delivery speed due to hosting or CMS problems will not allow the robot to quickly index the site. It will continue to scan it, but not as quickly as if the documents were sent instantly. Simply optimizing your site's loading speed can significantly improve its indexing.

Of course, there are other reasons for poor site indexing. If none of the above applies to your case, contact the search engine's support service or turn to specialists.

If you have encountered any other problems that prevent your site from being indexed properly, share them in the comments!

Many users, especially beginners, ask the same questions on various forums and websites: why are pages not indexed by Yandex and Google, why is the site poorly indexed, or why has the site stopped being indexed at all? Huge disputes unfold around this, since there can be many reasons, and it is impossible to give an answer right away; you have to analyze your resource and identify the specific cause.

Therefore, let's look at the most common reasons why your site may not be indexed, or may fall out of the index completely.

Why is the site not indexed by search engines?

Speed up indexing. Do not forget that for your site to be indexed faster, you need to add it to social bookmarking and similar services.

You can also submit a link through the search engine's Add URL form, although some say that in this case the search engine will put your site at the end of the indexing queue. Still, if a search robot has not visited your site for a long time, it is worth doing.

The robots.txt file. Perhaps your robots.txt file is composed incorrectly, so review it (a page, or even the entire site, may be closed from indexing in it).

Meta tags. Perhaps your page is blocked from indexing by meta tag directives such as none, noindex, nofollow, and so on. Check and correct this.

A virus on the site. It may be that a virus has settled on your site, and as a result it has dropped out of the search results or is not indexed at all, so it is worth checking the site for malware.

Structure. Your website structure may be wrong: the three-click rule may be broken, that is, getting to a specific article (page) on the site takes more than three clicks.

Hosting. It may be that your hosting provider is carrying out some maintenance, so the search engine (the search robot) cannot access the site and errors of this kind occur. If the Yandex search robot cannot access the site, investigate the cause.

Domain. It may also be that the domain you bought was previously sanctioned by search engines. Check a domain name's history before purchasing it.

Site content. The most common reason why your site is not indexed is that it contains non-unique content. You should write high-quality and unique content for the site, and not copy from others.

How to check a page in the search engine index?

There are many options and services, but the simplest is to copy part of the article (its title or URL), paste it into the search bar, and run the search. If your page appears on the first pages of results, it is indexed; if not, then alas.

From the author: you write useful and interesting content. Or maybe it’s useless and uninteresting - that’s your business. But why is the site not indexed in Yandex for a long time? Let's look at all the main reasons.

Look at this picture. Yes, you have probably seen it more than once. Indeed, everything can be found in Yandex. But what is not indexed will never be found, because the search engine simply does not know about it. And if so, the issue of indexing is very important for any webmaster, site owner, and optimizer. Today I will cover at least a few reasons why you may have problems with indexing.

The first and most likely reason is the age of the site.

The fact is that most webmasters concerned about fast indexing are owners of young sites. I am confident of this because I myself developed two sites from scratch from the very first days of their existence. You need to understand that Yandex treats newly hatched sites with distrust, and one cannot blame it for this: too many junk sites (GS) have appeared.

So, the younger your site, the lower the chance of its pages being indexed quickly. As a rule, the home page gets into the index more or less quickly, since it has the highest priority, but the rest will not enter the index as fast.

If you want at least the main page indexed, add the site to Yandex.Webmaster if you have not already done so. Also submit the project URL via the Add URL form ("Report a new site"). After confirming your rights to the site, I also recommend immediately doing the following:

Upload an XML sitemap in Webmaster (this can directly improve indexing, since the search robot will be better aware of all available pages).

Upload and check the robots.txt file for errors.

Set up the site's main mirror so that there are no duplicates with and without www.

These steps will directly or indirectly affect indexing speed. Once you have done all this, all that remains is to wait for indexing, submitting every new project URL through the same Add URL form.
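The XML sitemap mentioned above is a simple file. A minimal sketch looks like this (example.com, the paths, and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/articles/first-post/</loc>
    <lastmod>2024-01-20</lastmod>
  </url>
</urlset>
```

Most CMS plugins generate such a file automatically; you only need to submit its URL in Webmaster.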

Okay, maybe you have done all this and still cannot wait for indexing to happen. Is there really no way to speed up the process? Here you need to understand that much does not depend on you. Yandex indexes much more slowly than Google by default, especially for young sites. Usually new pages of young sites appear in the index within 1-3 weeks; consider this time frame quite normal.

I just want to convey to you that if the reason for slow indexing is the age of the site, then you shouldn’t worry too much. Over time, your project will grow and turn from a baby into an ordinary Internet resource, well known to search engines. Then the time will come when new materials will fly into the index quickly, within a few days.

I noticed this on two young sites that I promoted: when a site's age exceeded 6 months, the speed at which a new article entered the index increased. According to Yandex itself, the longest indexing period is 2 weeks. If that time passes and there is still nothing in the index, there may be other problems or errors. Of course, you should not just wait 2 weeks; it is better to check everything as soon as you can. Next I will consider the most common mistakes and factors due to which a site or its individual pages may not be included in the index.

Prohibition of indexing in robots.txt

This is an important and useful file, but it may contain errors if you wrote the instructions yourself or took them from the first site you came across. The most important thing is that there is no such command: Disallow: /

Also check what specific directives are given for the Yandex bot (User-agent: Yandex). The problem is unlikely to be in this file, but it would be careless not to check it.

Using the robots meta tag

By the way, you may not find any dangerous directives in robots.txt itself that would block the robot from the site or a specific page. But in addition to it there is also the robots meta tag, with which you can block a specific page from indexing, as well as the links on it.

<meta name="robots" content="noindex, nofollow">

For example, this piece of code clearly indicates that the page will be closed. Check whether you have it somewhere in your header. If you want to close all links but leave the text open (the page will then be indexed), use the parameters index, nofollow; if you want to open everything, use index, follow.
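For reference, the common combinations of these parameters look like this (a sketch; the tag belongs inside the page's head section):

```html
<!-- page and its links are both closed -->
<meta name="robots" content="noindex, nofollow">

<!-- page is indexed, but its links are not followed -->
<meta name="robots" content="index, nofollow">

<!-- everything open (equivalent to omitting the tag entirely) -->
<meta name="robots" content="index, follow">
```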

Prohibition of indexing in engine settings

As far as I remember, when installing WordPress you can check a box asking search engines not to index the site. This can be useful at first for young sites under development, so that test pages of all kinds do not end up in the index. You may have checked this box accidentally during installation or later in the settings. In that case, just uncheck it.

Incorrect use of noindex tag

The noindex tag is useful, but dangerous if used incorrectly. It lets you block text from indexing: text placed inside it is not perceived by the search engine. Accordingly, if you accidentally wrap an entire article in noindex, it will not be indexed, since there is nothing left to index (everything is closed).

You are unlikely to make such a mistake in an article, but the tag can creep into the template files themselves. Perhaps it opens somewhere in your header.php and is never closed anywhere, producing the comical situation that everything is closed. In other words, check your site's code. You do not have to be a programmer or layout designer to do this.

You can use a program like Archivist to quickly search across many files (enter noindex), or use standard file search. The developer tools (F12 in Google Chrome) can also help, though that is more for developers. Additionally, check each specific page. Could it be that you wanted to close some non-unique piece of text with noindex but did not close the tag in the right place?
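The unclosed-tag mistake described above can also be caught with a few lines of Python (a sketch; the function name and sample template are my own):

```python
import re

def noindex_balance(source: str) -> int:
    """Return the number of <noindex> opening tags minus closing ones.
    0 means balanced; a positive value means an unclosed <noindex>
    that may hide everything after it from the search engine."""
    opens = len(re.findall(r"<noindex[\s>]", source, re.IGNORECASE))
    closes = len(re.findall(r"</noindex\s*>", source, re.IGNORECASE))
    return opens - closes

template = "<div><noindex>ad block</noindex><noindex>oops, never closed"
print(noindex_balance(template))  # 1
```

Run it over header.php and other template files: any result other than 0 is worth investigating.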

Search engine filters

In this case we are interested in filters from Yandex, although you should understand that Google has them too, and they may likewise be the reason your site is not included in that search engine's index.

In particular, all young sites sit under the sandbox filter. This is not a problem; it is normal. Just as little children play in a sandbox, young and green Internet projects end up in one too. There is no need to panic: just know that the indexing speed of young sites is lower. I have already said this above.

But the sandbox is a temporary measure for the search engine, and in general not a terrible filter, because it applies to everyone. Sanctions for unfair promotion methods are another matter. Most often this means hidden or blatant keyword spam in the text and in meta tags. In other words, play by the search engines' rules and you will not have such problems.

Bad domain history

This point is relevant for those who bought a domain and started developing their new project on it, or for those who registered a domain abandoned by the previous owner.

Of course, before purchasing any domain, I recommend that you do some research. This can partly be done with services such as WebArchive (site snapshots) and linkpad.ru (link mass analysis). From the snapshots you can determine what the site's topic was and what kind of content it carried; the link mass also gives a rough idea of the topic. Also check the domain for search engine filters.

In general, this little analysis will save you from buying a domain with a bad history. If you have nevertheless purchased one, its standing may be restored over time, perhaps with the help of requests to technical support. But it is better to start from scratch than like this.

Site language

You must understand that Yandex is primarily focused on the CIS countries. Getting pages of English, German, and other foreign-language sites indexed is quite possible, but only if the content is of high quality. For foreign-language sites there are Google and many other search engines, so go there first.

The percentage of uniqueness of your texts

There is another good reason why your site may not be indexed by Yandex: non-unique content. For example, you copied text from another source and pasted it into your own site without changing it in any way. Practice shows that Yandex may not index completely non-unique content at all and may ignore pages with less than 50% uniqueness for a long time. It is better to keep uniqueness at 90% or higher.

To better understand why non-unique content has a much lower chance of being indexed successfully, you need a rough idea of the algorithm by which new pages get into the index. In particular, before a page enters the index, its content undergoes some analysis, and if it turns out to be of low quality, the page may simply not be included. To get it through, improve the quality of the text.

Server errors

This is the last reason we will consider, and it is purely technical. One of the indicators of the quality and reliability of your resource is that it is constantly available online, 24 hours a day, every day. Of course, there may be minor glitches, and that is normal. But when your site goes down for several hours (or, even worse, days), the search robot, having arrived at the new pages, simply cannot open them and is forced to leave.

As a result, even pages that were already in the index may fall out of it. How do you prevent this? First of all, choose a good hosting provider and a more or less decent tariff. There is no point in skimping on your website: it is better to pay 300-500 rubles a month than 50.

Also, if you buy VPS/VDS plans, you will most likely have to configure the server yourself: choose the virtual machine's OS, install some utilities, and so on. If you are completely new to server setup, it is better not to buy a VDS yet, or to choose a tariff that includes technical support from the provider.

In particular, such problems include all response codes 4** and 5** (three-digit codes beginning with a four or a five). For example, an error may occur because database queries are not optimized, especially if dozens or hundreds of visitors are browsing the site while you have few resources.
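When scanning server logs for the codes just mentioned, a tiny helper makes the ranges explicit. A minimal sketch in Python (the function name and wording of the labels are my own):

```python
def describe_status(code: int) -> str:
    """Rough classification of HTTP response codes seen in server logs."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error (4xx): the page is unavailable and may drop out of the index"
    if 500 <= code < 600:
        return "server error (5xx): fix hosting or the database before the robot returns"
    return "other"

# Demo on a few typical codes:
for code in (200, 301, 404, 503):
    print(code, describe_status(code))
```

A robot that repeatedly receives 5xx answers will slow its crawling, so the 5xx lines in the log deserve attention first.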

Bottom line

Well, we have covered all the main reasons. Now you know what to do when a site is not indexed in Yandex. If you have checked absolutely everything and found no errors or problems, then you just need to wait. As I already said, Yandex does not index as quickly as Google: the time to enter the index ranges from 1 day to 3 weeks.

You can find other tips, not only on indexing but on blog promotion in general, in . Believe me, there are quite a lot of these tricks, so tomorrow (if you suddenly start studying the course) you will be shocked at how many things you did not know!

But knowledge saves us from trouble when promoting a website. If so, read Webformyself, subscribe to us wherever it is convenient for you, and see you soon!


15.2 What to do if the site is not indexed.

The site is not indexed. What to do?
I have written many times that a good site will not run into indexing problems, and yet it was my own Crimea-Blog, with which I illustrate all the examples in this textbook, that refused to be indexed by Google for a very long time. So I have thoroughly studied the question in this chapter's title in practice.

Crimea-Blog was created on the Joomla CMS, which powers millions of resources. No bells and whistles, unique texts, good links: all the conditions for perfect indexing. And despite this, for the first two months Google added one page every three days.

At the same time, paradoxically, positions and traffic from Google were already there, yet only 3-5 pages were in the index. It defies logic, but that is how it was.

So what should you do if your site is not indexed? First of all, do not panic. Studying forums with similar questions, I saw that the problem is more than solvable. What is good about forums is that you read a year-old cry for help ("the site has not been indexed for 3 months!"), go to the search engine, check the indexing of the resource in question, and see that everything is already in order, meaning the webmaster solved the problem. This was the case in almost every instance, which undoubtedly inspires optimism.

1. Publishing a link on a highly visited site. It works flawlessly: the published page flies into the index a couple of hours after publication. The disadvantage is that it is very expensive, and only the specific link we posted is guaranteed to be indexed.

This may or may not encourage indexing of other pages.

The disadvantages are the same: if the accounts are good, publishing on them costs money, though still significantly less than the first method.

3. A run through social bookmarking services. I have not tried it myself, but they say it also speeds up indexing well.

Advantages: cheap. Disadvantages: social bookmarking services are insanely spammed, and I do not really enjoy publishing my resources in such an environment, although there should be no harm from it.

4. Fetch as Googlebot. A great method for Google. Go to the webmaster panel, section Crawling -> Fetch as Google. There you can send a request for forced indexing of the selected page. In less than a day, the page is almost guaranteed to appear in the index.


There is a limitation here: you cannot send more than 10 pages at a time. True, this quota is renewable: once you have used up the entire limit, the opportunity becomes available again after about a week.

When you click on the “send to index” button, you have a choice between “send this page” and “send this page and all related pages”. In the second case, we ask Google to index not only the given URL, but also follow links from it to other sections of the site.


In my experience, the submitted page is always indexed, but Google may not follow the links on it.

5. XML sitemap. Sitemaps appeared long ago; their original goal was to make a site easier to navigate. The map usually contained structured information about the resource's sections and made it possible to jump to them quickly.

A sitemap is also a benefit for search engines, because links to all pages are collected in one place.

In my opinion, this feature is mega-useful for sites with tens of thousands of pages. If you only have a couple hundred of them, then you don’t have to bother with an xml sitemap.

But judging by the reviews, adding a sitemap allows you to speed up indexing. My personal experience doesn’t confirm this, but adding an xml map definitely won’t make things worse.

Well, I’ll repeat my final advice once again - don’t rush, sooner or later search engines will definitely index your resource.

My book was published in paper version. If this tutorial turned out to be useful for you, then you can thank me not only morally, but also in quite tangible ways.
To do this you need to go to
