For a site to be indexed correctly, you should use a robots.txt file. It lets you hide individual folders or files on your site from search bots, which keeps unnecessary pages out of the index and reduces the time a bot spends crawling the site. For sites on WordPress, a correct robots.txt looks like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-trackback
Disallow: /wp-feed
Disallow: /wp-comments
Disallow: /wp-content/plugins
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: */trackback
Disallow: */feed
Disallow: /cgi-bin
Disallow: *?s=
Host: you-site.ru
Do not forget to replace you-site.ru in the last line with your own domain (the Host directive is read only by Yandex; other search engines ignore it). Save these settings in a file named robots.txt and copy it to the root folder of the site. We also recommend reading the article about an alternative way to create robots.txt for WordPress, where each line is explained in detail and several working variants are given.
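Before uploading the file, you can check locally that the rules behave as expected. Here is a small sketch using Python's standard urllib.robotparser module; the domain you-site.ru and the test URLs are just placeholders from the example above:

```python
from urllib.robotparser import RobotFileParser

# A shortened version of the rules from the example above
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked by the Disallow: /wp-admin/ rule
print(parser.can_fetch("*", "https://you-site.ru/wp-admin/"))   # False
# An ordinary post URL is not matched by any rule, so it is allowed
print(parser.can_fetch("*", "https://you-site.ru/some-post/"))  # True
```

Note that robotparser checks only the standard User-agent, Disallow and Allow directives; non-standard lines such as Host are simply ignored, so they will not affect the check.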