How to block search bots from accessing your site?

To block all bots from accessing your site, create a robots.txt file in your site's document root with the following content:

User-agent: *
Disallow: /
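
If you want to confirm how crawlers that honor robots.txt will interpret these rules, a quick sanity check can be run with Python's standard urllib.robotparser module. This is a minimal sketch; example.com stands in for your own domain:

from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /" in place, no URL on the site may be crawled.
print(rp.can_fetch("*", "https://example.com/"))           # False
print(rp.can_fetch("*", "https://example.com/any/page"))   # False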

To block all bots from a specific folder only, use a robots.txt file with the following content instead:

User-agent: *
Disallow: /folder/
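
The same kind of check, assuming the folder is named /folder/ as in the example above, shows that only URLs under that folder are blocked while the rest of the site remains crawlable:

from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /folder/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Only paths under /folder/ are disallowed; everything else is still allowed.
print(rp.can_fetch("*", "https://example.com/folder/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/about.html"))        # True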

Please note that blocking all bots with Disallow: / will eventually cause your site to be removed (deindexed) from search engine results.

