How do I block search bots from accessing my site?

To block all bots from accessing your site, create a file named robots.txt in the root directory of your website with the following content:

User-agent: *
Disallow: /
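
Once this file is in place, any crawler that honors the Robots Exclusion Protocol will skip every URL on the site. As a quick sanity check, here is a minimal sketch using Python's standard urllib.robotparser module; the example.com address is only a placeholder for your own domain:

from urllib import robotparser

# Parse the two-line robots.txt that disallows everything for every bot.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Any URL on the site is now reported as blocked, regardless of the crawler name.
print(rp.can_fetch("Googlebot", "https://example.com/any-page.html"))  # False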

To block all bots from a specific folder only, while leaving the rest of the site crawlable, use the following content instead:

User-agent: *
Disallow: /folder/
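
With this variant, only URLs under /folder/ are off-limits and the rest of the site stays crawlable. The same kind of sketch confirms it (again, example.com is just a placeholder):

from urllib import robotparser

# Parse the rules that block only the /folder/ path for every bot.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /folder/"])

print(rp.can_fetch("Googlebot", "https://example.com/folder/page.html"))  # False: inside the blocked folder
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))        # True: the rest of the site is allowed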

Please note that blocking all bots will cause your site to be removed from search engine results over time. Also keep in mind that robots.txt is only a request: well-behaved crawlers respect it, but it does not enforce access restrictions.

