Google is a politically active organisation which truly embodies the "Big Brother" of 1984: invading everyone's life and manipulating society towards its own vile politics and profit.
Before I realised that their political manipulation of search results wasn't just a mistake but an intentional policy, I tried to get my sites onto their vile search engine. Today, I've decided to try to block big-brother-google from accessing my site instead.
If you're not convinced, bing "Project Veritas", where a google insider reveals they planned to stop Trump getting re-elected. And this is not one odd employee dabbling to bury one site: it is internal policy at big-brother-google to bury ALL sites it politically dislikes.
So I've been looking into how to block Google.
Robots.txt
The simplest solution is to add a file called "robots.txt" to your site's root directory and insert into it the text:
User-agent: Googlebot
Disallow: /
This, however, is only a request to Google to F-off, and they still list the site when they feel like it.
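Googlebot is only one of big-brother-google's crawlers. A fuller robots.txt sketch, using the user-agent tokens Google itself documents for its image crawler, its ads crawler, and Google-Extended (the token it offers for opting out of AI training), might look like:

```
User-agent: Googlebot
Disallow: /

User-agent: Googlebot-Image
Disallow: /

User-agent: AdsBot-Google
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Note that, per Google's own documentation, AdsBot-Google ignores general rules unless it is named explicitly — which tells you how much the rest is worth.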
Meta tag
Another solution is to add a meta tag:
<meta name="robots" content="noindex, nofollow">
According to a big-brother-google apparatchik, this means "Google will completely drop the page from our search results, even if other pages link to it." However, this is again only a request to big-brother-google to F-off, and it won't stop them gaining access to your site to train their AI bots to recognise the sites of political opponents, which they will then bury.
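If you can't easily edit your pages' HTML, the same noindex request can be sent as an HTTP response header from .htaccess instead — a sketch assuming Apache with mod_headers enabled, and still just as much of a polite request as the meta tag:

```
<IfModule mod_headers.c>
    # Sends the same "noindex, nofollow" signal as the meta tag,
    # but as an X-Robots-Tag header on every response
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```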
htaccess
A better way to do it is to use htaccess. This tells your own server to check the user agent sent with each request and, if it corresponds to Googlebot, to answer with a "not found" (although a google F-Off might be more appropriate). This doesn't rely on the "good faith" of big-brother-google, which is like asking the wicked witch of the east to be a child minder.
To block them, put the following code in your .htaccess file:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [R=404,L]
It is, however, easy for Google to simply change the user agent. And they will start doing this if a lot of people start blocking them, in order to gather a "signature" of these sites, which will mean they then just bury any other similar site. So an even better way is to block all Google IP addresses, which, if you bing it, you will find are something like:
- 64.233.160.0 – 64.233.191.255
- 66.102.0.0 – 66.102.15.255
- 66.249.64.0 – 66.249.95.255
- 72.14.192.0 – 72.14.255.255
- 74.125.0.0 – 74.125.255.255
- 209.85.128.0 – 209.85.255.255
- 216.239.32.0 – 216.239.63.255
These can often be blocked through your control panel. This means that even if the user agent is changed to hide that it's google bugging you, the requests are still blocked by IP address. The problem is that they can easily obtain new IP addresses outside these ranges.
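If your control panel doesn't offer IP blocking, the same ranges can be denied from .htaccess. A sketch assuming Apache 2.4, with the ranges above rewritten in CIDR notation:

```
# Allow everyone except the Google ranges listed above
<RequireAll>
    Require all granted
    Require not ip 64.233.160.0/19
    Require not ip 66.102.0.0/20
    Require not ip 66.249.64.0/19
    Require not ip 72.14.192.0/18
    Require not ip 74.125.0.0/16
    Require not ip 209.85.128.0/17
    Require not ip 216.239.32.0/19
</RequireAll>
```

On Apache 2.2 and earlier you would use the old `Order allow,deny` / `Deny from` directives instead.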
Password protection
The only secure way to block google entirely is to password protect your site.
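A minimal sketch of htaccess password protection, assuming Apache; the AuthUserFile path here is hypothetical, and the .htpasswd file must be created separately (e.g. with Apache's htpasswd utility):

```
AuthType Basic
AuthName "Keep out, big-brother-google"
# Hypothetical path - point this at your real .htpasswd file,
# kept outside the web root
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Since crawlers have no password, this is the one method that doesn't depend on Google's good faith at all — at the cost of locking out your human readers too.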
My Choice
After finding that I can’t add a meta tag without adding another plugin (Yoast SEO), I’ve decided to add a simple robots.txt