A few days ago, I discovered that my website was being hit by a large number of targeted, malicious scans from certain IP addresses, which were probing for internal configuration files and other site information by brute force. I used .htaccess to mitigate the attack, adding the following configuration to the .htaccess file:

```apache
order allow,deny
deny from 180.97.106.
allow from all
```

.htaccess is a very powerful configuration file for a website. The better you understand its capabilities, the easier it is to control your site's configuration. Blocking a given IP from accessing the site is one of its basic functions, and the configuration above is just one usage. Below I summarize more usages on this topic.

Block access to specified IPs

```apache
order allow,deny
deny from 192.168.44.201
deny from 224.39.163.12
deny from 172.16.7.92
allow from all
```

The code above blocks three different IP addresses from accessing the website.

Block access to specified IP ranges

If you have many IPs to ban and specifying them one by one is too tedious, here is how to ban a whole range at once:

```apache
order allow,deny
deny from 192.168.
deny from 10.0.0.
allow from all
```

Block access to specified domain names

```apache
order allow,deny
deny from some-evil-isp.com
deny from subdomain.another-evil-isp.com
allow from all
```

The code above blocks access to the website from a specific ISP. Note that `deny from <hostname>` requires Apache to perform reverse DNS lookups on incoming requests, which adds some overhead.

Use .htaccess to block bots and spiders

In China, I think the only search engines you really need are Google and Baidu; smaller ones, such as Sogou and 360, can be ignored. Crawlers from these unimportant search engines bring you no benefit and can overwhelm your website. Here is how to block one:

```apache
# get rid of the bad bot
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule ^(.*)$ http://go.away/
```

The above blocks a single type of crawler.
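As a variant (a sketch, assuming the same `BadBot` user agent as above), the rule can return a 403 Forbidden response directly via the `[F]` flag instead of redirecting the bot elsewhere:

```apache
# refuse the bad bot outright with a 403 Forbidden response;
# "-" means no substitution, [F] forces the forbidden status
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule ^ - [F]
```

Returning 403 wastes less of your bandwidth than a redirect, since the response carries no Location target for the bot to follow.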
If you want to block multiple crawlers, you can configure .htaccess like this:

```apache
# get rid of bad bots
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [OR]
RewriteCond %{HTTP_USER_AGENT} ^FakeUser
RewriteRule ^(.*)$ http://go.away/
```

This code blocks three different crawlers at the same time. Note the "[OR]" flag: every condition except the last needs it, because consecutive RewriteCond directives are ANDed together by default.

Use .htaccess to disable hotlinking

If your website is popular, people will inevitably like the pictures, videos, and other resources on it. Some of them, lacking professional ethics, will embed those resources directly into their own pages, wasting your bandwidth and affecting your server's stability. Such hotlinking is easy to block with .htaccess, as shown below:

```apache
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC]
RewriteRule .* - [F]
```

After adding the above code to .htaccess, when somebadforum.com hotlinks your resources the server returns a 403 Forbidden error, and your bandwidth is no longer wasted. Here is how to block multiple sites:

```apache
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*example\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*lastexample\.com [NC]
RewriteRule .* - [F]
```

As you can see, .htaccess is a very powerful web server configuration tool that gives you rich, flexible control over the web server. The solutions are usually simple and elegant, and they take effect immediately without restarting the server. If you don't have this configuration file on your server, create one now!
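One caveat worth noting: the `Order`/`Allow`/`Deny` directives used above come from Apache 2.2 and are only available in Apache 2.4 through the deprecated mod_access_compat module; the modern replacement is mod_authz_core's `Require` directive. A rough Apache 2.4 equivalent of the IP-blocking rules (a sketch, reusing the example addresses from above) would be:

```apache
# Apache 2.4+ equivalent using mod_authz_core:
# allow everyone except the listed address and range
<RequireAll>
    Require all granted
    Require not ip 192.168.44.201
    Require not ip 10.0.0.0/8
</RequireAll>
```

Unlike the 2.2 syntax, `Require ip` accepts CIDR notation directly, so whole ranges no longer need the trailing-dot prefix trick.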