How to Block Bad Bots in WordPress Without Plugins
Bad bots can slow down your site, steal content, and even compromise security. While plugins like Wordfence can help, you can effectively block malicious bots without plugins using server-level configurations. Here’s how:
Why Block Bad Bots Manually?
✅ Reduces server load (bots consume bandwidth)
✅ Improves site speed (fewer fake visits = faster performance)
✅ Enhances security (blocks scrapers, spam bots, and hackers)
✅ No plugin overhead (avoids PHP processing delays)
Method 1: Block Bots via .htaccess (Apache)
Edit your .htaccess file (located in the WordPress root) and add:
# Block known bad bots
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12bot|DotBot|BLEXBot|rogerbot) [NC]
RewriteRule .* - [F,L]

# Block fake Google/Bing bots. The conditions are ANDed (no [OR] flags):
# block only when the address matches none of the trusted prefixes.
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot) [NC]
RewriteCond %{REMOTE_ADDR} !^66\.249\.
RewriteCond %{REMOTE_ADDR} !^157\.55\.39\.
RewriteCond %{REMOTE_ADDR} !^207\.46\.13\.
RewriteRule ^ - [F,L]
How it works:
- Blocks SEO scrapers (Ahrefs, SEMrush, MJ12bot)
- Stops fake Google/Bing bots (real ones have specific IPs)
- Returns 403 Forbidden for blocked bots
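The decision those rules encode can be sketched in plain Python, which makes it easy to reason about before touching the server. The trusted IP prefixes are the ones whitelisted above; note they cover only a slice of the search engines' real ranges:

```python
# Sketch: the blocking decision the .htaccess rules above implement.
import re

BAD_BOTS = re.compile(r"AhrefsBot|SemrushBot|MJ12bot|DotBot|BLEXBot|rogerbot", re.I)
CLAIMS_SEARCH_ENGINE = re.compile(r"Googlebot|Bingbot", re.I)
# IP prefixes treated as genuine Google/Bing in the rules above
TRUSTED_PREFIXES = ("66.249.", "157.55.39.", "207.46.13.")

def should_block(user_agent, remote_addr):
    """Return True if the request would get a 403 from the rules above."""
    if BAD_BOTS.search(user_agent):
        return True
    if CLAIMS_SEARCH_ENGINE.search(user_agent):
        # Claims to be Google/Bing but comes from an untrusted address
        return not remote_addr.startswith(TRUSTED_PREFIXES)
    return False

print(should_block("Mozilla/5.0 (compatible; AhrefsBot/7.0)", "1.2.3.4"))  # True
print(should_block("Googlebot/2.1", "66.249.66.1"))                        # False
print(should_block("Googlebot/2.1", "203.0.113.9"))                        # True
```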
Method 2: Block Bots via nginx.conf (Nginx)
If you use Nginx, add this to your server block:
# Block bad bots
if ($http_user_agent ~* (AhrefsBot|SemrushBot|MJ12bot|DotBot|BLEXBot|rogerbot)) {
    return 403;
}

# Block fake Google/Bing bots. Nginx does not allow one "if" inside
# another, so the two checks are combined through a variable.
set $block_bot "";
if ($http_user_agent ~* (Googlebot|Bingbot)) {
    set $block_bot "1";
}
if ($remote_addr ~ ^(66\.249\.|157\.55\.39\.|207\.46\.13\.)) {
    set $block_bot "";
}
if ($block_bot = "1") {
    return 403;
}
Method 3: Block Bots via wp-config.php (PHP-Level)
Add this to wp-config.php (before the /* That's all, stop editing! */ line):
// Block bad bots via PHP
if ( isset( $_SERVER['HTTP_USER_AGENT'] )
    && preg_match( '/AhrefsBot|SemrushBot|MJ12bot|DotBot|BLEXBot|rogerbot/i', $_SERVER['HTTP_USER_AGENT'] ) ) {
    header( 'HTTP/1.0 403 Forbidden' );
    exit;
}
Method 4: Block Bots via robots.txt (Discourage Crawling)
Add this to your robots.txt (in the WordPress root):
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /
Note: This doesn’t block bots but tells well-behaved ones to stay away.
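You can sanity-check robots.txt rules like these before deploying them with Python's built-in parser. A minimal sketch (the rules string is a trimmed example):

```python
# Sketch: verify robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("AhrefsBot", "/some-post/"))  # False: AhrefsBot is disallowed
print(parser.can_fetch("Googlebot", "/some-post/"))  # True: other bots fall through to *
```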
List of Bad Bots to Block
Bot Name | Type | Why Block? |
---|---|---|
AhrefsBot | SEO scraper | Steals backlink data |
SemrushBot | SEO scraper | Scrapes keywords |
MJ12bot | Malicious crawler | Aggressive scanning |
DotBot | Content scraper | Steals articles |
BLEXBot | Backlink checker | Slows down sites |
Rogerbot | SEO crawler | High server load |
Best Practices for Blocking Bots
✔ Update regularly – New bots emerge frequently
✔ Monitor logs (/var/log/nginx/access.log or cPanel > Metrics > Visitors)
✔ Whitelist good bots (Googlebot, Bingbot, Applebot)
✔ Use Cloudflare Firewall (Extra layer of protection)
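A quick way to act on the "monitor logs" advice is to tally bad-bot hits straight from the access log. A minimal sketch; the sample lines stand in for real log entries:

```python
# Sketch: count hits per known bad bot in an access log.
# The log lines below are illustrative samples in common log format.
import re
from collections import Counter

BAD_BOTS = re.compile(r"AhrefsBot|SemrushBot|MJ12bot|DotBot|BLEXBot|rogerbot", re.I)

log_lines = [
    '1.2.3.4 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"',
    '5.6.7.8 - - [01/Jan/2025:00:00:02 +0000] "GET /feed HTTP/1.1" 200 900 "-" '
    '"Mozilla/5.0 (compatible; SemrushBot/7~bl; +http://www.semrush.com/bot.html)"',
    '9.9.9.9 - - [01/Jan/2025:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (Windows NT 10.0) Chrome/120"',
]

hits = Counter()
for line in log_lines:
    match = BAD_BOTS.search(line)
    if match:
        hits[match.group(0)] += 1

print(hits)  # one AhrefsBot hit, one SemrushBot hit; the Chrome visit is ignored
```

On a live server you would read the lines from /var/log/nginx/access.log instead of the sample list.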
🚨 Warning:
- Don’t block legit crawlers (Googlebot, Bingbot) – Verify IPs first!
- Test changes – use a User-Agent switcher extension or curl -A "AhrefsBot" to confirm blocking works
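One way to test without a browser extension is to send a request that spoofs a bot's user agent and check the status code. A sketch using only Python's standard library (example.com is a placeholder for your own site):

```python
# Sketch: probe a site with a spoofed bot user agent.
# example.com is a placeholder -- point this at your own site.
import urllib.request

def make_probe(url, user_agent):
    """Build a request that impersonates the given user agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

req = make_probe("https://example.com/", "Mozilla/5.0 (compatible; AhrefsBot/7.0)")

# To actually send it (expect HTTP 403 if the block is working):
#   try:
#       resp = urllib.request.urlopen(req)
#       print(resp.status)    # 200 here means the rule is NOT working
#   except urllib.error.HTTPError as e:
#       print(e.code)         # 403 means the block is active

print(req.get_header("User-agent"))  # the spoofed agent string
```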
Final Verdict: Which Method is Best?
Method | Effectiveness | Difficulty | Best For |
---|---|---|---|
.htaccess | ⭐⭐⭐⭐⭐ | Easy | Apache servers |
Nginx config | ⭐⭐⭐⭐⭐ | Medium | Nginx users |
PHP (wp-config.php) | ⭐⭐⭐ | Easy | Quick fixes |
robots.txt | ⭐⭐ | Easiest | Non-aggressive blocking |
Recommended Approach:
1. Start with .htaccess/nginx.conf (most effective)
2. Add robots.txt rules (for compliant bots)
3. Monitor traffic & update your blocked bots list
🚀 Pro Tip: Combine with Cloudflare Bot Fight Mode for extra security!
Need help? Drop your questions below! 👇