<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>
	Comments on: Blocking Baidu and Yandex Search Spiders	</title>
	<atom:link href="https://sonet.digital/blog/seo/blocking-spiders/feed/" rel="self" type="application/rss+xml" />
	<link>https://sonet.digital/blog/seo/blocking-spiders/</link>
	<description></description>
	<lastBuildDate>Mon, 18 Sep 2023 15:32:04 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>
		By: CathyL		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-76</link>

		<dc:creator><![CDATA[CathyL]]></dc:creator>
		<pubDate>Mon, 18 Feb 2013 10:45:41 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-76</guid>

					<description><![CDATA[It&#039;s working for us and has reduced bandwidth usage. We do not mind blocking spiders from China or Russia, as we&#039;re only selling into the UK.
 
Thanks x]]></description>
			<content:encoded><![CDATA[<p>It&#8217;s working for us and has reduced bandwidth usage. We do not mind blocking spiders from China or Russia, as we&#8217;re only selling into the UK.<br />
 <br />
Thanks x</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Paul		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-75</link>

		<dc:creator><![CDATA[Paul]]></dc:creator>
		<pubDate>Mon, 18 Feb 2013 04:36:36 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-75</guid>

					<description><![CDATA[Hi Daniel, there&#039;s more than one way to send the bots away, and either method should work just fine.
You can even catch the request in your CMS and issue a 403 response to the bot. You&#039;ll still see the request to the site in that case, but you won&#039;t be serving up any data, so if it&#039;s done right there&#039;s minimal load on the server.]]></description>
			<content:encoded><![CDATA[<p>Hi Daniel, there&#8217;s more than one way to send the bots away, and either method should work just fine.<br />
You can even catch the request in your CMS and issue a 403 response to the bot. You&#8217;ll still see the request to the site in that case, but you won&#8217;t be serving up any data, so if it&#8217;s done right there&#8217;s minimal load on the server.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Daniel		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-74</link>

		<dc:creator><![CDATA[Daniel]]></dc:creator>
		<pubDate>Tue, 12 Feb 2013 14:42:22 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-74</guid>

					<description><![CDATA[Hi Vincent, sorry for my English.
I have these rules in my .htaccess. Is that OK, or is your way better? I don&#039;t know the difference. Some use this, and some use SetEnvIfNoCase as you do; which is better?
RewriteCond %{HTTP_USER_AGENT} ^Baidu [OR]
RewriteCond %{HTTP_USER_AGENT} ^Yandex [OR]
RewriteCond %{HTTP_USER_AGENT} ^Sosospider
RewriteRule ^.* - [F,L]]]></description>
			<content:encoded><![CDATA[<p>Hi Vincent, sorry for my English.<br />
I have these rules in my .htaccess. Is that OK, or is your way better? I don&#8217;t know the difference. Some use this, and some use SetEnvIfNoCase as you do; which is better?<br />
RewriteCond %{HTTP_USER_AGENT} ^Baidu [OR]<br />
RewriteCond %{HTTP_USER_AGENT} ^Yandex [OR]<br />
RewriteCond %{HTTP_USER_AGENT} ^Sosospider<br />
RewriteRule ^.* - [F,L]</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Jim		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-73</link>

		<dc:creator><![CDATA[Jim]]></dc:creator>
		<pubDate>Sun, 10 Feb 2013 18:06:46 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-73</guid>

					<description><![CDATA[I was hoping this was going to resolve my issue with these bots and their CPU drain, but alas, your fix has not been successful in blocking them, at least for me. :( I have added this code to five of my domains and they are still getting bombarded. Total bummer...]]></description>
			<content:encoded><![CDATA[<p>I was hoping this was going to resolve my issue with these bots and their CPU drain, but alas, your fix has not been successful in blocking them, at least for me. 🙁 I have added this code to five of my domains and they are still getting bombarded. Total bummer&#8230;</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Vincent		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-72</link>

		<dc:creator><![CDATA[Vincent]]></dc:creator>
		<pubDate>Tue, 29 Jan 2013 11:29:21 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-72</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://sonet.digital/blog/seo/blocking-spiders/#comment-71&quot;&gt;Phil&lt;/a&gt;.

Hi Phil,
Thanks for the response. It has been almost six months since we implemented it on a couple of e-commerce sites we manage. So far, so good. Your response prompted me to have a look through the stats, and I found no incidents of any of the above bots. However, I have updated the post to include &#039;Sosospider&#039;, another resource-draining spider / bot.
Let us know how you get along.
]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://sonet.digital/blog/seo/blocking-spiders/#comment-71">Phil</a>.</p>
<p>Hi Phil,<br />
Thanks for the response. It has been almost six months since we implemented it on a couple of e-commerce sites we manage. So far, so good. Your response prompted me to have a look through the stats, and I found no incidents of any of the above bots. However, I have updated the post to include &#8216;Sosospider&#8217;, another resource-draining spider / bot.<br />
Let us know how you get along.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Phil		</title>
		<link>https://sonet.digital/blog/seo/blocking-spiders/#comment-71</link>

		<dc:creator><![CDATA[Phil]]></dc:creator>
		<pubDate>Mon, 28 Jan 2013 21:31:37 +0000</pubDate>
		<guid isPermaLink="false">http://www.southbourne.com/blog/?p=363#comment-71</guid>

					<description><![CDATA[Hey Vincent,

I came here because the Baidu and Yandex bots are the main drain on bandwidth for all the sites I administer. I actually wrote to Baidu&#039;s support department to beg them to stop hitting my website, and for a while they did, but of course they came back.

Have you found the above fix to work without side effects since you&#039;ve implemented it? Because if so, this will be awesome news!

Phil]]></description>
			<content:encoded><![CDATA[<p>Hey Vincent,</p>
<p>I came here because the Baidu and Yandex bots are the main drain on bandwidth for all the sites I administer. I actually wrote to Baidu&#8217;s support department to beg them to stop hitting my website, and for a while they did, but of course they came back.</p>
<p>Have you found the above fix to work without side effects since you&#8217;ve implemented it? Because if so, this will be awesome news!</p>
<p>Phil</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
