Do you have the GSiteMap / XML Sitemap module installed? If you don't, I'd recommend it. Not only does it help with SEO immensely, but it may help convince Google that gee, those other files don't exist anymore, stop trying to find them.
I don't know for sure that Google will take the hint, but that module generally helps with SEO anyway and reduces server load (because Google knows where all of your important pages are), so I'd recommend it regardless.
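In the meantime, one way to make the log usable again is to separate the 404 noise from the real events when you pull rows out of the watchdog table. A minimal sketch in Python (assuming you've exported watchdog rows as dicts with a `type` field; in Drupal's watchdog table 404s are logged with the type 'page not found', but check your own data - the field names here are illustrative):

```python
# Sketch: split watchdog-style log rows into useful events and 404 noise.
# Assumes each row is a dict with a 'type' field, and that 404s carry the
# type 'page not found' (as Drupal's watchdog does); adjust to your schema.

NOISE_TYPES = {"page not found"}

def split_log(rows):
    """Return (useful, noise) lists, preserving the original order."""
    useful, noise = [], []
    for row in rows:
        if row.get("type") in NOISE_TYPES:
            noise.append(row)
        else:
            useful.append(row)
    return useful, noise

# Example usage with made-up rows:
rows = [
    {"type": "php", "message": "Undefined index in template"},
    {"type": "page not found", "message": "old/page.html"},
    {"type": "cron", "message": "Cron run completed."},
]
useful, noise = split_log(rows)
```

This doesn't stop the bots, of course - it just lets you review the 10% of the log that matters while the sitemap does its work.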
On Thursday 30 August 2007, Tibor Liktor wrote:
Hi,
I've got a watchdog problem.
The watchdog log is essential for me to discover bugs and errors and to monitor the site's performance, and so on - you know that.
But watchdog has become quite useless for me, because it is full of 404 errors triggered by Google and other bots.
Now nearly 90% of the log is crap. It is impossible to dig out any useful info from it. (Not to mention the additional server load and database size issues.)
Is there any solution to filter out the tons of messages caused by searchbots?
Do you face similar issues? How do you handle them?
Best, Tibor
ps.:
AFAIS, it might be caused by two things:
- I use GoogleAds
- the site has been changed dramatically, and the bots are searching for the 'old' content