Google SEO Spider Search Engine Optimization?

Posted on 26 October 2010 by SEOman from Seattle

I want to create dozens of blogs, but I can't possibly update them all. How do I keep the search engine spiders coming back, and is it legal?

Originally posted 2010-09-17 12:17:48.


3 Comments For This Post

  1. johnec4 Says:

As far as I know, the spiders come back frequently on their own. Sites like msn.com and cnn.com that are updated throughout the day are spidered many times a day. As long as a spider has found you in the first place, it will continue to come back. Try to update frequently, and the spiders will visit more frequently.

  2. strayinma Says:

Googlebot will continually revisit your site and crawl linked pages. You can help this process along by submitting a Google Sitemap, an XML file you manage through your Webmaster Tools account. It is totally legal – it just helps manage the crawl. It does not help your ranking, only getting crawled.

    Learn about the sitemap (and protocol) here:

    https://www.google.com/webmasters/tools/docs/en/protocol.html

If you have a free Google Webmaster Tools account, you can see when Googlebot last visited your site and where it tripped up while crawling (and what the roadblock was, so you can fix any site errors).
You can also see what has and has not been indexed.
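    The protocol itself is simple. A minimal sitemap is a short XML file like the following (the URL, dates, and values here are placeholders for your own pages):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want crawled -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2010-10-26</lastmod>
        <changefreq>daily</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>
    ```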

  3. Chris M Says:

Not sure why you would want to create dozens of blogs if you don’t plan on updating them. If your blog is set up right, it will ping the search engines after you post, so new entries get crawled quickly. If you don’t post often, you will not get crawled as frequently. If your blog is not set up correctly, you can always ping it manually at sites like http://www.pingmyblog.com
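The manual ping Chris M describes is the standard weblogUpdates.ping XML-RPC call that blog software sends when you publish. A sketch of what that request body looks like in Python, using only the standard library (the blog name, URL, and endpoint mentioned in the comments are placeholders, not real services):

```python
# Sketch of the weblogUpdates.ping XML-RPC request a blog sends on publish.
# The blog name and URL below are placeholders for your own site.
import xmlrpc.client

def build_ping_payload(blog_name, blog_url):
    """Serialize a weblogUpdates.ping request body without sending it."""
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

# To actually send the ping, you would POST this payload to a ping
# service's XML-RPC endpoint, e.g. via
# xmlrpc.client.ServerProxy("http://rpc.example.com/") (placeholder URL).
payload = build_ping_payload("My Blog", "http://www.example.com/")
print(payload)
```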

Leave a Reply