Too Many 'Not Found' Reports in Google Webmaster Tools (Continued)


2010-01-29

I recently made a big change to my web site, and as a result Google gave me lots of 'robots.txt unreachable' messages in Google Webmaster Tools. Maybe this is the reason my site's ranking went down?

Create a robots.txt

I have now created a 'robots.txt' in my root directory to block Google from crawling the URLs/pages that no longer exist. This is the process I followed to try to solve the problem; I hope it works:

  1. Google Webmaster Tools
  2. Site Configuration
  3. Crawler access
  4. Generate robots.txt
  5. Upload to my root directory
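The generated file ends up looking something like the fragment below — a minimal sketch, assuming the removed pages lived under paths like `/old-gallery/` (the paths here are made-up placeholders, not my actual URLs):

```
# robots.txt — placed in the site root
# Block all crawlers from the sections that no longer exist
User-agent: *
Disallow: /old-gallery/
Disallow: /old-pages/
```

Each `Disallow` line tells crawlers not to request anything under that path, so Google stops reporting those dead URLs as errors.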

Create a new sitemap.txt and resubmit

  1. Create a sitemap.txt file
  2. Copy and paste all the URLs from my website into it
  3. Save it in UTF-8 format
  4. Upload it to my root directory
  5. Resubmit it in Google Webmaster Tools
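Steps 1–3 can be sketched in a few lines of Python — a text sitemap is just one full URL per line, saved as UTF-8 (the URLs below are hypothetical placeholders, not my real pages):

```python
# Write a plain-text sitemap: one absolute URL per line, UTF-8 encoded.
# The URL list here is a made-up example.
urls = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://www.example.com/posts/lens-2-0",
]

with open("sitemap.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(urls) + "\n")
```

After uploading the file to the site root, it can be resubmitted under Site Configuration in Google Webmaster Tools.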

More information about creating sitemaps can be found here:

http://www.google.com/support/webmasters/bin/answer.py?answer=156184&cbid=-1stoyvwxz352p&src=cb&lev=answer

We will see…

2010-03-03 update: Yes, this works!

Solving this issue can raise your Google ranking, which means that when people search for related keywords, there is a higher chance they will reach your site. In my case, when I typed 'Lens 2.0' into Google, my site used to be #1 on the list. But after I replaced the whole site with the brand-new one holding the current content, Google gave me hundreds of 'unreachable' errors, and my search result dropped hundreds of pages down!

I am now using this approach to fix the issue, and it seems to be working! Awesome, I am back at #1 again!
