Recently I made a big change to my web site, and as a result Google gave me lots of ‘robots.txt unreachable’ messages in Google Webmaster Tools. Maybe this is why my site’s ranking went down?
Now I have created a ‘robots.txt’ file in my root directory to block Google from crawling the URLs/pages that no longer exist. This is the process I used to try to solve the problem; I hope it works:
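As a rough sketch, a robots.txt for this kind of cleanup might look like the following. The paths here are hypothetical placeholders, not the actual removed URLs, and `example.com` stands in for the real domain:

```
# robots.txt placed at the site root, e.g. http://www.example.com/robots.txt

# Block Googlebot from crawling sections that no longer exist
# (example paths only)
User-agent: Googlebot
Disallow: /old-site/
Disallow: /old-pages/

# Allow all other crawlers to access everything
User-agent: *
Disallow:

# Optionally point crawlers at the sitemap for the new site
Sitemap: http://www.example.com/sitemap.xml
```

An empty `Disallow:` line means nothing is blocked for that user-agent, so everything else on the site remains crawlable.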
More information about creating sitemaps can be found here:
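For reference, a minimal XML sitemap following the sitemaps.org protocol looks roughly like this; the URL and dates are placeholder examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of the new site (example values) -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-29</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Submitting a sitemap like this in Google Webmaster Tools helps Google discover the new pages instead of repeatedly trying the old, removed ones.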
We will see…
Solving this issue can raise your Google ranking, which means that when people search for related keywords, they have a higher chance of reaching your site. In my case, when I typed ‘Lens 2.0’ into the Google search engine, my site used to be #1 on the list. But after I replaced my whole web site with the brand-new one that has the current content, Google reported hundreds of ‘unreachable’ errors, and that pushed my search result hundreds of pages down!
I am now using this approach to solve the issue, and it seems to work! Awesome, I am back at #1 again!
Posted by Mr.M on January 29, 2010