De-Indexed by Google Recently?
There has been a trend lately of pages and whole sites being de-indexed by Google. The affected sites and pages are new and old without prejudice. There is a good write-up of the probable dump / update here. I just want to remind everyone that Google has been moving toward this for some time. They have told us on many occasions that they are moving to a fresher, more unique index, and from looking at the affected pages I found, I think this is exactly the case.
First, let's determine whether you are a probable candidate for this issue. Check your Google Webmaster Tools message center to see if you have a message, although I suspect you do not. Then go to Google search (your local Google if you like) and sign out! Execute a site:www.domain.com/ command and note the number of pages; this should ideally be your total indexed pages. Next, execute site:www.domain.com/* — this should be your pages in the main index. Note this also. Finally (there is much debate about this operator, but it appears to be functioning in my data center, at least my numbers are right), execute the supplemental site command site:www.domain.com/& and note these as well. Now, how do these numbers compare to your pre-August numbers and to each other? You are likely to find a discrepancy between the main-index operator and the supplemental operator. Look at them and determine which is most likely to be accurate in your data center. Lastly, if you have the ability, use this page flow tool to verify your findings.
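To keep the three checks straight, here they are side by side as you would type them into the Google search box (www.domain.com is just a placeholder; substitute your own domain):

```
site:www.domain.com/     total indexed pages
site:www.domain.com/*    pages in the main index
site:www.domain.com/&    supplemental results (operator is debated; verify in your data center)
```

The gap between the first number and the second is roughly how many of your pages have gone supplemental.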
Some of the pages / sites appear to have been experiencing some new kind of sandbox effect, except totally de-indexed. Some just lost pages that haven't returned. I think we should be looking for the normal culprit here: spam. Spam can be many things to Google, but the two cases I found were duplicate / non-unique content and too many reciprocal links. I also read about one site whose inbound links accumulated too quickly; it was de-indexed and then bounced back. So let's concentrate on the basics of fixing these issues, because even upon returning to the index, many of these pages appear to be supplemental.
First we will cover duplication. The number one rule is unique content, and this absolutely includes different meta titles and descriptions for every page. As a matter of fact, if you have pages with small amounts of content, your meta can really make a difference. There is also the issue of canonical problems. This is a great tool to determine whether you are having a duplication issue, and you can read about how these server-side redirects work here. If your site is already established, then getting these pages out of the supplemental index is generally believed to require links to them, preferably quality links. Or you might try blocking them in your robots.txt for a bit. If the site is brand new, the quickest way is probably just to rewrite the URLs and start fresh.
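As a sketch of the canonical fix, here is what the server-side redirect typically looks like on an Apache server with mod_rewrite enabled (example.com is a placeholder; your host may use a different setup):

```apache
# .htaccess — 301 the non-www version to the www version
# so Google only ever sees one canonical copy of each page
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

And if you would rather temporarily block the duplicate pages instead, the robots.txt approach mentioned above looks like this (the directory name is hypothetical):

```
# robots.txt — keep crawlers out of the duplicated section for a while
User-agent: *
Disallow: /duplicate-directory/
```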
Reciprocal links are OK, in moderation. The site I looked at had over 30%, and I suspect that is too much. My advice here: disconnect them. If you really have to keep them for traffic, then use a "nofollow" on them. Try to keep your outgoing links to a percentage smaller than your internal links, at least for the time being. Lastly, you might consider submitting a reconsideration request via your Google Webmaster Tools. Here is a great tutorial by Tim Nash to really help you with that. Even if a request isn't strictly necessary, you will have done the CYA. If you don't read Tim's tutorial, make sure you include what you found and what you fixed.
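If you do keep some reciprocal links, the nofollow fix is a one-attribute change in the link markup (the partner URL here is just a placeholder):

```html
<!-- Reciprocal link marked nofollow: visitors can still click it,
     but it no longer counts as an endorsement to Google -->
<a href="http://www.partner-site.com/" rel="nofollow">Partner Site</a>
```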
Have you had pages de-indexed? Want to share? Comment or email me; anonymity is OK too.
Peace and SEO
Melanie Prough
"Baby"
**We Require a Link Back Please.