SEO Blog Logs




What is SEO (Search Engine Optimization)?
SEO is the practice of improving the volume and quality of traffic to a web site or page from the search engines via natural, organic, algorithmic search results (SERPs).

SEO is actually a component of SEM (Search Engine Marketing), which encompasses many avenues with which to promote a web site or page.
If you have come here looking for SEO help, you have found a blog that can help you. There is much information here to be had, so please feel free to comment anytime...Even with a question.
Showing posts with label index. Show all posts

August 30, 2007

Supplemental Index | Developing a New Plan


Many of you may have noticed that supplemental operators aren't working. Well, I found one last week that was working, and it has since stopped. They have been on again, off again for some time. Google reported on July 31st, 2007 that they were seeing a narrower distinction between the supplemental and regular index, and that supplemental pages were "fresher and more comprehensive than ever". At this point I think they have fully executed their plan to stop labeling supplemental results.

Here's my problem....Google claims to be placing fewer restrictions on sites by, as they say, "indexing URLs with more parameters". However, one of the major mistakes that lands a page in the supplemental index is duplication, and that is NOT addressed in this move. Many seasoned webmasters know full well how to spot a probable supplemental page. The thing is, not all webmasters have the knowledge, tools, or experience to do this. I think Google makes a good effort in other areas, and before they removed this tool, they might have considered an addition to Google Webmaster Tools for duplication. I know what many are thinking reading this....who cares about the newbs. Well, everyone is a newb at some point. Since the search engines change their criteria so often, it is only logical that they supply, at least to verified site owners, the tools to be successful in their index. Not doing so can breed even more spam and black hat techniques. Most people, when faced with a dilemma, will look to the most logical place for an answer, then continue to other resources for a fix...After that, many will give up or cheat.

I also think part of the reason Google's supplemental index has grown to be fresher and better is the serious attention it has gotten in the last year. Webmasters were easily able to locate information on how to check their supplemental pages, the reasons pages go supplemental, and how they might fix them. Now, all of that will be a little greyer. OK, so you might be wondering how you can tell a page is likely to be supplemental. You will rely on your stats and some search rank comparison for this. The page might be coming up short on traffic; you will have to weigh what you suspect to be that page's traffic power compared to the site's other pages. You might find that the page's Google traffic is much lower, or zero, compared to your other pages. Yahoo traffic for that page, for example, might be OK...But Google is practically non-existent. If you have server stats, you might notice that Google doesn't crawl that page very often...For example, every 3 or 4 months. Something else to consider: if all of your Google traffic is failing to perform and you have decent traffic from the other engines, you could have too many supplemental pages, and Google has assigned your site low authority/credibility as a result. A lack of reasonably frequent crawls to your site would be a big indicator of this. There are some things you can do to pull it out; it takes time, planning and work.

SERPs will be a verification tool for a page you have determined to be a probable supplemental. If the page was ranking well for specific keywords or phrases, and now it's not, that signifies a problem. Factor in any recent Google updates. Realize that your feeds, HTML sitemap pages (with no content), contact pages, and other pages minimally important to you...Google probably sees them the same way. So, for example, you have 2 pages: one for product A, and one for product B. Product A is ranking well, with good Google traffic. Product B is not. You have already determined that product B is not getting crawled much and is a probable supplemental. I suggest this....Run a keyword density scan of the page and search Google for the most dense keyword phrases you can find in the page. Make sure you turn off Google customized search, or use a tool like this. Check some different data centers also, so you have a good picture. The reason I say this is that I have come across many customers and webmasters alike who have no idea what terms their site should be ranking for; their page is all about bicycle parts, and they are hoping to rank for mountain biking. Google will only use the contextual information in the page, anchor text from backlinks, Meta title & maybe description, possibly "alt" tag text, and Domain/URL name to index you for queried keywords. I have heard ramblings about the text around your page's link on another site, but I do not believe it to be true, as it would certainly have some terrible black hat repercussions.
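If you don't have a density tool handy, the scan I'm describing can be sketched in a few lines. This is a minimal illustration, not any particular tool; the sample text and the two-word phrase window are my own assumptions:

```python
# Rough keyword-density sketch: count single words and two-word phrases,
# then report each top term as a percentage of the page's total words.
import re
from collections import Counter

def keyword_density(text, top_n=5):
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    # Two-word phrases, since multi-word terms are what you'd query for
    phrases = [" ".join(p) for p in zip(words, words[1:])]
    counts = Counter(words) + Counter(phrases)
    return [(term, round(100.0 * n / total, 1))
            for term, n in counts.most_common(top_n)]

sample = "bicycle parts and bicycle tires: quality bicycle parts shipped fast"
print(keyword_density(sample, 3))
```

Run that against product B's copy, then search the densest phrases and see where the page actually lands.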

For folks who do not have server side access, this will be difficult. I use Analytics, as many other webmasters do. Maybe I am just too confused by the new layout, but I cannot find a clear path from my Google traffic to each page. So I offer a challenge: does someone have a free program/script that will supply referrer visits to individual pages that does NOT have to be installed server side? Or can someone figure out, or already know, how to get this information out of Analytics? I would gladly publish an article written by this person, or link to such a program/script or article. Don't send any spam...I won't do it. I don't think the folks on free hosting or cheap hosting just trying to get started should be excluded from the ability to be marginally successful as they build their skills. I guess I am just a "blue collar" webmaster.

Peace and SEO

Melanie Prough
"Baby"

DIY Your SEO With The SEOCog
Digg This Post | We Require a Link Back to SEOCog.com Please.

August 21, 2007

Search Engine Index SEO Experiment | Top 5 Position Analysis


Good morning! This is the final part of user statistics and demographics for the top 4 engines, excluding AOL as it is powered by Google results. Day 1: MSN / Live; Day 2: Yahoo; Day 3: Ask.com; Day 4: Google. Today we will explore the SEO metrics of the sites that these engines are indexing in the average page 1, top 5 positions.

I did some research and found some very interesting, and unfortunately disappointing results from these 4 engines. I suspect you will be disappointed as well. I conducted my own little search experiment to explore the SEO use and effectiveness in the average top 5 search result position of these engines. Please, read the details and definitions of the experiment before we get started.

So now that you have an idea what I am talking about, let's get started. The engine with the overall freshest data was MSN / Live, with an average cache age of just 6 days and a range of 1-14. Google, who I expected to clinch this metric, came in at 8 days. Ask.com has a terrible "freshness" problem; their average cache age was 31 days, with a range of 21-85 days. Does this sound like a good thing to you?

I am seriously disappointed in the load times in the experiment. Believe it or not, there were 2 pages that loaded in over 373 seconds on a 28.8 measurement. The relative standard is 30 seconds. It's not written in stone, but everyone should be looking at speed. Users have the entire Internet at their disposal; why should they wait that long? Answer: they don't! MSN / Live had the fastest pages, with an average speed of 41.32 seconds @ 28.8 kbps. The worst was Google, clocking in at an average of 74.35 seconds.
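For anyone wondering where these numbers come from, a 28.8 kbps measurement is just page weight converted to transfer time. A quick back-of-the-envelope version, with an assumed page size purely for illustration:

```python
# Estimated transfer time on a 28.8 kbps modem:
# seconds = total bytes * 8 bits-per-byte / 28800 bits-per-second
def load_time_28_8(total_bytes):
    return total_bytes * 8 / 28800.0

# An assumed 150 KB page (HTML + images) at 28.8 kbps
# lands well over the 30-second guideline
print(round(load_time_28_8(150 * 1024), 1))
```

So a page only needs to grow past roughly 100 KB before it blows the 30-second standard on dial-up.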

I checked basic Meta tags for a few reasons. I wanted to see who was using keyword tags, and whose were badly misused. I saw some stuffing, but not much. There were some tag violations, however; I actually saw a keyword tag that was 598 characters! MSN / Live had the fewest Meta violations, while Google had indexed the most in the group.

I checked the markup of the pages in general for warnings and errors. The engine with the cleanest code in its results was Yahoo, which averaged an astounding .30 errors over 20 results. Google was a pretty close second at .45. I was very impressed with the low number of page errors. However, Ask.com had 17 times more errors than Yahoo!

I checked server headers for a "last modified" date. I really think if Google is going to start displaying "fresh" results, then having a modified response in the header is a necessary element. I found an extremely low number of headers formatted to respond with the "last modified" date; only 22% of the sites I checked were using them. The oldest header response I found in the experiment was 11/2002, in MSN / Live. Pretty sad...
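If your own server isn't in that 22%, the header itself is easy to produce. A minimal sketch using only the Python standard library (the function name is mine, and a real site would set this in the web server config rather than a script):

```python
# Build a correctly formatted "Last-Modified" response header for a file.
# HTTP wants the RFC 1123 date format with a "GMT" suffix, which
# email.utils.formatdate(..., usegmt=True) produces.
import os
from email.utils import formatdate

def last_modified_header(path):
    return "Last-Modified: " + formatdate(os.path.getmtime(path), usegmt=True)
```

The payoff is that crawlers can send a conditional request and get a cheap 304 instead of re-downloading an unchanged page.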

I checked link popularity to get a general feel for the listed sites' linking power and prowess. I found both ends of the spectrum on all engines, in all positions. However, Ask.com nailed this one with an overall average link popularity score of 1,207,648.

I added PageRank, so as not to disappoint anyone. I am not entirely convinced that it benefits a web site at all; I optimize for SERPs. However, you will be interested to know that I saw many zero-PageRank sites in the top 5...Even at #1. Ask.com came in high on this metric also, averaging a PR 5 for all 20 pages. This was a close race, though.

I optimize - optimise - for long tail terms quite a bit, and it never fails to amaze me what people search for. So I checked these engines to see how they fared in returning these elusive long tails. Google drove this one home, with 95% of the pages returned containing all of the queried terms. MSN / Live did a nice job also, coming in at 85%. Dragging up the rear were Yahoo and Ask.com at 75%. I think overall they all did better than I had expected.

I checked each page for a clear "page updated" date, or a news item / post that had a date. I found that the sites were even less likely to let users know when the content was updated. Google results fared the best here by a long margin, with 11 of 20 pages.

Total SEO score was a pretty tight competition. Each engine supplied its own merits to make up for shortcomings and still bring in a decent score. MSN / Live was the top dog with a total SEO score of 7.73/10, while Google came in last at 7.25, so it was a pretty close margin.

I used an old restaurant manager's scoring system to rate the overall results. A first place was worth 4 points, a second 3, a third 2, and a fourth 1. The overall possible points for the 13 metrics used was 52 for all except Yahoo, whom I reduced to 51 as they had no cache date. MSN / Live came out clearly on top with a score of 71.2%. Yahoo was next with 66.7%. Google was 3rd, scoring 61.5%, and Ask.com was last at 57.7%.
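The arithmetic behind those percentages is simple enough to reproduce. Note the earned point totals below are back-solved from the reported percentages, not taken from the raw experiment data:

```python
# Restaurant-style scoring: 1st = 4 pts, 2nd = 3, 3rd = 2, 4th = 1,
# summed over 13 metrics, divided by the possible total
# (52, or 51 for Yahoo, which had no cache date to score).
def overall_score(points_earned, points_possible=52):
    return round(100.0 * points_earned / points_possible, 1)

print(overall_score(37))      # MSN / Live -> 71.2
print(overall_score(34, 51))  # Yahoo      -> 66.7
print(overall_score(32))      # Google     -> 61.5
print(overall_score(30))      # Ask.com    -> 57.7
```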

I really saw some crazy stuff: cookies in server headers that had expired in 1981. I saw a great many webmasters who had not updated their copyright footers for as many as 8 years. I also determined that these 4 engines do not pull a whole lot of the same results; I thought I would see more of that. In the results used, there were generally only 1 or 2 results the same between any 2 engines. I think I learned a lot, maybe some stuff I didn't want to know. I am feeling a little discouraged that so many pages were in disrepair and not maintained. I wish Google would hurry that "fresh" notation to my data center. If you didn't view the data charts at the bottom of the details page, check them out.

I hope you gained something from this, I know I did. If you have any questions, please just ask. I would be interested in hearing any experiences you have with these metrics in relation to indexing and position also.

Peace and SEO

Melanie Prough
"Baby"


August 3, 2007

How to Push, Pull, and Drag Pages out of Google's Supplemental Index


Most webmasters have at least one site plagued with supplemental issues; I do too. Google got rid of the operators to search for supplemental pages. Then, recently, they took their notation out of the site operator search. Google says, "Hey, don't worry about your supplemental pages, we're crawling those too". I will tell you this today, me to you: that's complete BS.....I have a site ranked #1 for its prime keywords on all 3 engines, but it has over 200 supplemental pages. Those pages are linked, fixed, updated, unique and SEO sound.........for over 2 months, and they remain in the supplemental index. So I set out to find out what others have done to get out!

I read, and I read......the same things...page after page and Blog after Blog. Why is Google making it so hard for us to view our supplemental results? What is the problem? How can we fix what we cannot see? BTW, see the Cog's Google operator guide for the new operator to find your supplemental pages, compliments of Webnauts. I even emailed Google and asked why my totally redesigned pages are still in their supplemental index. No answer, 2 months now!

I suspect that a site worthy of page one listings for nearly all of its keywords on Google could get a dang crawl for its supplemental pages once in a while! These pages are being cached, just not crawled. So what do we do?

I have a theory: what if the pages were moved without redirects for the robots? I can redirect my backlinks and visitors, but not the spiders, effectively forcing them to de-index the pages without removing them. After the pages are de-indexed, I quickly let the crawlers back into those pages and undo the redirects, back to normal. Now, I realize the pages will take a hit for probably 3 months, but hell, they are not showing in SERPs anyhow! Make a new sitemap, resubmit it in Google tools and wait.
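To make the theory concrete, here is a minimal sketch of the decision it requires. Fair warning: serving spiders something different from users is cloaking, which engines penalize, so this is purely an illustration of the idea, not a recommendation. The bot signatures and URLs are assumptions:

```python
# Illustrative only: visitors and backlink traffic get a 301 to the page's
# temporary home, while known crawler user-agents get a 410 Gone, which is
# the signal that would push the engine to drop the page from its index.
SPIDER_TOKENS = ("googlebot", "slurp", "msnbot")  # assumed bot signatures

def response_for(user_agent, new_url):
    ua = user_agent.lower()
    if any(token in ua for token in SPIDER_TOKENS):
        return (410, None)       # tell the crawler the page is gone
    return (301, new_url)        # send people to the moved page

print(response_for("Mozilla/5.0 (compatible; Googlebot/2.1)", "/new-page"))
print(response_for("Mozilla/5.0 (Windows NT 5.1)", "/new-page"))
```

The safer variant of the same idea is a plain robots.txt Disallow on the moved paths, since that keeps spiders out without serving them different content.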

I really am at a loss...aside from trying something like this, my options are to ignore the supplemental, or to rename and relink the pages and give up the effective structure and solid links I have established. The site is not a big deal; it's my personal site that I have toyed with since my early learning days....that's how it landed in the supplemental. I am troubled, though, that a site can be so effective in the SERPs, have high SEO scores and over 2,600 backlinks in Google Tools (40% deep links), and STILL be in this type of situation!

So, aside from practicing a drastic theory like the above, the standard approach is:

  1. Redesign the pages where necessary, remove ALL spam. Unique content!
  2. Reevaluate Meta for uniqueness
  3. Take a good hard look at a spider's ability to get to that page with your current navigation. Get a sitemap!
  4. Get those pages linked to by good PR authority sites.
  5. Cross your fingers!
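On step 3, the sitemap itself is trivial to generate. A minimal sketch using only the standard library (the URLs are placeholders; write the output to sitemap.xml and resubmit it in Google tools):

```python
# Build a minimal XML sitemap per the sitemaps.org protocol:
# a <urlset> of <url><loc>...</loc></url> entries.
from xml.sax.saxutils import escape

def make_sitemap(urls):
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

print(make_sitemap(["http://example.com/", "http://example.com/product-b.html"]))
```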

All joking aside, Google needs to fix this...pages should not just go there and die! I don't think they should use an extraordinary amount of resources to crawl them, but every 3 months is not what I had in mind either!

Feel free to comment as always, relevant comments will result in post edits with credit!

Peace and SEO

Melanie Prough
"Baby"
