Internet Marketing Ninjas Blog

Damned to Google Hell – Supplemental Results

Google Supplemental Results


They’ve been called "the Kiss of Death," "Google Hell," and "Screwed Pages" (OK, I just made those up), but in any case, the Supplemental Results are not where you want your pages to be if you expect traffic from Google.

Let’s see Google’s official definition from their webmaster page:

Supplemental sites are part of Google’s auxiliary index. We’re able to place fewer restraints on sites that we crawl for this supplemental index than we do on sites that are crawled for our main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in our main index; however, it could still be crawled and added to our supplemental index.

The index in which a site is included is completely automated; there’s no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank.

yada yada… "auxiliary index"? "fewer restraints"? Huh??

Here’s the stuff they aren’t telling you:
Pages from the "regular" index will almost always show up first for any search. The only time you’ll usually see supplemental results is when there are few, or no, matches in the regular index. What this means is that if your page about "blue widgets" is in the Supplemental Results, then you’re screwed as far as having your page rank at all (it won’t show up, since there are plenty of results for "blue widgets" in the regular index). Your only chance of ranking a page that’s in the Supplemental Results is if someone searches for something super specific, like "blue widgets in Southbend Kansas on Market Street".

Supplemental Results also tend to have old Google caches… in other words, once Google has sent pages to "Google Hell," they tend not to come back… that’s why you’ll find pages in the Supplemental Results with cache dates from long ago.

Why does google put pages in the Supplemental Results?
I’ve been able to identify 3 main reasons:
1. Duplicate content – take someone else’s content, get sent to Google Hell (Supplemental Results).
2. No content – create pages with no content (remember the days of directories that would create 1 million pages with only 100 listings?) – empty pages get sent to Google Hell.
3. Orphaned web pages – pages that no one links to, including yourself.

Now, for the dirt – how to get out.
1. If you stole content – change it.
2. If there’s no content – add some.
3. If it’s orphaned – link to it.

Now, that’s not enough… remember, once pages are sent to the Supplemental Results, they tend not to get revisited. Once you’ve made your changes, submit a Google sitemap and cross your fingers.
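If you want a starting point for that sitemap, here’s a minimal sketch that builds a sitemap-protocol XML file with Python’s standard library. The URLs are hypothetical placeholders; swap in the pages you just fixed.

```python
from xml.etree import ElementTree as ET
from datetime import date

def build_sitemap(urls):
    """Build a minimal XML sitemap string for the given list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        # A fresh lastmod hints that the page changed since the old cache.
        ET.SubElement(entry, "lastmod").text = today
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages that were stuck in the Supplemental Results:
print(build_sitemap([
    "http://www.example.com/blue-widgets.html",
    "http://www.example.com/red-widgets.html",
]))
```

Save the output as sitemap.xml and submit it through Google’s sitemap tools; the point is simply to re-invite the crawler to pages it may not revisit on its own.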

If that doesn’t work, try publishing them on new URLs.

Additional help: Barry also points to steveb at webmasterworld who has some advice.

OK… anyone have other ideas on why a page could be put in the Supplemental Results?
Anyone have more ideas on how to get pages out?


46 Responses

  1. “SessionID” URLs; redirects to dynamically generated pages that time out; Request-URI Too Long errors; “&ID=” and/or “?” used in URIs; and parser errors, e.g., an HTML head element not closed, or a body not opened.

    I’ve witnessed all of these (as well as those you’ve noted above) on a shoddily coded CGI site that wonders why their 2,000 pages are not working in Google; they’ve got ten (10) pages listed!

    This CGI site fares much better on Yahoo; they’ve got 1,200 listed.

  2. Another reason: recent reinclusion. I recently (well, two months ago) submitted a reinclusion request for a domain that had been banned for years. A week or so ago, it began appearing in the supplemental results. Now, it’s out of the supplemental index and in the main index, although still not ranking where it should be.

  3. SteveB – funny you should mention him.

    I just love that SteveB. One of my pastimes is reading his comments at WMW, as well as other forums. Some people read the funnies, I read SteveB.

    He has this way with words – he serves them blood-rare and chunky, and a lot of folks find it hard to swallow ’em.

    He’s an ace SEO, and has straightforward, logical thinking on the topic, which he shares with others.

  4. Hey Jim,

    Great seeing you at WMW & SES. And great post!

    One of my sites has been sent to supplemental hell. Our content is original, and most pages aren’t orphaned. Is there any other reason G would do this to us? A problem with the code, perhaps?


  5. what’s the url? the url you gave in your name link seems to be fine (all those pages seem to be indexed fine (not supped)).

  7. thanks jim,

    destination google, type in

    only 4 pages come up, the other 21,000+ google is hiding. any tips?

  8. 2 tips.

    1. Click on “repeat the search with the omitted results included.” to see the rest of your pages – they are indexed.

    2. Change the title tags of those pages – Google thinks they’re all similar since they have the same exact title tag.
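That second tip is easy to check in bulk. Here’s a quick sketch (hypothetical crawl data) that groups URLs by title tag and flags the titles shared by more than one page:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by their <title> text and return only the
    titles shared by more than one page."""
    by_title = defaultdict(list)
    for url, title in pages:
        # Normalize whitespace and case so trivially different titles match.
        by_title[" ".join(title.split()).lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl output: (url, title) pairs.
pages = [
    ("/products/1", "Acme Widgets"),
    ("/products/2", "Acme Widgets"),
    ("/about", "About Acme"),
]
print(find_duplicate_titles(pages))
# {'acme widgets': ['/products/1', '/products/2']}
```

Any group that comes back with more than one URL is a set of pages Google may be folding together as near-identical.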

  9. thanks jim.

    within the mix of omitted results, there are many pages with distinct title tags. why do they get sent to omitted results?

  10. they’re just omitting them from the search for … there it shows you 4 or 5 pages…. this does not mean that the other pages are really “omitted” from Google’s results… they are there and they count… it’s just saying that for a search of “” – “here are some main pages”… that’s all… your other pages are not “discounted” in any way… they’re fine. You’ve got no problem.

    My only advice is to try to change your title tags (since most are exactly the same)… but this is not causing a “problem” with Google. They’re indexed, they’re fine, and for other searches those pages will show up. These pages are NOT in Supplemental Hell – you’re fine.

  11. Jim,

    I’ve got a site that’s been in Supplemental Hell for almost a year and I just don’t understand why. I’ve added a good amount of content over the past few months but still “no respect” from the search engines, especially Google. Any suggestions?


  12. The old version of a page will be in the Supp Results, and will appear for any searches made for OLD content. The page will rank as a normal page for the current content (with a current snippet and cache).

    If you search for stuff that is no longer on the page, the same URL will be returned in the SERPs, but will show an old snippet – a snippet for the content that USED to be on the page. The cache may be an old one, or it may be the brand new one with NONE of that content in it.

    The snippets do NOT come from the cached page. They are stored somewhere else.

  13. Hello Jim,

    I have a doubt about this 3-point theory. If you visit my site, you will see supplemental results. You will find pages like Testimonials that have inbound links from many pages, including the home page, and that do not have copied content, yet they are still marked as Supplemental by Google. Any help in understanding?


  14. Amit,
    when I looked at the cache of your homepage (looks like you just changed it), there were no links going to your testimonial page… now there is… but there wasn’t…
    in fact, Yahoo only shows 2 internal backlinks to that page: your homepage, and the whycis page… and when I check the Google cache of both those pages, there wasn’t a link going to your testimonial page prior (the link to your testimonials page is missing from the Google cache)… that’s why it’s supp’d by Goog… it was an abandoned page… your own site wasn’t even linking to that page… that’ll do it.

  15. Hi Jim,

    We launched our site ( back in December 2006 and at first did very well on Google SERPs but we’re now in Supplemental Hell (over 30,000 pages in supplemental vs. 1,000 in the main index). When we launched we had just over 4,000 products and that has steadily increased to over 25,000.

    We’re trying to create deep links to counter the effect, and we’ve excluded any duplicate and noise pages, but I think we face a problem that will be difficult to resolve. The content on each product page is provided to us by publishers, and the same content is provided to many other resellers, so we think Google is treating our content as duplicate.

    Is there anything else we can do?

    Any comments would be useful.


  16. Hey Jim,

    Was wondering if you could check out our NZ online business directory: We have 26,200 pages indexed and only 10 of them are in the main index. We’ve blocked the pages we don’t want indexed via robots.txt, yet some of those are included in the main index. We’ve also got links going to all the pages in the supplemental index, and they all have original content, with most of them updated regularly. So, why are we being banished to supplemental hell? Just the other day another one of our pages got dropped into outer darkness. What can we do?
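One thing worth verifying in a situation like this is that the robots.txt rules actually block what you think they block. Here’s a quick sketch using Python’s standard-library parser; the rules and paths are hypothetical examples, not the commenter’s actual site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for a directory site.
rules = """
User-agent: *
Disallow: /search/
Disallow: /print/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() reports whether a given crawler may request a URL.
for path in ["/listings/cafe-42", "/search/?q=cafe", "/print/cafe-42"]:
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```

Note that robots.txt only blocks crawling, not indexing: a disallowed URL can still appear in Google’s index if other pages link to it, which may explain blocked pages showing up in the main index.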

  17. Hi everyone,

    The possible causes of the supplemental index are:

    1. Duplicate content.
    2. The same title and description on all pages.
    3. Orphan pages (no incoming links to those pages from the same site or other sites).
    4. A 301 redirect between the www and non-www versions of the site.

    B. Bashir

  18. Hello Buddies!
    I’m an SEO expert and have dealt many times with Google supplemental results. Yes, you are right that once pages move out to the Google Hell section, they never come back. I’ve found a few tricks to reduce such effects on your website and to get out of these bad results.

    One, you can resubmit your website for indexing.
    Submit your sitemap (ROR, XML) to Google’s sitemap section.
    Reshuffle your external links and check them carefully; they might be coming from a prohibited website, which could also cause pages to go supplemental.

    First try them all, and if they don’t work at all, I’ll come again.


  19. Our site was also recently sentenced to supplemental…no idea why, any help will be appreciated.

  20. Well, there could be many problems if a site’s pages go into supplementary results. As I’ve mentioned before, a proper sitemap and link building can take your site’s landing pages out of supplementary results. First try to create your website’s sitemap, and then submit it to Google Sitemaps. Within a few days your supplemental pages will come out and stand better than they do now.



  21. I think good backlinks play a major role in keeping a site out of supplemental results… this applies to inner pages too… and yes, no duplicate content… if there is, then God bless…

  22. Yes, you are right… and I think I explained this query before. It is also very important that the links come from trusted sites. By trusted I mean sites which have good Google PR and age, too.



  23. I don’t think duplicate content has much to do with a page going supplemental… I think it’s all in the links pointing to that page.

    If you have copied content and you can get some strong links pointing to that page… it will come out of supplemental.

    Most of Wikipedia’s content is copied from other places, and they are not hit hard with supplemental.

  24. Well, I don’t agree with you. Wikipedia is a social network that represents different communities. Most of the definitions, articles, blogs, reviews, etc. have been taken from different places, but can be recognized on a single platform, i.e. Wikipedia. Wikipedia gathers information from different websites relating to news, entertainment, sports, history, etc., but that doesn’t qualify as duplicate content. Maybe you write an article with the same title as one already written by someone else, but the question is whether the content is the same or not. If the content reveals the same description and definition, it would be in supplementary results.



  25. Another reason for supplementary results is a bad neighbourhood, because the links coming to you should be strong and trusted by all means. Search engines also check the links coming to your website and entertain them by giving their trust. I launched my personal training website last month. Most of the pages initially got supplementary indexed. I soon realised that something had gone wrong while creating and developing the content, even though the content was rich and well dressed. Then I resubmitted the sitemap to Google, Yahoo, AOL for reindexing. I’ve got better results, and now my website is gaining traffic, and most of the pages are indexed in Google’s cache.


    Bilal (SEO Expert)

  26. I would like to validate some of this from my own experiences:

    1. No links pointing the page = TRUE

    2. Duplicate titles = SOMETIMES; may or may not cause problems, but most likely yes. Follow good SE standards to avoid this problem.

    3. Not enough content = TRUE

    4. Duplicate content = UNSURE. You may notice that is a clone of wikipedia, but I did not notice that was sent to supplemental (their template is different content, but bulk is identical). I found some supplemental results for obscure articles which were given the following meta description “Shinmeiryuu There are very few or no other articles that link to this one . Please help introduce links in articles on related topics”

  27. I think quality links can help you to avoid supplementary results. I’ve found my 25 supplementary pages, when I started my website. But Cool linking and quality inbounds got them out of that hell area, and now they are working fine. So the quality links does matter in the way you are going to get out of supplementary hell area of Google.


    Bilal (SEO Expert)

  28. Now they are speaking of leaving the hack out, but I still have some supplemental results for my website. The Google sitemap doesn’t work either… The weird thing about it is that the best individual content on my website (the articles) has gone supp. G needs to clean up its act. I think that’s why Matt doesn’t favour supp as well.

  29. Hello Bilal,

    I noticed that the category name has been eliminated, but nothing’s changed really. Do you reckon all the supp results will be spidered more often? What’s the actual change for our missing supp results?

    Simon (sem copywriter)

  30. I have read somewhere that supplemental results are not at all bad. But there still exists a large group which thinks that a supplemental result is an outcome of bad search engine optimization, or, I would say, lay optimization. Which group should I believe?
    Has anybody suffered due to supplemental results?

  31. Well, Mandar, I’d say that you are right, but right now Google has just eliminated its supplementary category. I’m one of those who cheered this, and after I posted on my website that "Google has eliminated its supplementary category," I received a lot of emails asking whether the supplementary index is still there or not, and my answer would probably be no. Now, to answer your question: if we go back, supplementary result pages always hurt the page rank and its popularity, and had a bad effect overall. Supplementary pages never go in favour of any website owner. Who says that supplementary results have nothing to do with the website? They must hurt the ranking of the website. But we should never ignore the damping factors which can cause supplementary results: never steal or copy content, use unique page titles, and build solid, quality links. That’s all. If you still want to know about Google supplementary results, you can read my blog post at
    Hope the information will be enough to understand whether the supplementary area still exists or not.


    Bilal Qayyum (SEO Expert)

  32. We have recently (3 months ago) updated our site and have seen pages begin to go into supplemental. Although Google is not showing them as such, over 90% are there. Even new pages, after their initial scan, are either supplemental or not indexed at all. Any suggestions?

  33. Hello

    As far as supplementary results are concerned, they no longer exist. Well, I may suggest you submit your Google sitemap again and wait at least 3 days to make sure your pages become indexed.

  34. I am running a hotel directory in three different languages. Because Google only indexed the English pages, and these pages also showed up (but rarely on the first page of the SERPs) in the Dutch results, I decided to create subdomains for the different languages, hoping that Google would index all languages and thus improve the positions.
    However, since I made this change, the number of indexed pages has dropped (as did the number of visitors, unfortunately).
    Could this be a penalty for duplicate content (despite the fact of the translations)? And if so, what would be the best way to get all translated content into the indexes without punishment?

    Second question: I created an XML sitemap with sub-sitemaps for the hotels per country. The largest sitemap (21,000+ links) doesn’t get indexed by Google, but neither do they give any errors on the sitemap. How is this possible? (Sitemaps for other countries are used normally for crawling by Google.)

  35. First of all, I’d like to answer your second question. Yes, it is true that XML sitemaps are added to direct the crawler to the pages across the site. But as you say, the sitemaps are submitted locally; if you are talking about country sitemaps, there is another procedure for posting and handling them within the regions, the same as you did with general sitemaps. Now, about your first question: I want to know whether you used a translator on the site or not? Then I may be able to say something…


    Bilal SEO Expert

  36. Hi Bilal,
    Thanks for looking into this.
    About the sitemaps: Google says that 50,000 links is the maximum for a sitemap. As I have approx. 47,000 links right now, I thought it would be better to split the sitemap, and a logical and easy way was to split per country. As a result I have XML files with 1,000 links but also with 21,000 links. In the past I had put everything into 1 XML sitemap (with the database query that builds the sitemap ordered by country) and noticed that Google only indexed the first 10,000 links or so. Serving Google multiple files should cause Google to index at least a part of every file, I hoped. Apparently it doesn’t make a difference.
    On every subdomain I have placed the same sitemaps, but pointing to the correct subdomain.
    Regarding the first issue, translations: I have not used a translator. I built the application myself, and I just use a different set of variables for the different languages (stored in the database). It works well, and Google has started to index some pages in Dutch, but on the other hand they have also started to drop pages in English (more pages dropped than added). I have not changed any other SEO factor on my site, so the only reason this is happening seems to be the subdomains.
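For what it’s worth, the 50,000-URL cap is per sitemap file, and the sitemap protocol also defines a sitemap index file that ties the split files together. Here’s a rough sketch of the chunking step; the URLs, file names, and chunk size are hypothetical:

```python
def split_into_sitemaps(urls, max_per_file=50000):
    """Chunk a URL list into sitemap-sized groups and return
    (sitemap_filenames, url_chunks)."""
    chunks = [urls[i:i + max_per_file]
              for i in range(0, len(urls), max_per_file)]
    filenames = [f"sitemap-{n}.xml" for n in range(1, len(chunks) + 1)]
    return filenames, chunks

# Hypothetical: 120 URLs split into files of at most 50.
urls = [f"http://www.example.com/hotel/{i}" for i in range(120)]
files, chunks = split_into_sitemaps(urls, max_per_file=50)
print(files)                      # ['sitemap-1.xml', 'sitemap-2.xml', 'sitemap-3.xml']
print([len(c) for c in chunks])   # [50, 50, 20]
```

Each sitemap-N.xml then gets listed in a single sitemapindex file, which is the one file you submit; that way no individual file is near the limit even as the site grows.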

  37. Thanks for the details.

    Well, I like your idea of separating sitemaps to target different regions. It is better to keep targeted domains for a particular region. For example, you can create a separate XML file for the region Australia; this could also enhance search engine visibility for that region. Similarly, you can create a separate country map for Denmark and use a separator (variable) for that region, and also mention this in your .htaccess file. Besides, use of a translator will enhance your search engine visibility more than you have had.


    SEO Expert

  38. Hello… Looks like this thread may have run out of steam a while back, but I was just wondering if someone could help me understand my supplemental index problem. On my site, it appears as though I have only 7 pages indexed in the main index, with the remaining 980 supplemental. Now, we just went through a major redesign, creating original, unique content, cross-linking strategies, etc. Just wondering if anyone could give me advice on why Google doesn’t see the majority of our site as relevant to the main index. Thanks for the help…



Meet The Bloggers

Jim Boykin

Founder and CEO

Ann Smarty

Community & Branding Manager