Internet Marketing Ninjas Blog

Surviving Rankings, Re-Rankings, Filters and Google Hell.

I was reading Mr. Wall today, who had been reading this interesting thread on WebmasterWorld and made these comments on his blog in a post about different links having different goals.

…Two big things that are happening are more and more pages are getting thrown in Google’s supplemental results, and Google may be getting more aggressive with re-ranking results based on local inter-connectivity and other quality related criteria. …

After reading most of that WebmasterWorld thread, I see the best meat of it, of course, comes from Tedster:

My current idea (this is used in many IR approaches) is that a preliminary set of results is returned, but then one or more factors undergo further testing. The preliminary set of results is now re-ranked according to multipliers determined in testing just those preliminary urls. These test factors could also be pre-scored and updated on a regular (but not necessarily date co-ordinated) basis, and be available in a secondary look-up table somewhere for quick use.

If your url doesn’t get into the preliminary set of urls, then this re-ranking step won’t ever help you — because no new results are "pulled in". If your url is in the preliminary set, the re-ranking may help you. But if you fail one of the tests, then your relevance score, or your trust score, or your aging score, or your whatever score, can be multiplied by 0.2 or a fractional factor like that. That would send your url on a rankings nose dive.

So this type of re-ranking could account for the yo-yo behavior we see, going from page 1 to end-of-results and back again. Note that the url is not thrown out of the result set; the preliminary result set is kept intact, just re-ranked.

Part of making re-ranking technology like this practical and scalable would be getting very quick preliminary results — often cached preliminary results, I assume. This need for speed might also account for the large numbers of urls being sent to the Supplemental Index, making for a less unwieldy primary index.

Supplemental urls would only be tapped if the total number of preliminary results fell below some threshold or other.

This is my current line of thinking – purely theoretical, although informed by some knowledge of the art of Information Retrieval. I keep looking at the changes and asking myself what kind of math could account for the new signs we are seeing.

As long ago as 2 years, GoogleGuy mentioned in passing that we were tending to think in terms of filters and penalties, but that Google was moving away from that model. I think they’ve moved a giant step further — although some filters are clearly still there (only 2 results per domain for example) and some penalties as well (often manual).

I believe Tedster hit the nail on the head with some great points.
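To make that concrete, here is a rough sketch of the kind of re-ranking Tedster describes. This is purely illustrative and not Google’s code: the test names, the 0.2 multiplier, and the threshold are stand-ins for whatever Google actually uses.

```python
# Illustrative sketch of Tedster's re-ranking idea (hypothetical, not Google's algorithm).
# A preliminary result set is pulled quickly, each url runs through extra quality tests,
# and failing a test multiplies its score by a fraction instead of removing the url.

MIN_PRELIMINARY_RESULTS = 100  # made-up threshold for tapping the supplemental index

def quality_tests(url_data):
    """Hypothetical secondary tests (trust, aging, etc.), pre-scored in a look-up table."""
    return [url_data["trust_ok"], url_data["aging_ok"]]

def rerank(preliminary, supplemental):
    # Supplemental urls are only tapped if the preliminary set falls below the threshold.
    if len(preliminary) < MIN_PRELIMINARY_RESULTS:
        needed = MIN_PRELIMINARY_RESULTS - len(preliminary)
        preliminary = preliminary + supplemental[:needed]

    for url_data in preliminary:
        score = url_data["relevance"]
        for passed in quality_tests(url_data):
            if not passed:
                score *= 0.2  # fractional multiplier: the url nose-dives but is not dropped
        url_data["final_score"] = score

    # Set membership never changes; only the order does.
    return sorted(preliminary, key=lambda u: u["final_score"], reverse=True)
```

The yo-yo behavior falls out naturally: a url that fails a test one day and passes it the next swings from page 1 to the end of the results and back, without ever leaving the preliminary set.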

I’ve had this picture in my mind of a row of Google servers. Data is gathered and fed to the first computer, which ranks the pages based on PageRank; the next computer then recalculates the rankings based on TrustRank; the next computer reorders the listings based on the interconnectivity of the community; the next computer reorders them based on the filters that are applied; then more filters and more reordering… then there are different datacenters, each with slightly different weights on each prior reordering… and as more data is fed in, and more reorders and filters are applied, the more things change… phew!
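Here is a small sketch of that assembly line, again purely hypothetical: each "server" is just a pass that re-weights the same fixed set of results, and different datacenters could run the same passes with slightly different weights.

```python
# Hypothetical "assembly line" of re-ranking passes over one fixed result set.
# Each pass contributes its own signal; per-datacenter weights shift the final order slightly.

def pagerank_pass(url):          return url["pagerank"]
def trustrank_pass(url):         return url["trustrank"]
def interconnectivity_pass(url): return url["local_links"]
def filter_pass(url):            return 0.1 if url["looks_spammy"] else 1.0

PIPELINE = [pagerank_pass, trustrank_pass, interconnectivity_pass, filter_pass]

def run_pipeline(results, weights):
    """weights: one exponent per pass; different datacenters use slightly different ones."""
    for url in results:
        score = 1.0
        for stage, weight in zip(PIPELINE, weights):
            score *= stage(url) ** weight  # each "server" reorders based on its own signal
        url["score"] = score
    return sorted(results, key=lambda u: u["score"], reverse=True)

# Two datacenters, same pipeline, slightly different weights, slightly different orderings:
# datacenter_a = run_pipeline(results, weights=[1.0, 1.0, 0.8, 1.0])
# datacenter_b = run_pipeline(results, weights=[1.0, 0.9, 1.0, 1.0])
```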

So with this picture in my head, like Aaron, I too see obtaining different links with different goals. The goals that I’m working on are these:

  • I see getting trusted links from trusted sites to raise the TrustRank value.
    (Find Trusted sites and write to them and offer them something of value)
  • I see getting links from subpages that have direct trusted backlinks to them to help trust and power.
    (Get links from pages that have backlinks to them… they are worth sooooo much)
  • I see boosting my co-citation site neighborhood by getting links on pages that link to other sites in my neighborhood.
    (Common Backlinks: link to our public tool. Sorry, only our private tool strips the crap scraper results from this list. A rough sketch of the idea follows this list.)
  • I see boosting my co-citation page neighborhood by putting other trusted links next to my links (mixing neighborhood with trust).
    (here’s my paragraph ad with links to me and a few other highly trusted related sites)
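Here is a rough sketch of the common-backlinks idea from the list above. It assumes you can export a list of linking domains for each site from whatever backlink tool you use; the file naming and the helper function are made up for illustration.

```python
# Rough sketch: find domains that already link to several sites in my "neighborhood",
# i.e. good co-citation targets. load_linking_domains() is a made-up helper that reads
# whatever backlink export you have (one linking domain per line).

from collections import Counter

def load_linking_domains(site):
    """Hypothetical: return the set of domains linking to `site`, from your own export."""
    with open(f"backlinks_{site}.txt") as f:
        return {line.strip() for line in f if line.strip()}

def common_backlinks(neighborhood_sites, min_overlap=2):
    counts = Counter()
    for site in neighborhood_sites:
        for domain in load_linking_domains(site):
            counts[domain] += 1
    # Domains linking to at least `min_overlap` neighborhood sites are worth chasing.
    return [domain for domain, n in counts.most_common() if n >= min_overlap]

# targets = common_backlinks(["neighbor-site-1.com", "neighbor-site-2.com", "neighbor-site-3.com"])
```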

I’m trying to do all I can so that when Google’s "done" ordering and reordering, filtering and refiltering, over and over again, our guys will, if I’m lucky, come out on top. That’s the goal. The types of links above are what I’m focusing on for my strategy.

And yeah… as far as Google Supplemental Hell goes, yeah, Google’s cleaning house. Tedster also makes some great notes about that.

Speaking of Supplemental Hell, here’s something else I’ve been experiencing: if you publish 300 pages today and three months go by without anyone linking to any of those 300 pages, guess where they might be going? (Supplemental Hell.) The moral: don’t publish a bunch of pages at once (especially in a new folder) unless you know they’ll pick up some backlinks and trust within a few months. I’ve seen many sites have entire folders go to Supplemental Hell after hundreds of pages were published in that folder in one day and nothing there got a link for the first three months of its existence, and I’ve seen other new folders and pages survive (so far) when even a few pages in the new folder got a small handful of nice backlinks. That kind of says something about Google checking the "quality rating" of new pages: no links to any page in a folder, and it’s Supplemental Hell for them all. So if you publish a new folder with 300 new pages, get some of those pages some trusted backlinks, and hurry!
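If you want to keep an eye on that, here is a tiny sketch of the kind of check I mean. It is illustrative only; get_backlink_count() stands in for whatever backlink API or export you actually have, and the 90-day window is just my rough rule of thumb from above.

```python
# Illustrative check: flag pages in a newly published folder that still have zero
# backlinks after ~90 days. get_backlink_count() is a stand-in for your own backlink
# data source (API, crawl export, etc.).

from datetime import date

def get_backlink_count(url):
    """Hypothetical lookup against your backlink data."""
    raise NotImplementedError("plug in your backlink tool here")

def pages_at_risk(folder_urls, published_on, today=None, grace_days=90):
    today = today or date.today()
    if (today - published_on).days < grace_days:
        return []  # still inside the grace period
    return [url for url in folder_urls if get_backlink_count(url) == 0]

# risky = pages_at_risk(new_folder_urls, published_on=date(2007, 1, 15))
# If most of the folder shows up here, start getting those pages some trusted links, and hurry!
```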

In other news: I’ll be away all next week for SES in NYC. I was only going to stay a few days, but I’ll now be speaking on the "Linking Strategies" panel on Friday at 9am so I’ll be staying the whole week. See ya in NYC!


19 Responses

  1. Nice… A PageRank assembly line.

    I’ve had a similar picture in my head ever since people started noticing the sandbox effect where a new website would rank well for a few days and then disappear for the next 18 months.

  2. I agree, and it explains why having a network of quality sites is more important than ever. If you can have 5-10 high quality sites that provide backlinks to your new sites they seem to stay viable in the rankings and not get sandboxed.

    Then they will be in a position to get organic ranking much easier.

  3. As for “going supplemental,” if you follow Matt Cutts, he has clearly stated that sites go supplemental simply because they do not have enough PageRank. I have also seen sites with bad structure go supplemental because they do not divert PR correctly.

    You’ve got to take everything said in the past with a grain of salt; this stuff changes by the minute.

    *runs off to look if high PR pages go supplemental*

  4. Jim, that theory would indeed make sense. In fact, it would make more sense than G penalizing a site. If a site fails one of the tests, then it obviously wouldn’t be relevant…

  5. >Aaron, I’ve seen folders of webpages that were PR6’s…and then tossed to Hell.

    Same here with sites where these pages have had stable high rankings for years.

    Add one more server to the equation that factors in local search data.

    Time for a few micro sites instead? Requires more link building effort, but easier to reduce the filters.

  6. As always, Aaron, Tedster, and Jim have this thing nailed. Great post, Jim.

    Tell me this, though: where does Google LocalRank play into all of this? I’m assuming it’s the part that says, “Ranking search results by reranking the results based on local inter-connectivity”.

  7. By the way, if anyone’s interested, I found this calculation, which is part of LocalRank:

    NewScore(x) = (a + LocalScore(x)/MaxLS) * (b + OldScore(x)/MaxOS)

  8. Great article – I rarely find a blog post I can read the whole way through without missing a beat. I have to say that all this re-ranking going on is beating me up on the phones, mostly. I get calls from clients with “OMG, I’m not number 2 anymore” or “I can’t find my page anywhere on Google.” After the last 2 months of refreshes, my clients now understand and simply send me a polite email: “Is Google dancing again?” Believe it or not, it’s much easier to call a refresh a small dance, kinda like the hula, when I speak to my executive clients; they just want one word and nothing more. Of course, I sit and pull my hair out because I just don’t know where we’ll be after the refresh. Eventually we settle into our usual #2 or #3.

    Today, I’m on a different datacenter than the Midwest and East Coast, so I’m seeing my sites way up at #1. After reading this blog, I now completely understand what’s happening, and I’ll just have to do my own dance when those client emails come in tomorrow.

  9. Nice post. 🙂

    About Supplemental Hell:
    ~ Is it OK to make a post that only contains links to other pages of your blog?
    ~ Will social bookmarks help?

    Thanks.

  10. Hi, we’ve seen many SEO sites online, but yours is a bit unique, and you’re posting close to the real concept of SEO.
    Thanks, keep it up.
    I say you’re SEO GURUS…

  11. At Google, I see one more server at the end of the line that acts like that gopher smash arcade game. It tries to keep down sites that are always popping up through the many holes left by the system.

  12. I am being tossed and turned in a Google dance. I see you mention a few rules that I will take heed of and try to implement to see if I can stabilize myself and get back on the path to glory.

