Whenever there’s a big Google update, like the Over Optimization / Webspam update that’s going on now, I tend to head over to WebmasterWorld and read what SEOs are saying…yeah, a lot of the time it’s noise…but it’s the gems that keep me reading through everything…and tonight I found a few gems there.
I just went through about 120 comments from the new thread there called “Google Launches Update Targeting Spam… Again?”
I found 6 comments worth re-sharing…I know that this thread on WebmasterWorld will grow and grow over the coming days, so if you want to hear what other webmasters are saying about this Google webspam update, that’s where I’d be hanging out.
Out of the first 120 comments, my 6 favorites are these…and my top pick is #1 by Robert Charlton…Robert always has great things to say and I highly value his ideas.
1. Robert Charlton
To conjecture about the “Again” part of the title… for me, anyway, these ongoing narrowly-targeted updates confirm more than ever that AI (Artificial Intelligence) is actively being used in the algorithm.
Why do I say that?… because, when using AI, it’s important for engineers to correlate what they’re targeting with the effects they see. Thus, I’m assuming, the updates are grouped together in chunks reflecting types of sites or situations Google is going after… with a degree of focus necessary to isolate cause and effect.
With regard to large collateral damage apparently being reported, I can only guess that Google initially needs to push things a little too far, not only to bring all the junk to the surface, but also to observe what kind of collateral damage is happening so they can fix it in a timely fashion. Otherwise, I’d assume, there would be a greater risk of longer term collateral damage that would go unobserved.
I’m assuming we’ll see pendulum-like swings, as with any feedback system, as corrective actions are taken and the algo zeroes in on what it’s targeting.
Conceivably also… still guessing… the value of upstream inbound links to some sites might be affected, which would affect downstream sites. In some cases, you might have inbound links from sites that are taken out, and it may not be your fault. Whether that’s going to be calibrated to protect the innocent isn’t clear. I would expect that Google certainly has thought about these considerations.
I believe the real problem is that the competition can now take you out by buying a couple of $10 link spam packages that include thousands of blog comments and bookmarking sites all pointing to your URL, and Google will drop you for gaming their algo. This has happened to a few of our sites now and it sucks. In the past Google ignored such links, but now you get the unnatural link notice and a huge drop. I don’t want my site anywhere near the first page of Google just to avoid the onslaught until Google stops this nonsense.
My advice: wait it out. Let the results calm down, then see what’s where. Google are probably running mini tests constantly, adding/removing things over the next couple of days and seeing what does and doesn’t get removed so they can fix any problems. Wait the “next few days” and then some, then come back and share your findings.
I feel like there’s a lot of crossover with the synonym updates here. Just an idea, though.
It actually has nothing to do with it being Blogspot. That particular blog was very popular and very famous till G got annoyed with its owner and took it away from him in 2010.
Here’s the story:
I believe some other person then nabbed the domain but didn’t do anything with it, not understanding just what they had picked up…
Anomalies like this are great for what they reveal about G. Clearly this update has nothing to do with on-page issues at all and is all about backlinks. Some backlinks have been discarded because they are “webspam,” but others have survived even though they too are webspam.
This gets tossed out after every update. It’s mostly said by webmasters who don’t want to admit the faults in their website businesses, so they write it off as Google doing this for its own gain.
Here is what I think: the web is getting really big, and it grows exponentially each and every day. That is a ton of webpages. Google has bragged about spidering/indexing this entire web world with speed and efficiency, and it is policing all of these sites, spending a lot of resources in doing so.
If I were Google, I would absolutely aim to remove the trash and shrink the web so that it is easier to rank websites and build algorithms that are more accurate.
That is their ultimate goal; if it’s not, I’m not going to speculate on something I have no ground to stand on.
I have a lot of questions about this update. I want to know if they will be removing sites whose only purpose is to provide backlinks to other sites, which they see as spam. That is what I think they should be doing. Or are they just penalizing all sites they see as spammy? If a website is spam, why not remove it from the index completely?
…it’s very early on in that thread, but it’s one to keep an eye on if you’re interested in Google’s latest update.