This week in the communities, our users are buzzing about recent changes and working hard to analyze the many updates and shifts from the last few months. It’s busy, but also exciting, to have so much happening around us!
Definitely a welcome change from the cliché of a “slow summer.” Take a look at some of these great discussions from the past week:
On Threadwatch, Ann Smarty pointed out that Rand Fishkin ran an interesting experiment with click-throughs on Google. It seems that click-through manipulation returns results fast – faster than you might think: 70 minutes and 400 to 500 interactions were enough to push one link above another in the SERPs…and this after Gary Illyes supposedly said at SMX that CTR wasn’t a ranking signal. Hmmm…
Another Threadwatch find, again from the fine folks at Moz: a study drawing on a sampling of data from their biannual ranking correlation study.
The conclusion they reached after digging through the top 50 Google search results for about 15,000 keywords was that “out of the top results, a full 99.2% of all websites had at least one external link…In other words, if you’re looking for a site that ranks well with no external links, be prepared to look for a very long time.”
Google’s latest update is called “The Quality Update,” and it has a lot of people spinning their wheels trying to figure out what counts as great – and what counts as poor – quality content in Google’s eyes. “Google demonstrably can not detect the quality of content,” counters Cre8asiteforums user iamlost.
They continue, “if it could there would not be blank pages as the ‘best’ query result, nor mis-mashed shotgun type keyword combinations, nor…And the sooner all webdevs and marketing types realise that Google can NOT tell whether content is readable, sensible, or ‘quality’ the better for all the angst and snake oil out there.” What do you think? If Google can’t determine “quality content,” then what CAN it determine? And what does that mean for you as a web designer, an SEO, or a marketer?
Once upon a time, before Facebook, they were giants of their industry. Myspace is still around as a shell of its former self, but Friendster has been inactive for some time. What happened? Cre8asiteforums user Grumpus points out exclusivity as one possible factor:
“Facebook’s massive success started… by the fact that no one could join it…you needed an invitation. It’s a good plan – create something, brand it as cool, but then don’t let anyone see it. All that does is build desire.” Did Myspace and Friendster fail because they came up short, or because Facebook took off like a rocket? Join us around the mortician’s table as we dissect these intriguing cases.
Earlier in June, around the 17th, many were startled by a sudden upswing in activity in Google SERPs. At first many users of WebmasterWorld thought that it could be Panda, but as time went on it was announced as a “core update.”
Barry Schwartz of Search Engine Land also reported that “I suspect the reason so many tools showed a spike this week was related to…Wikipedia changing all its URLs to go HTTPS this week…” This thread from WebmasterWorld got many of our users scratching their heads – was the update anything significant, or just the usual “everflux?”
And when you’re done with that thread, move on to this one for some more juicy details! According to a report by Marcus Tober of SearchMetrics, writes WebmasterWorld user Robert Charlton, “…the main aspects of the algo change appear to be QDF (Query Deserves Freshness)… in fact extreme QDF … constant updating of serps, which is benefiting mostly well-known news sites…”
User MrSavage shakes his head in disappointment and says, “To me it could be viewed as very discouraging to the average webmaster. If you haven’t earned your ‘colors,’ you aren’t part of the chosen ones…I don’t see how a small niche site cracks through that.” Other users in this thread are reporting that Google is able to update event times and schedules for high ranking items within minutes. What changes have you observed since June 17?
According to WebmasterWorld admin engine, Google Trends recently received its biggest update “since 2012.” “Not only are Trends based on 100 billion searches per month, the data also includes News and Youtube. Importantly, it also goes into great detail for in-depth research.” Furthermore, says user Robert Charlton, “it’s assumed that the new Trends, along with Twitter, are the driving forces” behind the QDF queries that are an important part of the June 17 “News-Wave” update.
You can read about 301 redirects almost everywhere you go, but “it seems to be different everywhere I read it,” writes SEO Chat mod Test-ok. “Let’s see if we can get to the nuts and bolts on this.” What do 301 redirects do, and what are they incapable of? When should they be used, or not? Can they transfer penalties from one website to another, the way the flu spreads between people? Senior member Fathom has an excellent breakdown of Penguin-affected domains and 301s. This topic promises to be a great conversation.
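Before debating what a 301 can and can’t pass along, it helps to remember how simple the mechanism itself is: the server answers a request for the old URL with a “301 Moved Permanently” status and a Location header pointing at the new one. Here is a minimal sketch as a Python WSGI app – the paths and example.com domain are hypothetical, and real sites would usually configure this at the web server level instead:

```python
# Minimal WSGI app sketch issuing 301 redirects.
# The redirect map and example.com URLs are illustrative only.
REDIRECTS = {"/old-page": "https://example.com/new-page"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        # A 301 tells clients (and crawlers) the move is permanent;
        # the new address goes in the Location header.
        start_response("301 Moved Permanently",
                       [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]
```

Whether search engines also carry ranking signals – or penalties – across that hop is exactly what the thread is arguing about; the HTTP mechanics above are the one part everyone agrees on.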
In this thread, a user has submitted a sitemap index for their very large website (400,000+ URLs) and is dismayed to find that only 7,000 of them have been indexed in Google after two months. Is the sheer size of the sitemap index to blame, they wonder? Would it be more beneficial to break it into smaller pieces?
“A sitemap aids Googlebot’s ability to quickly find pages but certainly doesn’t help it rank them,” writes user Fathom. “…if a page would be ranked below position 1,000 there isn’t much point indexing them in a hurry.” But besides links and ranking signals, there are also underlying issues with the website’s design that may be hampering its ability to be indexed swiftly. Twists and turns abound in this excellent thread for learners!
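For readers wondering what “breaking it into smaller pieces” looks like in practice: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, and a sitemap index simply lists those files. A hypothetical sketch of that split in Python (the file names and example.com URLs are placeholders, not anything from the thread):

```python
# Sketch: chunk a large URL list into sitemap files of at most 50,000 URLs
# each (the sitemaps.org per-file limit) and build a sitemap index pointing
# at them. File names and the example.com base URL are illustrative.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50_000  # max URLs per sitemap file under the protocol

def build_sitemaps(urls, base="https://example.com/sitemaps"):
    """Return ([(filename, sitemap_xml), ...], sitemap_index_xml)."""
    files = []
    for i in range(0, len(urls), SITEMAP_LIMIT):
        chunk = urls[i:i + SITEMAP_LIMIT]
        body = "".join(
            f"  <url><loc>{escape(u)}</loc></url>\n" for u in chunk
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}</urlset>\n"
        )
        files.append((f"sitemap-{len(files) + 1}.xml", xml))
    index_body = "".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>\n"
        for name, _ in files
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_body}</sitemapindex>\n"
    )
    return files, index
```

Note that this only changes how the URLs are delivered to Googlebot – as Fathom says in the thread, a sitemap helps discovery, not ranking, so splitting alone won’t fix the deeper indexing issues being discussed.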