Happy New Year and welcome back! We took a little break from writing these posts, but our communities were still churning out great threads!
On Threadwatch, we have news about Google throttling traffic and the death of the Link Search Operator.
WebmasterWorld members are discussing Facebook’s journalist outreach program and keywords, while SEO Chat members are all about duplicate content.
And have you heard the news about AMP Pages yet? No? Well, Cre8asiteforums can give you the scoop! Let’s dig in.
In a bid to expand its audience and (probably) to capture more ad dollars, Facebook has created a “Journalism Project.” WebmasterWorld’s admin, engine, explains that it
“…will be a collaboration effort to develop new products, to explore new storytelling formats, how to build local news, and what it calls emerging business models.”
There will be hackathons and e-learning courses, too. As big as Facebook is, the Journalism Project could make it even bigger. WebmasterWorld member incrediBILL writes that this is another effort by Facebook to be your one and only provider of content. He adds that
“It’s all about advertising and there’s only so many ad dollars to go around for the same sets of eyeballs…”
Tangor muses that Facebook could be barking up the wrong tree, though. “[Facebook] just might have outsiders regulating their business! (Media and government among other things).”
AMP pages are Google’s pride and joy these days. But some webmasters find them to be a real pain in the rear. For one thing, AMP pages still use a different URL structure. They often include google.com or /amp, which can be confusing to readers. Kim explains on Cre8asiteforums:
“It’s a UX trust heuristic, similar to links that open up new windows. Any change is perceived as potential confusion or fear that the user is being taken off the main site with no way back. This behavior shows up with experienced users as well as inexperienced.”
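The URL confusion Kim describes comes partly from how AMP documents are paired with their originals. As a rough sketch (the URLs here are placeholders, not a real site), the canonical page and its AMP version typically point at each other in their markup, while Google serves the AMP copy from its own cache URL:

```html
<!-- On the regular (canonical) article page -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP version of the same article -->
<link rel="canonical" href="https://example.com/article/">
```

When a reader taps an AMP result in search, they often see a google.com cache address rather than the publisher's own domain, which is exactly the trust problem Kim is pointing at.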
But there are more problems with AMP than just URL structure. Grumpus explains that AMP doesn’t seem like the future to everyone. To some, it feels like a step backwards. In the old days, news providers would write two different versions of their content: one for mobile readers and one for desktop.
The mobile version would be “watered-down” to improve loading times. When these news providers switched to responsive sites, they were able to provide the same content to everyone. But AMP’s structure encourages a return to the old ways of watered-down content.
“Ultimately, AMP is nothing more than a power grab using a Cloward and Piven strategy.”
This is a must-read!
SEO newbies are terrified of duplicate content. Despite Google’s own guidelines stating that some forms of duplicate content are acceptable, and that only malicious forms are truly dangerous, the subject is filled with paranoia. SEO Chat members ThomasHarvey and Chedders set up a test to investigate.
They created a site using nothing but copied content. Their traffic is growing, they rank well, and they haven’t been penalized yet. Their results seem to indicate that Google has a sense of context. Chedders writes,
“I do think intent comes into play a lot here. Google do spot spammy scrapper sites and have become clever in understand the difference between the 2.”
If you can provide value to your readers and aren’t copying content to manipulate your search rankings, you may be safer than you think. It’s a thin line to walk – this is an advanced thread with a lot of nuance. Don’t go out and start scraping popular sites under the impression that you won’t be hammered!
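Worth noting alongside that thread: Google's guidelines suggest rel=canonical as the sanctioned way to signal which copy of duplicated content you'd prefer to rank. A minimal sketch (the URL is a placeholder):

```html
<!-- In the <head> of the duplicate page, pointing at the preferred version -->
<link rel="canonical" href="https://example.com/original-article/">
```

It's a hint rather than a command, but it goes a long way toward showing good-faith intent, which is exactly what Chedders argues Google is evaluating.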
Everyone knows that the keywords meta tag has been outdated for years. But there is still some debate about how important keyword density is, or whether keyword stuffing still works. WebmasterWorld member goodroi writes that keyword density is old news, but
“I am not saying you should create content that never uses the keyword the searcher type in. Rather you should supplement the keyword with relevant synonyms and concepts as you provide a more comprehensive answer with significant value.”
In other words, it’s more important to sound natural than it is to mention your keyword five times. Forum member RedBar disagrees:
“…the fact is I see every day almost identical pages from competing manufacturers within my industry stuffed full of keywords and nothing else of any value. Most of these are generated in India and seem to be impervious to any sort of penalty when they are clearly simply spamming the SERPs.”
Cre8asiteforums guru iamlost pops in as well with a great post, beginning “Keywords have been dead for years.” What do you think?
Another Google tool has been killed, it seems. Bill Hartzer wrote a post about it on Threadwatch. He explains that the link search operator hadn’t been working reliably for years, but until now there had been no official word that it was dead.
That official word just came down – so anyone who tells you to use the link search operator is now officially wrong. There are other ways to check your links, though! Read about a few of them in this post.
Webmasters have, for years, claimed that Google throttles traffic. In other words, once a site reaches a certain level of traffic, the flow seems to stop cold. There are two camps to this argument. The first camp says there is no conceivable reason for Google to do such a thing. It simply doesn’t make sense.
The second camp believes there are plenty of reasons – controlling AdWords money and favoring big business among them. It’s a contentious debate and this post makes for a good introduction. Of course, Google says they don’t throttle – but can they be believed?
Sometimes you don’t want Google to find and rank you. Sometimes you want to disappear. That’s the case for this SEO Chat member. A site they oversee is owned by a partnership which is dissolving.
To make things fair, the partners would like to either split the link juice and authority or sink the site entirely. It’s a time-sensitive issue, which prompts the question above – how long would it take to vanish? Or is there a better solution?
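For the "sink the site" option, the standard mechanism is a noindex directive on every page (this is a general sketch of the technique, not the specific solution proposed in the thread):

```html
<!-- In the <head> of every page you want dropped from the index -->
<meta name="robots" content="noindex">
```

One common trap: blocking the site in robots.txt at the same time is counterproductive, because Google can't crawl the pages to see the noindex tag, and already-indexed URLs can linger in results. Even done correctly, removal happens on Google's recrawl schedule, which is why the "how long?" question matters so much to these partners.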