Perhaps the biggest piece of news this week is that Google’s mobile-first index has awoken. The project is starting small, Google says. Over time it will grow until the whole web is covered.
From that point forward, the mobile index will be hugely important to webmasters. Our community members are busy dissecting and discussing this and other pieces of news.
We have posts about Progressive Web Apps, strange behavior in Google’s mobile SERPs, web security, and Facebook analytics. Take a look at this week’s roundup and join us on one of our forums today!
Google calls its first steps “a small experiment.” There’s no reason to panic, and you haven’t missed the boat. But it is probably worth investigating whether a mobile or responsive website is worth your time and effort at this point. If you don’t have a mobile site yet, Google has said:
“…we’ll continue to index your desktop site just fine, even if we’re using a mobile user agent to view your site.”
The members of WebmasterWorld are curious to know what changes are coming and how Google will parse mobile sites. How does Google interpret the way visitors consume website content on mobile devices? Forum member Tangor has some regrets about the design of mobile sites, writing:
“My only regret is that mobile pages/layouts are so cookie cutter that it will become increasingly difficult for webmasters to craft a unique look and feel for their offerings.”
The mobile web certainly has a long way to go before it catches up with the kaleidoscope of diversity on desktop.
Google describes PWAs as having four qualities: they “have the reach of the web”; they are reliable and quick to load; they are fast and responsive; they are “engaging,” and create “an immersive user experience.”
That sounds like Google’s dream app or website for the future mobile-first web. Google recognizes that creating these snazzy PWAs is not always easy.
Sometimes the fundamentals of website creation are forgotten in the scramble for the future. That’s why Google has published a succinct yet comprehensive guide to making sure your PWA is properly indexed. WebmasterWorld member ergophobe praises it as a good, long read:
“Some excellent resources there if you follow the links down the rabbit hole…too busy for the next couple of months, but after that? It looks really easy to make some simple PWAs.”
SEO Chat member Chedders spotted something strange on page 2 of Google’s mobile SERPs.
“When I clicked to the 2nd page I was greeted with a load of listings showing images extracted from the content. At first I thought maybe it’s the open graph being shown but looking at the source for a few of these results it just appears to be random images on the page.”
Have you seen anything like that in the SERPs before? How does Google figure out what images to pull? Why pull images at all, and why only for the second page of SERPs?
Forum member smartyank says they saw this earlier in the year, around the 28th of September, in Indian SERPs. There, the image listings appeared on the first page of results.
“Unfortunately, I have no idea of this update or test.”
Join us and make a post if you have any ideas!
In just a couple of months, Google Chrome will start marking any HTTP pages with password or credit card fields as “insecure.” You can get the details in this Threadwatch post.
Basically, the warning will apply to individual pages, not whole sites. HTTP pages that don’t have credit card or password fields, for example, will most likely not be marked insecure.
Google has been talking about taking this kind of drastic action for years – and now the chickens are coming home to roost.
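As a rough illustration, the page-level rule described above can be sketched as a simple predicate. This is only a sketch of the policy as reported, not Chrome’s actual implementation, and the function and field names are our own:

```python
def chrome_marks_insecure(url: str, has_password_field: bool,
                          has_credit_card_field: bool) -> bool:
    """Sketch of Chrome's initial "Not secure" labeling policy:
    only plain-HTTP pages that collect passwords or credit card
    numbers get the warning; HTTPS pages, and HTTP pages without
    such fields, do not (at least in this first rollout)."""
    served_over_http = url.startswith("http://")
    collects_sensitive_input = has_password_field or has_credit_card_field
    return served_over_http and collects_sensitive_input

# An HTTP login page would be flagged...
print(chrome_marks_insecure("http://example.com/login", True, False))   # True
# ...but an HTTPS one, or an HTTP page with no such fields, would not.
print(chrome_marks_insecure("https://example.com/login", True, False))  # False
print(chrome_marks_insecure("http://example.com/blog", False, False))   # False
```

The point of the per-page scope is that moving just your login and checkout pages to HTTPS would avoid the warning initially, though a full-site migration is the safer long-term bet.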
Sometimes Google just gets tired of spammers and their shenanigans. A few months ago it was reported that Manual Actions would be harder to lift for repeat offenders. Now a similar policy is being enacted for sites that are marked as dangerous to users. Google defines repeat offenders as
“…websites that repeatedly switch between compliant and policy-violating behavior for the purpose of having a successful review and having warnings removed.”
These kinds of sites will be unable to request a review for 30 days and the warning will stay up for that entire period. Get the details on WebmasterWorld.
As our community members have written in the past, Facebook’s metrics can be difficult to parse. In this Cre8asiteforums thread, forum member glyn has asked for analytical reading
“…about the FB reach calculation and what it in fact means.”
Fellow forum members iamlost and earlpearl have some great tidbits of advice. Turns out, “reach” is not easy to understand:
“It is deliberately a black box,”
“The FB API is very specifically set to return potential Reach and NOT daily, which is ONLY through their ad ‘campaign pricing’ interface.”
Earlpearl laments that Facebook has
“purposefully crushed Reach. We typically see reach totals that range between 5 – 20% of what we saw before they purposefully started to reduce it. I perceive it as a weak metric.”
If you use the Web of Trust browser plug-in, you’ll want to read this thread.
The plug-in was discovered selling user data. Its makers initially said that the data was anonymous, but it turns out to be quite easily identifiable. The plug-in was a useful tool for some webmasters, but many of them are reevaluating their support of it in the wake of this news.
WebmasterWorld member topr8 writes
“…quite honestly, if something is ‘free’ what do you expect…I’m always suspicious of what might happen to the data on a free service. In the case of a browser plug-in, they get a lot of data!”